Cassini spacecraft completes last flyby of Saturn’s Dione

Cassini’s farewell tour officially got underway on Monday, as the NASA spacecraft made its final close flyby of Saturn’s icy moon Dione and conducted a gravity experiment during closest approach to the satellite, officials from the US space agency have confirmed.

The probe made its closest approach to Dione’s surface at 2:33pm EDT (11:33am PDT), coming within 295 miles (474 km) of the moon’s surface. During the flyby, it collected data about the internal structure of the satellite and the rigidity of its outer shell, as well as a set of observations from the well-lit, anti-Saturn side using its camera and spectrometer instruments.

In a statement, mission controllers said that they expected the first new images to start arriving on Earth within a few days following Cassini’s final encounter with the moon. During the flyby, the spacecraft’s instruments were expected to get a high-resolution look at Dione’s north pole, a region the probe had not previously been able to observe in such detail.

In addition, the probe’s Composite Infrared Spectrometer (CIRS) was set to map areas on the icy moon with unusual thermal anomalies, or regions that are especially good at trapping heat. Meanwhile, Cassini’s Cosmic Dust Analyzer was set to continue its search for dust particles emitted by the moon during this fifth targeted encounter with Dione.

Taking one final look at ‘an enigma’

“Dione has been an enigma,” said Bonnie Buratti, a Cassini science team member at the NASA Jet Propulsion Laboratory (JPL) in California. She explained that it has been “giving hints of active geologic processes, including a transient atmosphere and evidence of ice volcanoes. But we’ve never found the smoking gun. The fifth flyby of Dione will be our last chance.”

“This will be our last chance to see Dione up close for many years to come,” added Cassini deputy project scientist Scott Edgington. “Cassini has provided insights into this icy moon’s mysteries, along with a rich data set and a host of new questions for scientists to ponder.”

Cassini, which has been orbiting Saturn since 2004, is currently completing a series of final close moon flybys, after which it will leave Saturn’s equatorial plane to begin a year-long setup for its final year of operation. During that last phase of the mission, the spacecraft will repeatedly dive through the space between Saturn and its rings, according to the US space agency.

Cassini’s closest-ever flyby of Dione took place in December 2011, when it came within 60 miles (100 km) of the moon’s surface. That flyby revealed that the bright, wispy terrain on the satellite is a system of braided canyons with bright walls, and project scientists are hopeful that they will be able to find out whether Dione has geologic activity similar to that found on Enceladus.

—–

Feature Image: While not bursting with activity like its system sister Enceladus, the surface of Dione is definitely not boring. Some parts of the surface are covered by linear features, called chasmata, which provide dramatic contrast to the round impact craters that typically cover moons. (Credit: NASA/JPL-Caltech/Space Science Institute)

Can a basic shapes game squash anxiety?

 

Smartphone “brain games” may be all the rage, but few of them are backed by evidence showing they actually improve cognition or mood.

Now, researchers at Michigan State University have announced the development of a smartphone application that seems to be able to lessen a player’s anxiety and increase his or her ability to focus, according to a new report published in the journal Behavior Therapy.

People often blame their inability to concentrate on an inherent flaw. However, anxiety makes it difficult for someone to concentrate on daily tasks – from office work to reading for pleasure. According to the anxiety blog Calm Clinic, when an anxious person feels distracted, his or her brain is working harder just to concentrate.

“What’s interesting is that calling them concentration problems may be misleading,” the website states. “Often you ARE concentrating – you’re simply concentrating on the wrong things, like your anxiety and the way it makes you feel.”

Therefore, by helping people to focus better, a technique could help to quiet a person’s racing, anxious thoughts.

In the study, volunteers with both low and high anxiety performed a focus task involving the recognition of a particular shape among multiple other shapes. Later, volunteers were given an exercise intended to distract them: mixing in several colored shapes. By and large, volunteers were not distracted by this added layer of complexity.

Study author Jason Moser said in a news release the participants had better concentration and decreased anxiety, even after the distraction exercise.

The future of anxiety games

“Down the line we could roll out an online or mobile game based on this research that specifically targets distraction and helps people stay focused and feel less anxious,” said Moser, an associate professor of clinical psychology.

Moser noted other similar “brain games” make claims of being able to help people, but few offer up tangible proof.

“There have been other studies of video game-type interventions for anxiety,” he added, “but none have used a specific and simple game that targets distraction.”

Until Moser and his colleagues produce a viable smartphone app, experts recommend using background noise to lessen anxiety during work and sharpen focus. Keeping a “stress journal” and writing out anxious thoughts also seems to help some individuals.

(Image credit: Thinkstock)

That time when Robin Williams made a depressed gorilla laugh

 

The anniversary of Robin Williams’ death was less than a week ago, so we here at redOrbit wanted to share a story that highlights just how special he was: The story of how he helped Koko the sign language gorilla.

For those of you who don’t know, Koko was raised by humans in an attempt to see if she could be taught sign language. Beginning in 1972, the experiment was supposed to last around four years—but it is still ongoing today.

In those 43 years, Koko has astounded the scientific community. She has learned over 1,000 sign language words and can understand 2,000 spoken English words. She not only knows words, but can use them flexibly—meaning that you can have full (albeit somewhat limited) conversations with her.

Koko also plays instruments for fun, creates paintings (which you can actually buy), is a cat lover, and enjoys making rhymes, among many, many other things. Further, two later gorillas (Michael and Ndume) have shown us that Koko is not unique in these traits—environment can conceivably make any gorilla surprisingly human.

But perhaps the saddest discovery came about because of Michael. Thanks to Koko, researchers found that gorillas have a full range of emotions just as we do, and this became extremely apparent in 2000, when Michael—Koko’s best friend for 24 years—passed away.

For six months, Koko did not smile once.

Williams—who was also battling depression at the time—decided to visit The Gorilla Foundation (where Koko lives) after taking an interest in ape conservation. They invited him to meet Koko, and what followed was possibly the best buddy film of all time.

“Not only did Robin cheer up Koko,” wrote The Gorilla Foundation in a press release, “the effect was mutual, and Robin seemed transformed.”

The two tickled each other and played, and became so close in that one encounter that she remembered him 13 years later, when her handlers informed her of his death—and mourned with the rest of us.

“Koko and Robin’s encounter is a supreme example of how humans and gorillas can overcome interspecies boundaries and express the highest form of empathy — embracing differences,” wrote The Gorilla Foundation. “Robin’s ability to just spend time with Koko, a gorilla, and in minutes become one of her closest friends, was extraordinary and unforgettable for Koko. We hope that it now becomes unforgettable for you too.

“And when you remember Robin Williams, remember that he is not only one of the world’s most beloved entertainers, he is also one of the world’s most powerful ambassadors for great ape conservation.”

More on Koko and The Gorilla Foundation can be found here, on their website.

(Image credit: The Gorilla Foundation)

Your car’s anti-theft device could be failing you

 

That anti-theft device might not make your automobile as safe and secure as you might think, according to a newly-released study from Dutch researchers that describes units used by 26 different automotive companies as “weak” and vulnerable to “trivial” attacks.

According to BBC News, study authors Roel Verdult, Flavio Garcia, and Baris Ege of Radboud University in the Netherlands looked at the encryption system used by the Megamos immobilizer found in Porsche, Honda, and Volkswagen vehicles. They discovered that the systems are easily cracked, allowing potential thieves to make off with your vehicle.

Ideally, these systems prevent a car’s engine from being started unless the key with the correct radio chip is nearby. However, the researchers learned that by monitoring the data transmission between the car key and the anti-theft system only a few times, they could determine the secret cryptographic key used to scramble the information being sent and received.

In just 30 minutes, the researchers were able to discover which key was being used, and they claim that many automotive companies use extremely weak secret keys that could be found in only a few minutes using a laptop.
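The researchers’ point about weak secret keys can be illustrated with a back-of-the-envelope sketch (this is not the actual Megamos attack, and the guess rate below is an assumed figure for a modern laptop, not one from the study): the time to exhaustively search a keyspace doubles with every added key bit, so short keys fall almost instantly.

```python
# Illustrative sketch only: estimates worst-case time to try every key
# in an n-bit keyspace at an assumed guess rate (not the researchers'
# actual method or hardware figures).

def brute_force_seconds(key_bits: int, guesses_per_second: float = 1e9) -> float:
    """Worst-case time to exhaust an n-bit keyspace at the given rate."""
    return (2 ** key_bits) / guesses_per_second

# A 24-bit key falls in well under a second...
assert brute_force_seconds(24) < 1

# ...while a 96-bit key would take over a trillion years at the same rate.
assert brute_force_seconds(96) / (3600 * 24 * 365) > 1e12
```

The gap between those two numbers is why cryptographers consider key length, not secrecy of the algorithm, the first line of defense.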

Easy to crack but hard to fix

Furthermore, the BBC said that the researchers had originally released their findings three years ago, but legal action by Volkswagen and French defense group Thales prevented publication of their paper until it was edited. Those restrictions have now been lifted.

Verdult, Garcia and Ege explained that it will not be easy to fix this issue, either. Correcting the flaws in the cryptographic system used in the data transfer process will require replacing both the radio chips used in the keys and the corresponding hardware in affected vehicles.

They also said that they had shared their findings with carmakers, and that measures were being taken to prevent some of the attacks from working. Previously, these systems were also found to be vulnerable if the chip signal on the key was boosted using an amplifier, BBC News said.

The study comes after several other security researchers have uncovered ways to hijack in-car computer systems, including one flaw that can disable cars simply by sending a text message to a specific onboard modem. In one demonstration, security researchers were able to hack a Chrysler Jeep from several miles away by accessing its onboard infotainment system.

(Image credit: Thinkstock)

NASA working on gecko-inspired, sticky-footed robot

 

Drawing inspiration from the super-sticky feet of geckos, NASA researchers are designing a new robot capable of climbing up the walls of the International Space Station to inspect and repair the exterior of the orbiting laboratory, the US space agency announced last week.

According to Space.com, scientists at NASA’s Jet Propulsion Laboratory facility in Pasadena, California are working on a “gecko gripper” system that works much like the millions of tiny hairlike structures on the bottoms of the lizards’ feet, which make them excellent climbers thanks to a phenomenon known as van der Waals forces.

Since the electrons orbiting the nuclei of atoms are not evenly distributed, a slight electrical field is created, meaning that neutral molecules have both a positive and a negative side. The positively charged part attracts the negative side of nearby molecules, and vice versa, which in turn causes the “stickiness” that allows geckos to climb walls and walk across ceilings.
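A rough sketch shows why this attraction demands the intimate contact that the hairlike structures provide: the van der Waals attraction between two neutral molecules falls off roughly as 1/r⁶ with separation, so even a small gap kills the adhesion. The constant in the sketch below is arbitrary; only the scaling is meaningful.

```python
# Minimal sketch of London-dispersion (van der Waals) scaling:
# U(r) = -C / r^6, where C > 0 is an arbitrary placeholder constant.

def vdw_energy(r: float, C: float = 1.0) -> float:
    """Attractive van der Waals interaction energy at separation r."""
    return -C / r ** 6

# Doubling the separation weakens the attraction 64-fold (2^6 = 64),
# which is why surfaces must make near-molecular contact to stick.
ratio = vdw_energy(1.0) / vdw_energy(2.0)
assert abs(ratio - 64.0) < 1e-9
```

This steep falloff is what the gecko’s (and the gripper’s) dense carpet of microscopic fibers overcomes, by bringing a huge total area into near-contact at once.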

JPL engineer Aaron Parness and colleagues are creating synthetic fibers, thinner than a human hair, that take advantage of these forces. Similar to how geckos “switch on” their stickiness by pushing their feet down to bend the hairlike structures on their feet, the robot’s fibers will have a force applied to them to make them bend and become adhesive.

Stick surface usable for repair robots, astronaut anchors

The gecko-inspired robot is called the Limbed Excursion Mechanical Utility Robot (Lemur), and Parness said that the grippers it will use “don’t leave any residue and don’t require a mating surface on the wall the way Velcro would.” Currently, they can support more than 150 newtons of force, the equivalent of about 35 pounds, according to NASA.

Last year, the agency conducted a microgravity flight test in which the technology was used to grapple a 20-pound cube and a 250-pound person, and the gecko-inspired material underwent a separate test involving turning the stickiness on and off for more than 30,000 cycles back when Parness was a graduate student at Stanford University. It remained strong throughout the tests.

The JPL team has been testing Lemur 3 and its gecko-gripper feet in simulated microgravity, and they believe that it could be used to conduct inspections and perform repairs on the outside of the ISS. Parness said that the technology could ultimately be used to “grab satellites to repair them, service them, and we also could grab space garbage and try to clear it out of the way.”

In addition, the technology has been used to create three different sizes of “astronaut anchors,” hand-operated adhesive units that would make it easier for crewmembers to attach clipboards, photos, or other items to the interior walls of the station. Astronauts would attach the object to a gripper’s mounting post by pushing together the two components of the gripper, NASA noted. This part of the project is a collaboration between JPL and the Johnson Space Center.

(Image credit: NASA/JPL-Caltech)

Security experts: Stagefright security patch can be bypassed

 

A software patch designed to fix a security vulnerability in Google’s Android operating system can be bypassed, once again making devices running older versions of the mobile platform susceptible to the bug and making it easy for hackers to gain access to apps and personal data.

According to BBC News, the flaw only requires cyberattackers to send a specific text message to hijack a smartphone. Google released a patch that fixed the issue, but the security experts at Exodus Intelligence explained to the British media outlet that the supposed fix itself is flawed and could cause owners of Android devices to have a “false sense of security.”

Exodus representatives said that they were able to bypass the update, and that the general public is under the impression that “the current patch protects them when it in fact does not.” They added that the patch was only “four lines of code” that was presumably “reviewed by Google engineers prior to shipping,” but that it did not solve the core problem.

Patch issue part of a ‘bigger challenge’ for Android

The bug, known as Stagefright, was discovered in April and only requires a would-be hacker to send a video message to access data and apps on a potential victim’s device. Details of the flaw became public in July, after a patch released by Google was integrated into the latest version of the mobile OS, according to BBC News reports.

At the time, Google said that there had been no reported incidents in which the vulnerability had been exploited, and the Android developer told the BBC that the majority of users were protected by a security feature called address space layout randomisation (ASLR). ASLR, they said, makes it harder for an attacker to launch attacks capable of compromising a smartphone.

The fact that there are millions of Android devices still running older versions of the software, and that the flaw itself is not 100 percent fixed, suggests that Stagefright “is the early warning alert to a much bigger challenge,” security expert David Baker told BBC News. Since so many device makers modify Android, he said, “There isn’t a comprehensive update solution.”

The UK news outlet said that only 2.6 percent of Android phones run the latest version of the software, while rival Apple claims that 85 percent of its users have the current edition of the iOS mobile operating system. Baker said that since Apple has control over both hardware and software, it can patch flaws more quickly than is possible on Google’s open-source OS.

(Image credit: Thinkstock)

Scientists have found, identified the ‘flying spaghetti monster’

 

Not only is the flying spaghetti monster the deity of its own satirical church, but it also appears to be a real, live creature, as workers at petroleum giant BP recently captured one on video while collecting footage in the waters off the coast of Angola.

According to Discovery News and Live Science reports, the creature was filmed by a remotely operated underwater vehicle (ROV) that was collecting footage at depths of nearly 4,000 feet (1,220 meters). The creature resembled “a bowl of noodles turned upside down underwater,” thus leading the oil and gas workers to name it in honor of the flying spaghetti monster.

The creature was later identified as a siphonophore, part of a group of animals that includes corals and jellyfish, by researchers at the UK’s National Oceanography Centre. Named Bathyphysa conifera, it is a colonial animal made up of several different multicellular organisms called zooids.

As the websites explain, zooids attach to other zooids to form more complex organisms. Once a zooid develops from a fertilized egg, others bud from it until an entire animal is formed. Each of these creatures has a job to do, and in the case of B. conifera, not all of the zooids catch and eat food, and only some reproduce, but combined they ensure mutual survival.

How they managed to identify it

The so-called spaghetti monster is a specific type of siphonophore that belongs to the suborder Cystonectae, Live Science said, and Catriona Munro, an ecologist and evolutionary biologist at Brown University, told the website that this particular cystonect species is somewhat rare.

She explained that cystonects have two main parts that are both affixed to a long stem. The top part is a bubble-like, gas-filled bulbous float called the pneumatophore. Located under that is the part known as the siphosome, which is where the zooids complete their various survival tasks such as eating food and reproducing.

While many other siphonophores have a body part known as a nectosome that helps them swim through the water, B. conifera lacks this feature, Munro told Live Science. The spaghetti monster also has armlike appendages called gastrozooids that help it catch food, she added, and the ptera (side wings) were what helped researchers determine the organism’s identity.

(Image credit: Serpentproject/YouTube)

Did technology kill the book?

 

Contrary to popular belief, the traditional printed book is not dead. Despite the belief that the rising popularity of e-books has all but murdered their old-school counterparts, recently published reports suggest that both paper and technology are thriving in their own niches.

According to BBC News, while it once looked like the reduced prices of e-books and the rise of devices like the Kindle and Nook would send traditional books to the endangered species list, printed manuscripts continue to survive.

For instance, the British media outlet added that in the UK about $614 million (£393 million) was spent on e-books last year, while more than $2.6 billion (£1.7 billion) was spent on traditional books. However, some genres have fared better than others, as adult fiction and romance novels are now primarily sold as e-books, while cookbooks and religious texts perform better in print.

BBC News also reported that Kindle sales peaked at 13.44 million in 2011 before falling to 9.7 million the following year, starting a downward trend. The Nook, meanwhile, has been losing an estimated $70 million per year while Barnes & Noble attempts to sell off its e-reader division.

How some firms are combining tradition and technology

Some companies are attempting to bridge the gap between the two types of books, BBC News explained. Last year, The Little Girl Who Lost Her Name (a publishing experiment centered around a printed book which could be digitally personalized to include the name of the person reading it) became the best-selling children’s picture book in both the UK and Australia.

SeeBook, a Spanish company, sells gift cards that can be bought in bookstores or online, and gives readers a QR code that, when scanned, allows them to download the book to their tablet or smartphone. Likewise, London-based start-up Bookindy is using a Chrome browser plug-in that tracks Amazon searches and compares prices with those of local booksellers.

“Digital technology and the rise in the digital reading culture has allowed authors and publishers many more new creative opportunities to develop ‘the book’ further and delight readers,” former Penguin Books Digital head Anna Rafferty said. “It also allows authors to publish directly, to connect intimately with their readers and, crucially, to create new ways of telling their stories.”

“[So] while there can be no denying that printed book sales have taken a massive hit with the rise of digital, there is some evidence that the rate of decline is slowing and that the excitement over e-readers is subsiding,” BBC News concluded in their report. “The book isn’t dead; technology is simply helping it evolve beyond its physical confines.”

(Image credit: Thinkstock)

Apollo astronaut: Aliens helped prevent Cold War

Edgar Mitchell, who was part of the Apollo 14 crew and was the sixth man to walk on the moon, said that extraterrestrial life not only exists, but also helped prevent a nuclear war between Russia and the US during the height of the Cold War.

A decorated Navy pilot who received Distinguished Service Medals from both the US Navy and NASA, was presented with the Presidential Medal of Freedom and who is also a member of both the Space and Astronaut Hall of Fame, Mitchell told the UK newspaper The Mirror that he saw a UFO flying over the White Sands military base in southern New Mexico.

He explained in an interview that aliens visited the base, where the first nuclear bomb had been detonated in 1945, because they “wanted to know about our military capabilities”. A native of the region, he added, “You don’t know the area like I do. White Sands was a testing ground for atomic weapons – and that’s what the extraterrestrials were interested in.”

“My own experience talking to people has made it clear the ETs had been attempting to keep us from going to war and help create peace on Earth,” Mitchell also said in the interview, telling the newspaper that other military officials told him that aliens had shot down test missiles.

Experts call Mitchell’s claim ‘far-fetched’

According to Gizmodo and other media reports, Mitchell has been outspoken in his belief that extraterrestrials have visited Earth since returning from the moon in 1971. Said to be one of the top figures in the global UFO community, Mitchell said he believes that aliens came here on a mission to prevent the Americans and Soviets from engaging in an all-out nuclear war.

The Mirror asked former UK Ministry of Defense UFO researcher Nick Pope about Mitchell’s claims, and he said the former astronaut was “an honorable and truthful man”, while noting that most of his claims were based on “things he’s been told by others”, not “things he’s experienced himself”. Mitchell would not name his sources, the newspaper noted.

Pope added that the “idea that peace-loving extraterrestrials are here to warn humanity about our destructive ways” was “a nice thought” and was “almost exactly the plot of the classic 1951 sci-fi movie The Day the Earth Stood Still”. He concluded that “if we’re being visited, it’s unlikely we’re dealing with a civilization just a few hundred years ahead of us, so stories of aliens managing to disrupt a few of our weapons tests are far-fetched.”

—–

Feature Image: Thinkstock

Technological advances that made modern roller coasters possible

August 16th is National Roller Coaster Day, and so it’s a good time to take a look at the science of the rides that paradoxically have us queuing for hours in order to scream in terror.

Roller coasters get more exciting year after year, but ever since 16th-century Russians first came up with primitive ice-coasting versions, the basic principle has been the same: Potential energy is stored up, often simply by using a big hill, and then gravity and kinetic energy do the rest.

Even in most modern roller coasters, the only powered assists come at the beginning and the end. The rest of the work is done by gravity acting on the weight of the coaster itself.

Always advancing

So if the basics are the same, how did we progress from primitive ice coasters to today’s scream-inducers?

Firstly, the wheels are very important. Running wheels guide the coaster on the track, friction wheels control lateral motion (movement to either side of the track), and a final set of wheels keeps the ride on the track even if it’s inverted.

The type of track used is also important, and a major breakthrough came when wooden tracks were replaced with steel ones. Roller coasters had a golden age just before the Great Depression, with classics such as the Cyclone at Coney Island using wooden tracks. There followed several decades of declining interest, before a new era began with steel-tracked designs.

In 1959, Disneyland introduced Matterhorn Bobsleds, the first roller coaster to use a tubular steel track. Unlike wooden coaster rails, tubular steel can be bent in any direction, which allows designers to incorporate loops, corkscrews, and other maneuvers into their designs.

Since then, what has changed more than the technology is the extent of the thrill that riders are seeking. By having people stand up, hang beneath the track, or by using “fourth dimension” coasters to spin or rotate seats as the ride moves, the experience is enhanced, even with the same basic principles.

Higher and higher

Then, of course, there is the sheer height of the drop. Coasters are categorized by this, with a “hyper coaster” having a height or drop that ranges from 61 to 91 meters, a “giga coaster” having a height or drop that ranges from 91 to 122 meters, and a “strata coaster” being anything above that. Only two strata coasters have been built to date: Top Thrill Dragster at Cedar Point, OH, which opened in 2003, and Kingda Ka at Six Flags Great Adventure, NJ, which opened in 2005. At a height of 139 meters, Kingda Ka was and still is the tallest roller coaster in the world.
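The energy-conversion principle described earlier lets us put a rough number on what those heights mean. Ignoring friction and air resistance (a simplifying assumption; real coasters lose some speed, and launched coasters like Kingda Ka gain extra speed from their launch systems), a car released from rest converts potential energy (mgh) into kinetic energy (mv²/2), giving v = √(2gh) regardless of the coaster’s mass.

```python
import math

# Back-of-the-envelope sketch: ideal (frictionless) speed after a drop,
# from equating potential energy m*g*h with kinetic energy m*v^2/2.

def drop_speed(height_m: float, g: float = 9.81) -> float:
    """Ideal speed in m/s at the bottom of a drop of height_m meters."""
    return math.sqrt(2 * g * height_m)

# For a strata-coaster-scale 139 m drop, the ideal speed is a bit over
# 52 m/s, or roughly 188 km/h:
v = drop_speed(139)
assert 52 < v < 53
assert 185 < v * 3.6 < 190
```

Note that mass cancels out of the calculation, which is why a fully loaded train and a nearly empty one reach roughly the same speed at the bottom of the same hill.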

Inevitably, there have been technological attempts to make coasters go harder and faster, such as the use of hydraulic or pneumatic power, and even catapults using diesel engines or huge dropped weights. But from 16th-century Russia, through Coney Island and beyond, most roller coasters remain an exciting yet simple lesson in physics.

—–

Image credit: Thinkstock

 

Treatment reduces anxiety and memory loss in mice, shows potential for human use

Researchers believe they may have uncovered a potential breakthrough for treating memory, anxiety, and fear disorders in humans—all thanks to altering a single gene in mice.

The gene was changed to inhibit the activity of an enzyme called phosphodiesterase-4B (PDE4B)—an enzyme present in the brain as well as other organs in both mice and humans.

The mice with the inhibited PDE4B showed enhanced cognitive abilities when compared to unaltered mice; they tended to learn faster, have a better memory, and be more adept at solving complex puzzles. For example, when the “smart” mice were introduced to another mouse, they were better at recognizing it a day later, and they were quicker to find hidden underwater platforms in Morris water maze tests.

Even more interesting was what the mice suddenly lacked. The “smart” mice showed less recall of fearful events after several days elapsed and appeared less anxious in general—they spent more time in open, brightly-lit spaces than the regular mice (who typically prefer dark, enclosed spaces). Further, when presented with cat urine, the PDE4B mice were less fearful than the normal mice.

This seems to indicate that, along with altering learning and memory, the inhibition of the enzyme PDE4B also decreases fear and anxiety while increasing risk-taking behavior. This would be counterproductive in a wild mouse, for sure—but probably not so much in humans.

Potential for human usage

In a press release, Dr. Alexander McGirr, a psychiatrist in training at the University of British Columbia and co-leader of the study, said, “In the future, medicines targeting PDE4B may potentially improve the lives of individuals with neurocognitive disorders and life-impairing anxiety, and they may have a time-limited role after traumatic events.”

Dr Laura Phipps of Alzheimer’s Research UK, which was not involved in the study, added, “This study highlights a potentially important role for the PDE4B gene in learning and memory in mice, but further studies will be needed to know whether the findings could have implications for Alzheimer’s disease or other degenerative mental conditions. We’d need to see how this gene could influence memory and thinking in people to get a better idea of whether it could hold potential as a target to treat Alzheimer’s.

“There is currently a lack of effective treatments for dementia and understanding the effect of genes can be a key early step on the road to developing new drugs. With so many people affected by dementia, it is important that there is research into a wide array of treatment approaches to have the best chance of helping people sooner.”

The study is published in Neuropsychopharmacology.

—–

Image credit: Thinkstock

London excavation crew discovers 17th century Great Plague mass grave

Excavators for London’s new Crossrail line have stumbled upon what appears to be a mass grave for victims of the Great Plague.

Thirty skeletons were found in the Bedlam burial ground near a headstone marked 1665—the year of the last great plague outbreak in London. The bodies appear to have been buried on the same day, lending weight to the notion that something killed them all at once—but oddly, the bodies were placed in thin wooden coffins, unlike most mass burials.

“They were stacked up, some even on their side, some orientated north-south to try and squeeze as many as possible in,” said Jay Carver, lead archeologist for the Crossrail project. (West-to-east was the customary orientation at the time.)

This unusual find has many scientists excited, especially because it might yield answers they have long searched for.

“This mass burial, so different to the other individual burials found in the Bedlam cemetery, is very likely a reaction to a catastrophic event,” said Carver. “Only closer analysis will tell if this is a plague pit from the Great Plague in 1665, but we hope this gruesome but exciting find will tell us more about one of London’s most notorious killers.”

This find is of special importance, as the genetic identity of the plague bacterium itself remains a major mystery.

“The particular question is, what was responsible? Actually what pathogen, what bacteria formed the Great Plague outbreak in the 17th century?” asked Carver.

“It doesn’t seem to come back, so something changed in the way people were living. People say the Great Fire of London in 1666 had something to do with the ending of Great Plague events, but through the scientific studies we can do these days on DNA from samples of these skeletons, we might be able to tell what pathogen is responsible for that outbreak and perhaps why it stopped.”

—–

Pictured is some of the crew working on the Crossrail Line who discovered the grave. (Credit:BBC)

NASA launches space-watch UI design contest

NASA wants to bring its astronauts fully into the smartwatch age, and it wants your help in making this possible: the US space agency has launched a contest, open to the public, designed to help find a suitable user interface for crew devices on the International Space Station.

The purpose, according to NBC News, is to allow astronauts serving on the orbiting facility to have regular access to their schedules, the status of the station, and several other vital pieces of data all at the same time – without needing to lug laptops or tablets with them across the station.

Interface designs should use the Android-powered Samsung Gear 2 as a reference and should be submitted as PNG or JPEG wireframes covering a multitude of functions, such as a Crew Timeline app to keep track of an astronaut’s daily schedule, a color-coded cautions-and-warnings app, and an app that could be used to set timers for procedures or count down to the next activity.

While the UI will need to cover those functions, Engadget notes that the apps themselves do not need to be created as part of the competition and will likely be handled in house at NASA. They emphasized that the design should “direct attention to the appropriate information for a task and increase efficiency,” provide the necessary feedback, and be easy to read.

A chance to earn prestige (and a small cash award)

What does the developer who designs and creates the winning interface get for his or her efforts? The sum of $1,500, which the folks at Engadget argue “seems a bit chintzy for what sounds like a crucial app.” However, as NBC News noted, the real reward is “the honor of helping design an app for astronauts.” Not to mention that it would look good on a resume.

NASA is increasingly turning to crowdsourcing to come up with new designs and engineering concepts for its missions, according to The Verge. Back in May, the agency announced its “Journey to Mars Challenge,” which solicited ideas for ways to keep a crew safe while traveling to the Red Planet, while requiring limited resupply missions from Earth.

Last month, the agency also went online seeking new tool designs that could be used by Robonaut 2, the humanoid robot on the ISS, CNN.com added. In those contests, NASA provided an image of what each tool should look like and asked inventors to come up with a realistic three-dimensional model of a workable design, for which they could receive $50 to $100.

—–

Pictured is astronaut Scott Kelly aboard the International Space Station. (Image credit: NASA)

UK police investigate ‘cyber flashing’ incident

When you think about cybercrime, you typically think about hacking or data theft, but police in the UK are currently investigating a whole different kind of cybercrime – an incident believed to be the first ever instance of cyber-flashing.

According to BBC News reports, 34-year-old Lorraine Crighton-Smith said that she had been traveling on a train in south London when she began receiving pictures of an unidentified man’s penis on her iPhone through Apple’s “Airdrop” sharing function, which she had previously turned on in order to share photos with a fellow smartphone user.

Once she declined that image, a second one showed up on Crighton-Smith’s smartphone. “I realized someone nearby must be sending them, and that concerned me. I felt violated; it was a very unpleasant thing to have forced upon my screen. My name on AirDrop says Lorraine, so they knew they were sending it to a woman,” she said to the BBC.

“The images were of a sexual nature and it was distressing,” she said, adding that she reported the incident to British Transport Police (BTP) because she was “worried about who else might have been a recipient. It might have been a child, someone more vulnerable than me.”

How to keep this from happening to you

BTP officials told BBC News that they have investigated the incident, but since the photo was not accepted by Crighton-Smith, there was no evidence for them to work with. Superintendent Gill Murray said that the agency had previously dealt with cases involving Bluetooth, but said that a case involving indecent exposure via Airdrop was “new to us.”

She said that receiving such an image from a stranger “must be very distressing and something we would take very seriously. If it happens to you, our advice would be to remain calm, retain the image and report the matter to police as soon as possible. We have a dedicated Cyber Crime Unit who can analyze mobile phones and track data transfers back to suspects’ devices.”

AirDrop, a service unique to Apple’s iOS and Mac devices, uses Wi-Fi and Bluetooth technology to communicate with other iPhones over short distances. By default, it is set to “contacts only,” so that only people the phone’s owner knows can use it to send and receive files on that particular mobile device. However, the setting can be changed to “everyone,” BBC News said.

“This means that typically in a train carriage, or tube carriage, you can see other devices,” said Pentest Partners cybersecurity consultant Ken Munro. “That’s what’s happened in this particular case: someone has enabled ‘everyone’ and then hasn’t set it back. As a result, anyone within Wi-Fi or Bluetooth range can send something to you that’s quite horrible.”

—–

Image credit: Thinkstock

Stanford researchers convert yeast into painkillers

It typically takes a full year to convert plants into hydrocodone, but researchers from Stanford University have developed a new technique involving baker’s yeast that allows them to transform sugar into painkillers in just three to five days, according to published reports.

Their method, described in a recent edition of the journal Science, could open the door for these opioids, as well as chemical relatives such as morphine and oxycodone, to be created faster and at a lower cost than many other types of plant-based medicines, the authors said.

As part of their research, Stanford bioengineer Christina Smolke and her colleagues identified more than 20 genes from five different organisms and inserted them into the genome of baker’s yeast. By doing so, they managed to create a pair of microbial assembly lines, which they used to convert sugar into one of two medicinal compounds: thebaine or hydrocodone.

The process currently requires 4,400 gallons of bioengineered yeast to produce just one dose of the painkillers, but the study shows that complex plant-based medications can be made with the help of genetically altered microorganisms – and the authors say this is just the beginning.

As Smolke said in a statement, “The techniques we developed and demonstrate for opioid pain relievers can be adapted to produce many plant-derived compounds to fight cancers, infectious diseases, and chronic conditions such as high blood pressure and arthritis.”

Building on techniques used to create anti-malarial drugs

Experiments conducted prior to the Stanford team’s work demonstrated that genetically altered yeast could be used to produce the anti-malarial drug artemisinin. In that experiment, however, the researchers only added six genes to produce the desired result. The authors of the new paper needed to engineer 23 genes into the yeast genome to produce their hydrocodone assembly line.

“This is the most complicated chemical synthesis ever engineered in yeast,” Smolke said. She and her colleagues were able to locate and fine-tune DNA fragments from other plants, bacteria, and even rodents, which they then inserted into the yeast. This gave it all of the enzymes required to convert sugar into hydrocodone, a compound that deactivates pain receptors in the brain.

However, in order to get this yeast-based assembly line up and running, the researchers needed to find a way to replicate or replace the process through which opium poppies naturally convert the (S)-reticuline molecule into the (R)-reticuline molecule, which is what kick-starts the plant’s journey towards the production of pain-relieving molecules.

Smolke’s team was one of three labs which independently discovered the enzyme that causes (S)-reticuline to become (R)-reticuline, but even this did not create enough opioid compound, so they altered the next enzyme in this process in order to increase production levels. New enzymes were continually added down the line, the researchers explained, until they ultimately managed to create a molecule that could be plugged directly into the brain’s pain receptors.

“Biotech production could lower costs and, with proper controls against abuse, allow bioreactors to be located where they are needed,” said Smolke. “The molecules we produced and the techniques we developed show that it is possible to make important medicines from scratch using only yeast. If responsibly developed, we can make and fairly provide medicines to all who need.”

—–

Image credit: Thinkstock

‘Motherly robot’ evolves creations to improve performance

Researchers from the University of Cambridge have developed robots capable of reproducing, so to speak: they have invented a “mother” robot that can design, build, and even test out its own “children,” then use its experience to improve the performance of future generations.

According to Engadget, the original robot can “give birth to” (build) 10 baby cube-bots at a time. Each of these “pint-sized machine children” has a motor and between one and five plastic cubes, and moves and performs different tasks under the watchful eye of its robotic mother.

The mom leaves the fastest ones on their own, but slower designs are changed in the next generation, similar to the evolutionary process. By passing down the best traits and improving upon the weaker ones, the mother robot was able to double the overall speed of its children by the end of the fifth generation of robots – all without human intervention.

As lead researcher Dr. Fumiya Iida of Cambridge’s Department of Engineering, whose work is detailed in a new PLOS One paper, explained in a statement, “Natural selection is basically reproduction, assessment, reproduction, assessment, and so on. That’s essentially what this robot is doing – we can actually watch the improvement and diversification of the species.”

‘Survival of the fittest’ observed in robots

Dr. Iida and colleagues from ETH Zurich conducted a series of five experiments, each of which involved the mother robot creating and constructing a generation of 10 children. As it observed the created cube-bots, the mother selected the preferential traits and continued to pass them down from one generation to the next, ensuring the survival of the fittest individuals.

Each child robot had a unique “genome” of one to five different genes, each of which contained data about its shape, construction, and motor command, the researchers explained. As happens in nature, evolution in the robots occurred through mutation (modification, addition, or deletion of a gene) and crossover (the formation of a new genome by merging genes from two individuals).

The mother determined which children were fittest through a trial where the cube-bots were given a set amount of time to travel as far from their starting position as possible. The successful ones remained unchanged so that their abilities could be preserved, while the weaker robots were subject to mutation and crossover in order to improve.
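The selection scheme described above is essentially a classic genetic algorithm. A minimal sketch of one such generational loop is below; the gene encoding, fitness function, and elite fraction are all hypothetical stand-ins, not the researchers’ actual implementation:

```python
import random

def make_genome():
    # Hypothetical encoding: each robot carries 1-5 "genes", each a small
    # parameter vector standing in for shape, construction, and motor command.
    return [[random.random() for _ in range(3)] for _ in range(random.randint(1, 5))]

def mutate(genome):
    # Mutation as described in the study: modify, add, or delete a gene at random.
    g = [gene[:] for gene in genome]
    op = random.choice(["modify", "add", "delete"])
    if op == "modify":
        g[random.randrange(len(g))] = [random.random() for _ in range(3)]
    elif op == "add" and len(g) < 5:
        g.append([random.random() for _ in range(3)])
    elif op == "delete" and len(g) > 1:
        g.pop(random.randrange(len(g)))
    return g

def crossover(a, b):
    # Crossover: form a new genome by merging genes from two parents.
    return [random.choice(pair)[:] for pair in zip(a, b)]

def next_generation(population, fitness, keep=0.5):
    # Keep the fittest individuals unchanged so their abilities are preserved;
    # rebuild the rest of the population from the elite via crossover + mutation.
    ranked = sorted(population, key=fitness, reverse=True)
    elite = ranked[:max(2, int(len(ranked) * keep))]
    children = elite[:]
    while len(children) < len(population):
        a, b = random.sample(elite, 2)
        children.append(mutate(crossover(a, b)))
    return children
```

In the actual experiments, `fitness` would be the physically measured distance each cube-bot travelled in its time trial rather than a function computed on the genome.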

Overall, the study authors found that design variations emerged and overall performance on the time trials improved with each generation. The fastest robots in the final generation travelled, on average, twice as fast as the fastest from the first generation. This was attributed to tweaked design parameters and the mother’s use of new shapes and gait patterns for the offspring.

“One of the big questions in biology is how intelligence came about – we’re using robotics to explore this mystery,” Dr. Iida said. “We think of robots as performing repetitive tasks, and they’re typically designed for mass production instead of mass customization, but we want to see robots that are capable of innovation and creativity.”

—–

Image credit: Youtube/Cambridge University

Air pollution kills more than 4,000 people per day in China

New research led by a California-based climate research organization has found that polluted air is responsible for 1.6 million deaths per year in China, with 17 percent of that country’s fatalities each day directly linked to the “unbreathable” air inhaled by 38 percent of its citizens.

In their new study, co-authors Robert Rohde and Richard Muller of Berkeley Earth reviewed hourly air pollution data from more than 1,500 sites recently made available by the Chinese government. They analyzed airborne particulate matter and other pollutants over a four-month span and found that 38 percent of Chinese people breathe air that would not meet US standards.

Furthermore, according to the Washington Post, they found that the polluted air was responsible for killing more than 4,000 Chinese men and women each day, or approximately one-sixth of all premature deaths nationwide. In fact, 99.9 percent of eastern China has a higher annual average for small-particle haze than the most polluted city in America: Madera, California.

In a statement, Muller, the scientific director at Berkeley Earth, said that air pollution was “the greatest environmental disaster in the world today. When I was last in Beijing, pollution was at a hazardous level; every hour of exposure reduced my life expectancy by 20 minutes. It’s as if every man, woman, and child smoked 1.5 cigarettes each hour.”

Can China’s air be cleaned up by the 2022 Olympics?

The most harmful pollutant, according to the study authors (whose paper has been accepted for publication in the journal PLOS One), is particulate matter 2.5 microns and smaller. Also known as PM2.5, this matter includes soot, dust and smoke and can penetrate deep into a person’s lungs, leading to heart attack, lung cancer, stroke, and asthma.

“Nearly everyone in China experiences air that is worse for particulates than the worst air in the US,” Rohde told the Associated Press on Thursday. “It’s a very big number. It’s a little hard to wrap your mind around the numbers. Some of the worst in China is to the southwest of Beijing,” the nation’s capital and the city that was recently awarded the 2022 Winter Olympics.

He added that since Beijing itself isn’t the source of most of the pollution, it might be hard to get the air cleaned up enough before the Olympic Games get underway. Most of the harmful air found in Beijing originates from “distant industrial areas, particularly Shijiazhuang,” a city 200 miles to the southwest, the researchers explained.

—–

Image credit: Thinkstock

Humanity overspent on 2015 natural resource budget

Turn off the TV, put down those Cheetos, and don’t even think about getting into the car to go to the mall – you’ve already consumed your allotment of energy and natural resources for the entire year, according to the folks at the Earth Overshoot Day initiative.

It’s not just you, though. The whole planet has exhausted its natural biocapacity budget for 2015, consuming more than the Earth can generate for the entire year, project representatives explained to Wired UK on Friday. We officially surged past the limit on Thursday, August 13, or about two months earlier than we did just 15 years ago (2000’s Earth Overshoot Day was in October).

Global overshoot, the organization explained on its website, occurs when the annual demand for fruits, vegetables, meat, fish, wood, cotton, and carbon dioxide absorption – anything that we use that is produced by the planet – exceeds the supply that Earth’s ecosystems can renew in a single year. Instead of living off interest, they said, we’re drawing on the planet’s principal.

Humanity’s carbon footprint most to blame

Mathis Wackernagel, co-creator of the system that calculates Earth’s annual budget and president of the Global Footprint Network, told Wired that humanity’s carbon footprint had doubled since the early 1970s, when the planet first went into “ecological overshoot.” Currently, the group said, we consume so much that it would take 1.6 Earths to produce an adequate supply of resources.

If things continue to progress as they have been, Wackernagel’s group said that Earth Overshoot Day would occur on June 28 in the year 2030, essentially meaning that we would have used up a whole year’s worth of resources in just over six months. However, reducing carbon emissions by 30 percent would cause overshoot day to be pushed back until September 16, 2030.

Earth Overshoot Day is calculated by looking at the renewable, naturally-provided resources we consume and the amount of those resources that the planet can produce, not unlike balancing the credits and debits in a checkbook. Overshoot day itself is “the day when humanity falls into the red,” Wired said. This year’s is the earliest one yet, beating 2014’s old record by four days.
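The arithmetic behind the date is simple: if humanity consumes at some multiple of the rate at which Earth regenerates resources, the year’s budget runs out after 365 divided by that multiple days. A quick illustrative sketch (the real calculation uses detailed footprint and biocapacity accounts, not a single ratio):

```python
def overshoot_day(footprint_earths, year_days=365):
    """Day of the year on which the annual biocapacity budget is exhausted,
    given consumption expressed as a multiple of Earth's regeneration rate."""
    return year_days / footprint_earths

# At the 1.6-Earths consumption rate cited by the Global Footprint Network,
# the budget lasts about 228 days, i.e. into mid-August -- roughly consistent
# with the August 13 date reported for 2015.
print(round(overshoot_day(1.6)))  # prints 228
```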

In a statement, Wackernagel said that “humanity’s carbon footprint” was “the fastest growing component of the widening gap between the Ecological Footprint and the planet’s biocapacity. The global agreement to phase out fossil fuels that is being discussed around the world ahead of the Climate Summit in Paris would significantly help curb the Ecological Footprint’s consistent growth and eventually shrink the Footprint.”

—–

Image credit: Thinkstock

Gang-related homicides follow a specific pattern, study finds

In research that they believe could help prevent these violent crimes in the future, criminologists at Michigan State University have discovered that gang slayings move in a systematic pattern over time, spreading like a disease from one vulnerable area to another.

Their findings, published online in the American Journal of Public Health, point to “a potentially systematic movement of gang-related homicides,” lead author and MSU associate professor of criminal justice April Zeoli explained in a statement.

“Not only that, but in the places gang homicides move into, we see other types of homicide – specifically, revenge and drug-related killings – also clustering. Taken together, this provides one piece of the puzzle that may allow us to start forecasting where homicide is going to be the worst – and that may be preceded in large part by changes in gang networks,” she added.

Zeoli was also previously a member of the MSU team that, in 2012, reported that homicide as a whole spreads through cities like an infectious disease. That team applied public health tracking methods to more than 2,300 homicides from 1982 through 2008 and discovered that murders followed a pattern, starting in the center of a city and then spreading to the south and west.

Tracking different types of murder in Newark

Using police data from Newark, New Jersey, the study authors reported that like diseases, homicide needs a susceptible population, an infectious agent, and a vector in order to spread. The infectious agent could be street code (protecting one’s reputation at all costs, including violence), while the vector could be word-of-mouth publicity.

As part of the new study, Zeoli’s team once again analyzed data from Newark to see if specific types of homicides cluster and spread differently. Along with gang-related murders, they looked at revenge killings and homicides linked to domestic violence, robbery, and drugs. As it turns out, each different type of homicide does, in fact, have a different pattern.

Deaths linked to domestic violence and robberies showed no signs of clustering or spreading out, the study authors found, and while revenge and drug-motivated homicides unrelated to gangs did not spread out, they did tend to cluster – often in the same area as the gang activity.

Gang-related murders were the only ones found to spread out in a specific pattern, as they found four contiguous clusters that started in central Newark and moved in a roughly clockwise pattern from July 2002 through December 2005. The authors wrote that tracking how different homicide types spread through a community can improve prevention and intervention efforts.

(Image credit: Thinkstock)

How do you get to a woman’s heart? Through her stomach!

“The way to a man’s heart is through his stomach,” the old saying claims, but a new study from researchers at Drexel University and the University of California, San Diego has revealed that the saying may also be true when it comes to women.

In research published online in the journal Appetite, first author Dr. Alice Ely, who worked on the study while pursuing her doctoral degree at Drexel, and her colleagues found that the female brain is less apt to respond to romantic cues on an empty stomach than on a full one.

Dr. Ely’s team explored the brain circuitry in women who were hungry, comparing the results to those who were satiated, and found out that young women who had recently eaten experienced higher levels of brain activity in reward-related neural regions in response to romantic images.

The results contradict several previous studies which found that people were, on average, more sensitive to a rewarding stimulus when hungry than when full, she explained in a statement. The results seem to indicate that eating may prepare or sensitize young women to rewards other than food, and also supports the notion that food and sex share the same brain neurocircuitry.

Discussing the research with Dr. Alice Ely

Dr. Ely, whose research centers around how people experience reward (particularly in terms of what people who are obese or have eating disorders find pleasant or important), told redOrbit via email that this research was part of a larger study designed to investigate how hunger and satiety influenced how dieters responded to highly palatable food cues, such as pizza or ice cream.

“I looked at brain response to food pictures in the first study, because for past dieters that’s the reward they’re likely most sensitive to, but we also wanted to include something more general, to see if the response was specific to food or generalized to other kinds of rewards,” she explained. “I chose the ‘sexual’ pictures most positively rated by women from the International Affective Picture System, which actually turned out not to be particularly sexual at all, but were mostly people holding hands and hugging – hence ‘romantic.’”

Dr. Ely said that her team was “definitely surprised by the results,” as typically there is “greater response to appetitive cues when people are hungry, so seeing it after having eaten was notable.” She also pointed out that the study looked at two groups of women – those who had dieted in the past and those with no history of dieting – and found that weight-gain-prone dieters tended to be more responsive when full in a region of the brain previously linked to perceived attractiveness.

The brain’s reward center, the ventral corticolimbic neurocircuitry, responds to such things as food, sex, money, and drugs of abuse, she said, adding that her team’s research was one of the first studies to find that one of those domains can influence the other. What this means, Dr. Ely said, is that one class of rewards could potentially make others seem more exciting or pleasant. She added that she is now working on projects looking at the neural mechanisms for food and monetary reward in search of “ways to clinically target alterations in reward and self-control.”

(Image credit: Thinkstock)

How to read classroom body language

As vacation wraps up and the first day of school approaches (if it hasn’t already arrived), prep time for the classroom is vital. As a teacher or professor, how can you tell if your students are learning during the first days of school, or just zoning out? Are they leaning forward, focused on the material? Or are they mid-collapse into a narcoleptic nap?
Dr. Joanne Chesley, Ed.D., a CETL pedagogy specialist, helps uncover the meaning of students’ body language, guiding teachers to know when their lessons are going in one ear and when they are coming back out the other.
An open body signals an open mind: TRUE
According to Dr. Chesley, attentive body language “signals interest in the other person and the message.” Signals to look for include students ignoring distractions like their phones and friends, and leaning forward as if involved in the conversation. Even a furrowed brow shows focus, but beware: furrowed brows may also show attentiveness to the point of confusion. With a tilted head and still posture, the student is likely paying attention to the lesson.
The best body language a professor can receive is open body language, as it signals a change in how the student thinks or feels, added Chesley. With relaxed arms and legs, good eye contact (not staring, glaring, or deer-in-headlights mode), and a face directed toward the professor, the student is likely learning effectively.
Checking the time means students are bored: TRUE
This is stating the obvious.
We’ve all checked our phones, watches, and wall clocks to watch the time pass when we no longer want to be where we are. Checking the time, tapping toes, talking to others, and blank/yawn-filled stares are all signals that the student has lost interest in the lesson. This body language signals “that we would rather not be there, or that the material is uninteresting or irrelevant,” said Dr. Chesley.
Don’t lose hope, though, if your class seems disinterested. If the lecture is creating more yawns than notes, create an active class discussion!
Crossed arms are used only for the mad at heart: FALSE
This closed body language can mean multiple things when it comes to students. A set of crossed arms can signal someone feeling threatened and wanting to create a barrier for protection. It can signal the need to be nurtured, or the need to hide something. It can signal the room is too cold (crossed arms keep a person warm). Or it could just be a relaxing position to sit in.
Students are in the classroom to learn and grow, whether they will admit it or not, so reading the body language of the room can help create the strongest influence on the class learning. Don’t forget what the post-summer days were like, and give the students—and the professors—a break. The start of the semester is near!
Disclaimer: While reading body language is all in good fun and can help us read a situation, you must use caution when attempting to read your students. Not all body language is as it seems, and so using context clues will help. Read the person, read the room, and then read body language with caution. Give the students time and your attention, and it may very well be the lesson they need the most.
—–
Feature Image: Thinkstock

Actually, apes can (kinda) speak

Humans continually search for that thing that makes us unique—the traits that distinguish us from base animals—but it seems the more we try to prove we’re superior, the more we find out we’re not that special. For example, the long-held tenet is that apes cannot vocalize like humans to create speech, but now some researchers think we might be wrong on that count.

The breakthrough came after Marcus Perlman of The Gorilla Foundation and Nathaniel Clark of the University of California, Santa Cruz studied 71 hours of video of Koko the gorilla interacting with her handlers.

“I went there with the idea of studying Koko’s gestures, but as I got into watching videos of her, I saw her performing all these amazing vocal behaviors,” said Perlman in a press release.

Koko was part of a project to see if gorillas could be taught sign language—and since its inception in 1972, many major discoveries have been made thanks to Koko. For example, she learned over 1,000 signs and can combine them to express new meanings she wants to convey, including emotional expressions. (When her pet kitten All Ball died, Koko signed, “bad sad bad” and “frown cry-frown” before crying herself. She also cried when her friend Robin Williams died.) Further, she can understand and respond to about 2,000 spoken English words.

However, the one breakthrough Koko never had involved controlling her vocal communication, as previous research had suggested that apes only make vocal noises reflexively.

“Decades ago, in the 1930s and ’40s, a couple of husband-and-wife teams of psychologists tried to raise chimpanzees as much as possible like human children and teach them to speak. Their efforts were deemed a total failure,” Perlman explained. “Since then, there is an idea that apes are not able to voluntarily control their vocalizations or even their breathing.”

Koko has control over her vocalizations 

This notion fit nicely with a common theory of the evolution of language: that the ability to speak is found only in humans and not in non-human primates.

“This idea says there’s nothing that apes can do that is remotely similar to speech,” Perlman said. “And, therefore, speech essentially evolved — completely new — along the human line since our last common ancestor with chimpanzees.”

However, after Perlman and Clark examined the videos of Koko, they realized she was performing nine different—and voluntary—behaviors that required control over her vocalizations and breathing. Further, these behaviors were entirely learned; they are not part of the normal gorilla repertoire.

For example, when Koko wants treats, she can blow a raspberry into her hand. She can also blow her nose into a tissue, play wind instruments, blow onto a pair of glasses so she can clean them with a cloth, and mimic phone conversations by prattling wordlessly into a telephone.

“She doesn’t produce a pretty, periodic sound when she performs these behaviors, like we do when we speak,” Perlman said. “But she can control her larynx enough to produce a controlled grunting sound.”

Further, she has enough control over her larynx that she can cough on command—an especially impressive feat because it requires the larynx to be closed off entirely.

“The motivation for the behaviors varies,” Perlman says. “She often looks like she plays her wind instruments for her own amusement, but she tends to do the cough at the request of Penny and Ron [her caretakers].”

From this discovery, the evolution of the human ability to speak is pushed back much further, to around 10 million years ago—the time of our last common ancestor with gorillas.

“Koko bridges a gap,” Perlman says. “She shows the potential under the right environmental conditions for apes to develop quite a bit of flexible control over their vocal tract. It’s not as fine as human control, but it is certainly control.”

This study can be found in Animal Cognition.

Humans, not climate, caused giant ancient mammal extinction

The largest mammals ever to roam the Earth, gigantic ancient creatures known as megafauna, were wiped out by human activity and not as the result of a changing climate, new research from experts at four prominent UK universities has revealed.

Lead investigator Lewis Bartlett of the University of Exeter, along with colleagues from the universities of Cambridge, Reading, and Bristol, states that their findings conclusively prove that mankind was the dominant force that wiped out the massive creatures over the last 80,000 years, although they noted that climate change may also have played a minor role.

Bartlett’s team used advanced statistical analysis, running thousands of scenarios mapping the different ranges of times during which each species is known to have become extinct, as well as when humans were known to have arrived in different parts of the world. The authors then took that data and compared it to climate reconstructions from the past 90,000 years.

The results of their analysis show that man was the main reason these creatures were wiped out: across regions, the timing of megafauna extinctions tracked the arrival and spread of humans far more closely than it tracked shifts in climate.
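The shape of that comparison can be illustrated with a toy Monte Carlo sketch. Everything below is invented for illustration (the dates, the region count, and the built-in assumption that extinctions closely trail human arrival); it is not the authors' statistical model, which worked with real fossil records and climate reconstructions.

```python
import random

random.seed(1)

# Hypothetical data: for each "region", a human-arrival date and a
# megafauna-extinction date (in thousands of years ago). Under the
# human-overkill hypothesis, extinctions closely trail arrivals.
regions = []
for _ in range(1000):
    arrival = random.uniform(10, 70)              # invented arrival date
    extinction = arrival - random.uniform(0, 5)   # shortly after arrival
    climate_shift = random.uniform(10, 90)        # unrelated climate event
    regions.append((arrival, extinction, climate_shift))

def correlation(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

arrivals = [r[0] for r in regions]
extinctions = [r[1] for r in regions]
climate = [r[2] for r in regions]

print(correlation(arrivals, extinctions))  # strong, near 1
print(correlation(climate, extinctions))   # weak, near 0
```

If extinction dates are driven by human arrival, the first correlation comes out near 1 while the climate correlation hovers near zero, which is the qualitative pattern the study reports for the real data.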

How and why humans caused megafauna demise remains unknown

In a statement, the researchers explained that humanity was “the main agency causing the demise [of megafauna], with climate change exacerbating the number of extinctions.” In Asia and other parts of the world, however, they found patterns of species loss that couldn’t be accounted for through either of these two catalysts and will require additional analysis.

“As far as we are concerned, this research is the nail in the coffin of this 50-year debate,” Bartlett said. “Humans were the dominant cause of the extinction of megafauna. What we don’t know is what it was about these early settlers that caused this demise. Were they killing them for food, was it early use of fire or were they driven out of their habitats? Our analysis doesn’t differentiate, but we can say that it was caused by human activity more than by climate change. It debunks the myth of early humans living in harmony with nature.”

“Whilst our models explain very well the timing and extent of extinctions for most of the world, mainland Asia remains a mystery,” added Dr. Andrea Manica, the lead supervisor on the paper. “According to the fossil record, that region suffered very low rates of extinctions. Understanding why megafauna in mainland Asia is so resilient is the next big question.”

The findings have been published in the journal Ecography.

(Image credit: Thinkstock)

Man grows ear on his own arm to broadcast his life online


An Australian artist and Curtin University professor is growing a functional human ear on his arm, with the intention that the organ will eventually be wired to pick up the various sounds of his life and broadcast them over the Internet to a global audience.

The man’s name is Stelarc, and according to Huffington Post UK, he said that the odd project has been two decades in the making. The ear has been inserted underneath the skin of Stelarc’s forearm, and the next step is to have a miniature, WiFi-connected microphone inserted.

“This ear is not for me, I’ve got two good ears to hear with. This ear is a remote listening device for people in other places,” the professor told ABC News Australia. “They’ll be able to follow a conversation or hear the sounds of a concert, wherever I am, wherever you are.”

“People will be able to track, through a GPS as well, where the ear is,” he added. “There won’t be an on-off switch. If I’m not in a Wi-Fi hotspot or I switch off my home modem, then perhaps I’ll be offline, but the idea actually is to try to keep the ear online all the time.”

Project has been in the works since 1996

Stelarc, who leads a team at the Alternate Anatomies Laboratory at Curtin University, explained that the idea first came to him in 1996, but that it took a considerable amount of time to locate a medical team willing to perform the procedures necessary to make the project a reality.

The Huffington Post explained that doctors inserted a bio-polymer scaffold underneath his skin, and less than six months later, blood vessels and tissue began to form around it. The next step is to make it lift off the arm a little more so that it looks more three-dimensional, and to take some of Stelarc’s stem cells to grow an ear lobe.

After that, it will be time to implant the microphone. The doctors had reportedly already made one previous attempt at doing so, but the device had to be removed from the third ear because of infection. If and when it is successfully installed, audio from his life will be broadcast online.

“People’s reactions range from bemusement to bewilderment to curiosity, but you don’t really expect people to understand the art component of all of this,” the artist told ABC News Australia. “Increasingly now, people are becoming internet portals of experience… imagine if I could hear with the ears of someone in New York… [and] see with the eyes of someone in London.”

(Image credit: Stelarc)

Wanna get your ash off this planet? It’s now cheaper

If you had your heart set on having your final resting place be on the moon, but thought that the $12,500 starting rate charged by Houston-based Celestis was a little bit too steep, odds are you’ll be thrilled to learn that a second firm has announced plans to offer similar services.

According to Space.com, San Francisco-based Elysium Space announced on Wednesday that it would be partnering with Pittsburgh-based Astrobotic Technology to use the company’s Griffin lander to deliver people’s ashes to the lunar surface for the low price of just $11,950.

“From the first day we started Elysium Space and imagined awe-inspiring memorials, we thought that the Moon could create the quintessential commemoration,” Elysium founder and CEO Thomas Civeit said in a statement. “Offering this exceptional tribute within the reach of most families is an important part of this new chapter opening for our civilization.”

To sweeten the deal, Elysium is promising to knock an extra $2,000 off the price for the first 50 people who order their “Lunar Memorial” service, in which customers provide the company with a “symbolic portion” of their loved one’s cremated remains. Those remains are then placed into a personalized capsule and sent to the moon aboard Astrobotic’s lander.

Meet the companies that want to send your ashes into space

Like Elysium, Celestis also plans to fly cremated remains to the moon using the Griffin, as well as a second lunar lander developed by Silicon Valley-based Moon Express. Likewise, both firms offer alternative resting places for those who would rather not have their bodies spend eternity on the moon, including having their ashes placed in Earth’s orbit or shot into deep space.

Also, while neither company operates its own rockets, at least one of them has a track record of successful space burial missions, Space.com said. Celestis has flown 13 total missions, sending the remains of Star Trek creator Gene Roddenberry and nearly two dozen others into space, and even successfully sending a payload to the moon as part of a 1998 NASA lunar mission.

Elysium, on the other hand, plans to launch its first space burial mission later this year, when it sends remains into orbit around the planet. The lander developers, Astrobotic and Moon Express, intend to deliver payloads to the moon for government organizations, universities, and commercial partners. Both are also competing in the $30 million Google Lunar X Prize competition.

“Astrobotic’s mission is to make the Moon accessible to the world,” noted CEO John Thornton, adding that it was “a privilege to provide an experience that will allow families to commemorate and honor loved ones by directly connecting them with the Moon in the night sky.”

—–

Feature Image: Thinkstock

DNA analysis proves President Harding had daughter with mistress

Before there were tales (alleged or otherwise) of presidential infidelity involving the likes of Marilyn Monroe or Monica Lewinsky, there were rumors that Warren Harding, the married 29th President of the United States, had an affair and sired a daughter with his mistress.

Now, according to BBC News and New York Times reports, those rumors have been proven true, thanks to a genetic test that has essentially confirmed that Elizabeth Ann Blaesing, child of Harding’s alleged mistress Nan Britton, shares DNA with the president’s living relatives.

The Britton family’s claims had long been disputed by the Harding family, but nearly a century after Elizabeth’s mother first went public with her claims, genealogists have conducted genetic tests that for the first time confirmed that Harding was the child’s biological father after all.

The former president’s grand-nephew, Dr. Peter Harding, helped lead the effort to have the tests conducted and told the BBC that he was “totally jubilant” to know the truth about Blaesing. He said, “This has been a family mystery since I became aware of it [and] there was no way to really resolve it. Back in the 1920s, there was only whether someone looked like someone else.”

Experts call results conclusive, but some Hardings remain skeptical

The results of the test have been confirmed by Ancestry.com, which provided the test through its AncestryDNA service. Stephen Baloglu, an executive at the company, told BBC News that the results showed that “the family connection is definitive.” He also called it “amazing to imagine the power DNA can have in tracing one’s family story and in this case rewriting history.”

Dr. Harding, his cousin Abigail Harding, and Britton’s grandson James Blaesing were the people who had pursued the testing, hoping to finally put the issue to rest one way or another. The DNA test revealed that James Blaesing was a second cousin to Peter and Abigail Harding, meaning his mother Elizabeth Ann had to be President Harding’s daughter, the New York Times said.
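The genealogy behind that inference is simple halving arithmetic: the expected fraction of autosomal DNA shared quarters with each additional degree of cousinship, so a match at the second-cousin level (roughly 3 percent shared) is what the claimed relationship predicts. A back-of-envelope sketch, illustrative only and not AncestryDNA's actual matching algorithm:

```python
# Expected fraction of autosomal DNA shared between cousins.
# First cousins share ~1/8 on average; each additional degree of
# cousinship adds two meioses, quartering the expected share.
def expected_shared(degree):
    """Average autosomal fraction shared by cousins of this degree."""
    return (1 / 8) / (4 ** (degree - 1))

print(expected_shared(1))  # 0.125   (first cousins, ~12.5%)
print(expected_shared(2))  # 0.03125 (second cousins, ~3.1%)
```

Observed sharing varies around these averages, which is why testing multiple relatives, as the Hardings and James Blaesing did, makes the conclusion far more robust than any single comparison.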

“We’re looking at the genetic scene to see if Warren Harding and Nan Britton had a baby together and all these signs are pointing to yes,” Baloglu told the newspaper. “The technology that we’re using is at a level of specificity that there’s no need to do more DNA testing. This is the definitive answer.”

Abigail Harding told the Times that she had “no doubts left… When he’s related to me, he’s related to Peter, he’s related to a third cousin – there’s too many nails in the coffin, so to speak. I’m completely convinced.” However, Harding’s grandnephew Richard remained skeptical. “In my mind still to be proven,” he said, adding that he would welcome the new family members if the tests are conclusively proved to be valid.

—–

Feature Image: “Wharding” by Edmund Hodgson Smart. (Credit: WhiteHouseResearch.org/Wikimedia Commons)

What causes a heart attack?


In its simplest form, a heart attack occurs when blood flow to the heart is reduced or completely stopped, keeping the muscle from getting the oxygen it needs to stay alive. Most people link heart attacks to atherosclerosis, but many other common causes are ignored. With this in mind, we at redOrbit decided to walk you through some of these other factors, so you can better know your risk. And as always, if you’re concerned, please speak with a doctor.

Atherosclerosis

Atherosclerosis is the top cause of heart attacks and doesn’t always happen just because someone ate fat, like most people think. A lot of things are associated with or contribute heavily to atherosclerosis, although we don’t always know why. For example, risk factors (besides diet) include smoking, older age, high cholesterol and triglycerides, kidney problems, high blood pressure, inflammation (like from arthritis or regular infections), and high blood sugar thanks to insulin resistance or diabetes.

These risk factors often cause the walls of the coronary (heart) arteries to become damaged, which can trigger substances like blood cells and plaque (fatty deposits) to build up and harden. As plaque deposits build up, the artery becomes narrower, meaning less blood reaches the heart. That can starve parts of the muscle of oxygen, leaving portions of the heart damaged or dead.

However, it can progress from there: if the plaque ruptures, a blood clot forms around it. The clot plugs up the artery, completely blocking blood flow to the heart.

Coronary Artery Spasm

Besides something blocking the coronary artery, the artery itself can contract to the point of blood restriction (also known as a coronary artery spasm). While not all causes of this are known, they can include taking drugs like cocaine, exposure to extreme cold, cigarette smoking, and emotional pain or stress, like in the Parks and Rec episode where Leslie and Anne scare poor, poor Jerry into having a fart attack.

Stress is a major cause, especially in the long term. However, acute stress can lead to a heart attack as well. If one becomes overwhelmed by an emotion like fright, massive amounts of adrenaline are released. Martin A. Samuels, chairman of the neurology department at Brigham and Women’s Hospital in Boston, explained why to Scientific American:

“Adrenaline from the nervous system lands on receptors of cardiac myocytes (heart-muscle cells), and this causes calcium channels in the membranes of those cells to open. Calcium ions rush into the heart cells and this causes the heart muscle to contract. If it’s a massive overwhelming storm of adrenaline, calcium keeps pouring into the cells and the muscle just can’t relax.

“There is this specially adapted system of muscle and nerve tissue in the heart—the sinoatrial (SA) node, the atrioventricular node, and the Purkinje fibers—which sets the rhythm of the heart. If this system is overwhelmed with adrenaline, the heart can go into abnormal rhythms that are not compatible with life. If one of those is triggered, you will drop dead.”

Tearing & kidneys

A tear in your coronary artery may cause a heart attack as well, as it may divert a significant amount of blood away from your heart.

Problems with your kidneys can lead to heart attacks in many ways—including anemia and imbalances in phosphate and calcium that can lead to calcification of blood vessel walls.

However, some things linked to heart attacks don’t seem to have a good explanation as to why they trigger them. In a 2012 study of nearly 24,000 people, calcium supplements (but not dietary calcium) were linked to an increased incidence of heart attack. Depression is similarly linked and unexplained.

(Image credit: Thinkstock)

Robots show that mass extinctions accelerate evolution


A computer science team at The University of Texas at Austin has discovered that virtual mass extinctions push robots to evolve more quickly and efficiently, suggesting that mass extinctions speed up evolution by encouraging new features and abilities in surviving lineages.

In the world of the living, mass extinctions are associated with utter destruction and the loss of a significant amount of genetic material. This is something that’s generally seen as a negative, especially since surviving populations may be small, leading to inbreeding and eventually more extinctions.

However, some evolutionary biologists hypothesized that such events are actually more positive, as generally only the most evolvable species survive—meaning evolution can leap forward as they fill in the gaps left behind by now defunct species.

Or, as co-author and computer scientist Risto Miikkulainen phrased it, “Focused destruction can lead to surprising outcomes. Sometimes you have to develop something that seems objectively worse in order to develop the tools you need to get better.”

In order to test this theory, the researchers used simulated robot brains known as artificial neural networks (ANNs) on which scientists can use evolution-inspired algorithms in order to help the “brains” improve at a task from one generation to the next.

Evolving computer-simulated robot legs

In a computer simulation, they connected several ANNs (“brains”) to simulated robotic legs with the goal being that evolution would allow the legs to walk smoothly and stably. Like with real evolution, random mutations were introduced in the ANNs, and many niches were created for the ANNs to fill as they evolved. (Similar to biology, a niche here referred to a behavioral specialization of a robot.)

The ANNs were tested in six conditions: Control, where no extinction occurred; Extinctions 300, 600, and 900, where 90% of the population was randomly killed off every 300, 600, or 900 generations, respectively; and Extinction Random, where mass extinctions killed off 90% at a random interval between 300 and 900 generations. Each condition was tested for 5,000 generations.
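The extinction schedule just described can be sketched as a toy evolutionary loop. This is not the team's ANN-and-robot-legs setup; the fitness function, population size, and mutation rate below are invented stand-ins, and the point is only the structure: a periodic random cull of 90% of the population layered on top of ordinary selection and mutation.

```python
import random

random.seed(0)

def fitness(genome):
    # Toy stand-in for "walks smoothly": closeness to a target vector.
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(generations, extinction_interval=None, pop_size=100):
    # Start from random genomes of 10 real-valued "genes".
    pop = [[random.random() for _ in range(10)] for _ in range(pop_size)]
    for gen in range(1, generations + 1):
        # Mass extinction: randomly kill 90% of the population.
        if extinction_interval and gen % extinction_interval == 0:
            pop = random.sample(pop, max(1, pop_size // 10))
        # Selection: the fitter half of the survivors reproduce.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: max(1, len(pop) // 2)]
        # Mutation: children are noisy copies of random parents.
        pop = [
            [g + random.gauss(0, 0.02) for g in random.choice(parents)]
            for _ in range(pop_size)
        ]
    return max(fitness(ind) for ind in pop)

print(evolve(500))                           # control: no extinctions
print(evolve(500, extinction_interval=100))  # periodic mass extinctions
```

In the paper's simulations the extinction conditions evolved the better walking gaits; this toy version only demonstrates the mechanics, since the outcome of any single run is stochastic.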

The end result: the conditions including extinctions evolved the best solutions to the task of walking as compared to the control. Further, in the extinction simulations, each extinction event resulted in an indirect selection for more evolvable individuals—thereby selecting for the lineages with the most potential to produce new behaviors. Of all the conditions, Extinction 300 resulted in lineages with the greatest evolvability.

While this is extremely interesting in the context of evolution, it has practical applications as well. For example, this could lead to the development of robots that can better search for survivors in earthquake rubble, or even navigate minefields.

(Image: At the start of the simulation, a biped robot controlled by a computationally evolved brain stands upright on a 16 meter by 16 meter surface. The simulation proceeds until the robot falls or until 15 seconds have elapsed. Credit: Joel Lehman)

Faint ‘young Jupiter’ spotted with methane atmosphere


A newly discovered exoplanet that weighs just twice as much as Jupiter is believed to be the lowest-mass exoplanet ever directly imaged using a space telescope instrument, according to a new study published in the latest edition of the journal Science.

Known as 51 Eridani b, this new world is the first exoplanet to be discovered using the Gemini Planet Imager (GPI), a new imaging instrument perched on top of the Gemini South Telescope in Chile that was initially deployed in 2013. It is classified as a young Jupiter, is a million times fainter than its central star, and has an unusual atmosphere, the study explains.

“Of all the directly imaged planets so far, this is the first one where we’ve gotten a spectrum that shows methane,” Bruce Macintosh, the head of the GPI team, a professor of physics at Stanford University and a member of the Kavli Institute for Particle Astrophysics and Cosmology, said to redOrbit via email. “Methane is very common in Jupiter’s atmosphere (and all other giant planets seen so far), but has been really hard to see in extrasolar planets.”

Why has it been so difficult to see methane in exoplanets?

Macintosh believes it is likely because they tend to be either “too hot” or “too cloudy”. 51 Eridani b, by contrast, “has cooled off enough that the clouds are breaking up and the methane is stable. It’s exactly what we were looking for.”

Furthermore, the faint nature of the planet, combined with its relatively close proximity to the star it orbits, makes it “the most Jupiter-like planet ever imaged – except, of course, that it’s so young, which is why we can see it,” he added. The new planet is believed to be just 20 million years old – “still warm from energy released when it formed,” the Stanford professor noted.

Unlike Kepler, GPI can directly detect exoplanets

As mentioned above, 51 Eridani b is the first exoplanet discovered by the Gemini Planet Imager, an instrument designed to find and analyze faint young planets that orbit around bright stars. While NASA’s planet-hunting Kepler mission has successfully found thousands of never-before-seen alien worlds, the two tools use different techniques to find new planets.

Kepler searches for a loss of starlight as a planet transits, or passes in front of, its central star. GPI, on the other hand, was designed to search directly for light from the planet itself. Kepler looks for the shadow of a planet, while GPI seeks out its glow, using a process called direct imaging to find lower-mass planets in closer proximity to their stars.

“GPI was designed from the beginning just to see these planets,” Macintosh said. “We have special masks to block the light from the star and let us separate the planet from the star, and a very advanced adaptive optics system that can do an extremely good job fixing the turbulence in the earth’s atmosphere and the slight imperfections in the telescope that would hide a planet.”

“The planet light is fed into a spectrograph specifically designed for analyzing what we see in planets, so it’s nice to see it doing what it’s supposed to do,” he said, adding that he and his colleagues were currently looking at 600 young stars near the sun in the hopes that they will be able to find and study more planets.

“The big question, of course, is whether earth-like planets are common,” he concluded. “We can’t image those directly, but if we can understand how the giant planets form, that might provide clues to how smaller planets also form, and help us understand if, for example, all the Kepler ‘super-earths’ are really nice rocky planets, or ‘mini-neptunes’, or something else even weirder… it’s very clear that our solar system just isn’t typical in any way.”

(Image credit: Danielle Futselaar and Franck Marchis, SETI Institute)

Eye movements ‘change scenes’ during dreams


By recording data from individual brain cells during the rapid eye movement (REM) phase of sleep, researchers from France, Israel, and the US have found that the flickering of our eyes acts as a way to “change the scene” and create a new image in our dreams.

In research published by the journal Nature Communications, the study authors recorded brain cell activity in patients with implanted electrodes used to monitor seizures, and found that these eye movements triggered the parts of our brain involved in processing visual images during our waking hours, according to BBC News and Discovery News reports.

The discovery could help explain why people are able to remember vivid dreams when they are woken during this phase of sleep, explained co-author and Tel Aviv University neuroscientist Dr. Yuval Nir. Furthermore, it could help sleep researchers solve the longstanding mystery involving the relationship between dreaming and eye movement during slumber.

“Since they discovered REM sleep, they knew that in that state of sleep people experience vivid dreams and move their eyes frantically with their eyelids closed,” Dr. Nir explained to Discovery News, “but any attempt to relate these two phenomena has been very challenging.”

Analyzing the brain activity of epilepsy patients

The researchers worked with 19 different patients over the course of a four-year study, recording activity from electrodes located throughout the brain, but primarily in the medial temporal lobe, of people undergoing treatment for severe epilepsy. The individuals had electrodes implanted deep within their brains to monitor electrical activity taking place during seizures.

This set-up allowed Dr. Nir and his colleagues to monitor electrical activity while these patients were sleeping, matching it with their eye movements, which were recorded using stickers placed near their eyes. They discovered that rapid eye movement is followed immediately by a burst of electrical activity in the medial temporal lobe.
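The alignment behind that finding, matching each eye movement against the spike activity that follows it, can be pictured as a peri-event count. The event and spike times below are invented for illustration; real analyses use thousands of events and proper statistics.

```python
# Toy peri-event count: do spikes cluster just after eye movements?
events = [1.0, 2.5, 4.2, 6.0]  # eye-movement times in seconds (invented)
spikes = [1.1, 1.15, 2.6, 2.62, 4.3, 6.1, 0.4, 3.0]  # spike times (invented)

def spikes_near(events, spikes, start, end):
    """Count spikes falling in the window [t + start, t + end) around each event time t."""
    return sum(
        1 for t in events for s in spikes if t + start <= s < t + end
    )

before = spikes_near(events, spikes, -0.3, 0.0)
after = spikes_near(events, spikes, 0.0, 0.3)
print(before, after)  # 0 6
```

A consistent excess of spikes in the window just after the eye movements, relative to the window just before, is the kind of pattern the team observed in medial temporal lobe neurons.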

The medial temporal lobe, the researchers explained, is not directly involved in vision. Rather, it plays a key role in visual image processing by signaling to the brain about concepts, Dr. Nir told BBC News. For instance, if someone closed his or her eyes and thought about a specific object, the neurons would fire, implying “a refresh of the mental imagery and the associations.”

“[It is] more of a bridge between the visual parts… and the memory systems of our brain,” he explained to Discovery News. “Neurons in these regions are active when we view a picture of the Sydney Opera House, but also when we close our eyes and imagine the Sydney Opera House, and sometimes also even when we just hear the words ‘Sydney Opera House.’”

Some mysteries solved; others remain unanswered

Dr. Nir and his colleagues found a similar pattern of activity during sleep, and especially after the eye movements that occur during REM sleep – the phase of sleep in which we dream. While it was long believed that these movements could signify the visual components of dreams, this is the first study to find evidence that this is, in fact, the case.

“We are intimately familiar with the activity of these neurons. We know they are active every time you look at an image, or when you imagine that image. And now we see them active in a similar way when you move your eyes in REM sleep, so it becomes very probable that the eye movements represent some type of reset, or ‘moving onto the next dream frame’,” he told BBC News, comparing it to moving to the next slide when using an old-fashioned projector.

Dr. Nir’s team also reported that these parts of the brain are involved in abstract perception, not details, meaning that even if we dream of someone who is familiar, there are certain aspects that we may not remember. The activity in these neurons during REM sleep indicates that new images or concepts may be forming in our minds while we dream.

However, as Imperial College London neuroscientist Professor William Wisden pointed out to the BBC, there are still unanswered questions: “Why do we have to have REM sleep? Why does our brain have all this circuitry to do that? This paper doesn’t answer that, but it does emphasize how similar being awake and in REM sleep are, for particular circuits in the brain.”

(Image credit: Thinkstock)

When a ‘UFO’ flies by, does it upset bears?


If a person saw an unidentified flying object suddenly shoot past overhead, odds are that his or her heart would begin to beat more rapidly. Now, new research led by experts at the University of Minnesota, St. Paul has found that the same thing happens to bears.

Writing in the August 13 edition of the journal Current Biology, Dr. Mark Ditmer of the UM Department of Fisheries, Wildlife and Conservation Biology and his colleagues explained that the unmanned aerial vehicles (UAVs) used by wildlife researchers to observe creatures in their natural settings cause the heart rates of American black bears to soar.

The UAVs, which help experts monitor species (particularly endangered ones), do not appear to have any effect on the bears’ demeanor, and the creatures rarely run away or seem startled. Yet the new research found that the heart rate of bears subjected to drone flybys can rise by as much as 400 percent.

“Going in, we had four hypotheses: 1) no strong behavioral or physiological response 2) a behavioral only, 3) physiological only and 4) both a behavioral and physiological,” Dr. Ditmer explained to redOrbit via email. “Given that bears in this area are routinely exposed to human sights, sounds and smells in forms of farming equipment, agricultural areas and vehicle traffic, we expected the bears to mostly take the UAV flights in stride.”

“Therefore I hypothesized that we might see a slight rise in heart rate but I thought we would mostly just see a behavioral response,” he added. “The magnitude of some of the heart rate spikes was shocking. An adult female with cubs’ heart rate was 41 beats per minute prior to the unmanned aerial vehicle flight but it spiked to over 160 beats per minute during the flight.”

Should the use of conservation drones be discontinued?

As part of their research, Dr. Ditmer and his fellow investigators placed Iridium satellite GPS collars and cardiac biologgers on free-roaming American black bears living in the northwestern part of Minnesota. These collars sent the team an email with the location of the bears every two minutes, while the biologgers provided a consistent record of their heartbeats.

Next, they programmed a UAV to fly to each bear’s most recent location and monitored the data to see how the bears reacted to the five-minute drone flights. Over the course of 18 flights conducted in the vicinity of four different bears, only twice did the animals show any significant change in their behavior. However, each of the bears had strong physiological responses in the form of elevated heart rates, from which they recovered quickly.

Dr. Ditmer’s team said that it will now be necessary to consider the additional stress on wildlife caused by UAV flights when developing regulations governing the drones and when deciding the best scientific practices for the flying machines. In addition to being a valuable research tool, the UAV has been used to discourage poaching and to locate animals for ecotourism.

“By no means are we advocating against the use of UAVs, especially for research or conservation,” Dr. Ditmer said. “UAVs hold tremendous potential for scientific research and as tools for conservation. Our research is a cautionary tale to let people know we need to understand the trade-offs of UAV use. If using a UAV to keep poachers away causes some stress in an individual I would certainly say it is very worthwhile.”

“However, until we know which species are tolerant of UAVs, at what distance animals react to the presence of UAVs, and whether or not individuals can habituate to their presence we need to remember that their presence may have a negative impact that isn’t always obvious,” he added.

Stars you can only see in the Northern Hemisphere

Take it from Danny and Sandy, stargazing is a must on summer nights. But it can also be worth it to brave the cold as you wait for a meteor shower with a cup of hot chocolate.

Stars hint at the galaxies and whatever else lies far beyond our night sky. And because of this, we will never get tired of looking for constellations.

But our friends in the Southern Hemisphere can’t see some of our beloved constellations, and we can’t see some of theirs. If you’re an avid stargazer, you may need to start planning a trip, and we’ve compiled a list of hemisphere-specific constellations for your reading pleasure.

Stars you can only see in the Northern Hemisphere

1. Ursa Major (The Big Dipper)

If you live in the Northern Hemisphere, the Big Dipper is a stargazing staple. You just need to look north, and the seven stars that form the Big Dipper appear in the familiar shape. (Strictly speaking, the Big Dipper is an asterism within the constellation Ursa Major rather than a constellation itself.) Chances are, this was the first star pattern you learned. According to Space.com, the Big Dipper (or Plough) is one of the most important sights in the night sky. For anyone at the latitude of New York or higher, it never goes below the horizon. To see the Big Dipper in all of its big dipper glory, you must be north of latitude 25 degrees south.

2. Cassiopeia (The Queen)

Looking like a flat “W” pressed against our Milky Way, Cassiopeia is best seen in the late fall and winter months. The star in the middle of the constellation, Gamma Cassiopeiae, is about 15 times the size of the sun and, with all of its energy output added up, some 40,000 times brighter. Cassiopeia is home to a large accumulation of young stars.

3. Cepheus (The King)

Cepheus is an old constellation, catalogued by Ptolemy in the 2nd century, and it looks more like a house than a king. The star at the very top of the house-like structure is a Cepheid, a type of giant star used as a reference point for measuring distances. If we lived on Mars, this would be like our North Star. Cepheus also contains the hyperluminous quasar S5 0014+81, which hosts one of the most massive black holes known.

4. Ursa Minor (The Little Dipper)

Ursa Minor is best known for the star at its tail: the North Star, or Polaris. It earned that name because it sits almost directly above Earth’s north celestial pole, so it never seems to budge from its spot at the end of the Little Dipper. It is the brightest star in the constellation and the brightest Cepheid in the night sky.
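The latitude cutoffs quoted for the Big Dipper follow from simple geometry: a star whose declination is d degrees never sets from northern latitudes above 90 − d, and never rises at all from latitudes below d − 90. A quick sketch, using approximate declinations for the Dipper's seven stars:

```python
# Approximate J2000 declinations (degrees) of the Big Dipper's stars.
DIPPER = {
    "Dubhe": 61.8, "Merak": 56.4, "Phecda": 53.7, "Megrez": 57.0,
    "Alioth": 55.9, "Mizar": 54.9, "Alkaid": 49.3,
}

def circumpolar_above(dec):
    """Lowest northern latitude at which a star of this declination never sets."""
    return 90.0 - dec

def ever_visible_above(dec):
    """Lowest latitude at which a star of this declination still rises."""
    return dec - 90.0

# The whole asterism never sets where even its southernmost star
# (Alkaid) is circumpolar; it is fully visible only where its
# northernmost star (Dubhe) can still clear the horizon.
always_up = max(circumpolar_above(d) for d in DIPPER.values())
fully_visible = max(ever_visible_above(d) for d in DIPPER.values())
print(round(always_up, 1))      # 40.7  (about New York's latitude)
print(round(fully_visible, 1))  # -28.2 (latitude 28.2 degrees south)
```

The first number lands almost exactly on New York City's latitude of 40.7 degrees north, which is why the Dipper never sets from there; the second sits near the article's 25-degrees-south rule of thumb, with a few degrees of margin for a usable view above the horizon.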

Click here for Stars you can only see in the Southern Hemisphere

—–


Despite recent successes, is NASA in trouble?


With stories of the Pluto flyby, the in-depth analysis of the dwarf planet Ceres, and the ongoing exploration and search for life on Mars, it might seem as though NASA is at the top of its game right now, and close to recapturing the glory days of its past.

However, BBC News reports suggest that the organization could actually be in serious trouble, and that while the New Horizons mission has been one of its larger triumphs in recent memory, the US space agency itself has faced “upheaval and a funding crisis” since its launch.

“Some of the agency’s robotic exploration projects have been mismanaged and over budget, leading the space agency to cut some of its planetary exploration missions,” the British media outlet said, specifically mentioning the James Webb telescope and Mars Curiosity rover.

Designed to be the successor to the Hubble Space Telescope, the James Webb telescope was supposed to launch in 2011 at a cost of $1.6 billion. It is currently not scheduled to launch until 2018, and the price tag has soared upwards of $8 billion. As for Curiosity, while the rover has undoubtedly been a success, it was also over budget and three years late.

Delays, cancelled collaborative missions could be warning signs

In a story published back in 2010, Nature called the Webb instrument “the telescope that ate astronomy” and called it “the key to almost every big question that astronomers hope to answer in the coming decades.” Without it, they said, most of the science goals of the decade would be “unattainable.” Since then, its launch has been pushed back seven years (so far).

This sort of thing has become business as usual for the US space agency, former NASA scientist Keith Cowing, author of the blog NASA Watch, told BBC News. “As upset as NASA proclaims to be when these overruns happen, they just go off and do another one,” he said. “It is an ongoing chronic issue with NASA,” Cowing added, calling their financial management “a mess.”

Money issues and other factors have caused some planned collaborations between NASA and the European Space Agency (ESA) to fall apart over the past few years, the UK media outlet added. Affected missions included a plan to explore the icy moons of Jupiter (EJSM/Laplace) as well as a joint mission to the Red Planet (ExoMars) that the ESA plans to pursue on its own.

“NASA has had a series of successes, notably the landing of rovers on Mars,” Professor John Logsdon of George Washington University told BBC News. “But the planetary exploration program has struggled for adequate funding. Its funding has been cut by between 10 percent and 15 percent, and no flagship missions seem to have been put in place under Obama.”

“The James Webb is a big hiccup in the progress of robotic science missions – we are in this period of re-establishing our human space flight capability and getting ready to explore,” he added. “NASA is recovering and doing well in the missions that it is involved with. So I think the outlook is more positive than not.”

(Image credit: Thinkstock)

Understanding the interplay between Kadian and Fibromyalgia Symptoms

Kadian and fibromyalgia symptoms are getting a lot of coverage lately thanks to some recent studies that have shown that this opioid analgesic can be very effective in relieving pain.

It can be helpful to understand more about the pain of fibromyalgia, and how Kadian and other opioids work, before deciding whether you want to try this medication for symptom relief, or just keep it as a backup pain reliever.

Kadian and Fibromyalgia Symptoms

Fibromyalgia may be many things, but it’s about chronic pain first

There is a large set of symptoms that come with fibromyalgia, but one of the most striking characteristics of this disorder is its chronic, pervasive pain. The pain is similar to bursitis or arthritis pain and can take the form of stiffness, inflammation, or sharp stabbing pains.

The experience of living with chronic pain may also contribute to the other common cluster of symptoms: IBS, depression, and fatigue.

First, you have to understand the rules of the fibromyalgia flare-up

People who don’t have fibromyalgia don’t understand how it really works. When someone refers to having a fibromyalgia flare-up, it can cover a whole range of symptoms – not just muscle pain.

Sometimes the symptoms may be focused on mood, cognition, and digestion; sometimes it will be stiffness and pain. And even then, each flare-up varies along a scale of intensity rather than following a predictable pattern. This is what can make finding an effective medication so difficult.

Then you have to understand why opioids work, and why they sometimes don’t

If you talk to a cross-section of people who suffer from fibromyalgia, you will find that they have a range of experience with different pain-relieving drugs, opioid and non-opioid alike.

Some people have a natural resistance to opioid drugs; others respond so strongly that they are considered sensitive to them.

For most people, the general rule of thumb with opioids is that the first dose will greatly relieve your pain, but each consecutive dose will affect you less, because you build up a tolerance to the drug.

The answer isn’t as easy as just switching to another opioid, like going from Kadian to Percocet, because not all opioids are designed to do the same thing.

Aren’t all opioids going to relieve pain?

No. Opioids are powerful drugs that work on various nerve-transmission processes in the body and the brain. They rarely work to relieve the source of pain, but they do work to block pain signals, and that is a very important difference.

If your pain is more associated with inflammation, this may not be the right medication for you, as it will not help get the cause of your pain under control.

What about the risks of addiction?

It is no secret that the biggest risk with any opioid medication is addiction. The problem with that statement is that it doesn’t reveal how subtle addiction can be.

Opioids work within the body, but only for so long. If you take too many of them, or take them too often, your tolerance will increase. Once tolerance rises, it will take more and more of the drug to get the effect you are seeking.

When it comes to using an opioid analgesic to relieve pain, the other problem is that many people are using them not just for pain management, but to try to eradicate the pain altogether.

That is too much too often. Your goal should be to lower your pain to a tolerable level. Chronic pain isn’t something that will ever go away, but you can find better ways to live with it.

Why is Kadian so effective?

One of the reasons that Kadian is singled out as the opioid analgesic of choice for those with fibromyalgia is that the medicine has a different delivery structure than most.

In its injected form, Kadian is highly effective because the medication is held in micro-spheres that break down uniformly within the body.

This means the medication is released evenly wherever it was placed. With many other opioid analgesics, you may have an uneven experience of pain relief, as they break down less uniformly.

None of the studies have looked specifically at whether the tablet or capsule form of Kadian is as effective. In the studies, Kadian was administered by injection, and by a licensed health care professional, which may have greatly influenced the pain-relief outcomes.

What if I don’t want to go the opioid route?

Taking any kind of opioid can be frightening. While it can remove the pain, the side effects can be hard to handle. Many people prefer to keep them as a last resort only.

There are some non-opioid pain relievers, like Vistaril and Indomethacin, that can be surprisingly effective for some. Alternative therapies have also been studied extensively with fibromyalgia and have proven effective for many patients.

Homeopathic and alternative remedies for fibromyalgia that work

If you want to avoid taking opioids, or are medication-averse, you may want to explore the different homeopathic treatments for fibromyalgia that have a good history of providing relief.

For many people, the question isn’t as simple as whether or not taking Kadian will relieve their fibromyalgia symptoms; it’s about finding a range of tools to relieve their pain.

Flare-ups come in a wide range of intensities, and you may not want to routinely take something you could build a tolerance to, especially if you need it for the more severe attacks. This is why looking at everything, and trying it all, may be the best route to take.

Acupuncture, meditation and homeopathic treatments may bring your pain levels down enough so that you don’t have to take a medication every day, and when you have to take it – it will work most effectively.

Further reading:

http://www.fibromyalgia-symptoms.org/kadian.html

http://www.rxlist.com/script/main/rxlist_view_comments.asp?drug=kadian&questionid=fdb1509_pem&page=3

http://www.fmcpaware.org/medication/fibromyalgia-and-chronic-opioid-analgesic-therapy

Nutritionist: Is the Bulletproof Diet actually healthy?

Dave Asprey, founder of The Bulletproof Diet, would like to dispel the notion that, “Everything in moderation is the key to success when dieting.”

In fact, he believes nothing could be further from the truth. The Bulletproof Diet is based on the theory that harmful “antinutrients”, or toxins, should be the major analytical tool when evaluating dietary patterns and the nutritional value of foods. Foods that contain the least antinutrients should be valued as the best for our health, decreasing hunger and improving hormonal regulation, while foods that contain a small amount of antinutrients may be an acceptable choice for some individuals, depending on tolerance. Further, true toxins, or in his words “kryptonite”, should be avoided at all costs.

During one of his podcasts, Asprey described a scenario that made this idea hit home. To paraphrase: science and the general population have widely accepted the notion that bacteria and viruses (invisible to the naked eye) can bring someone to their knees, or even their death bed. Traditional nutrition science, however, is more apt to write off an undetectable chemical or compound that is harming us as too small to do any real damage. In the Bulletproof paradigm, moderation has no value when it comes to kryptonite.

So what is kryptonite to Bulletproofers?

Here are the major classes and Asprey’s rationale for why they should be avoided:

  1. Fructose: Found in fruit, honey, and high-fructose corn syrup. Reported to promote resistance to leptin (the hormone that signals satiety), which increases food cravings and triglyceride levels.
  2. Sugar: Reported to cause dopamine resistance, which means decreased satisfaction from ever-higher amounts of sugar (a.k.a. pleasure), to alter insulin production over time, resulting in chronic fatigue and lack of focus, and to promote fat storage.
  3. Processed Foods: Items such as chips, salad dressings or sauces, cookies, and cereals. The argument is that processed foods all contain some chemical alteration of the natural food and do not metabolize efficiently in the body.
  4. GMO Ingredients: In Asprey’s words, “If you define processed as chemically altered, then GMOs are indeed as processed as any other Frankenfood.” GMOs are allegedly linked to immune and reproductive issues, and long-term studies are not available to prove or disprove their harmfulness.
  5. Vegetable Oils: Asprey categorizes fat in a much different way: chemically stable versus unstable (easily oxidized). Vegetable oils (for example canola, corn, safflower, soybean, and peanut) are considered inflammatory, not only due to their capacity to oxidize but also due to their high proportion of omega-6 fatty acids.
  6. Grains: Wheat and gluten, in particular, are accused of acting as an opiate that promotes a cycle of food cravings. Gluten is also linked to chronic inflammation, gut dysfunction, decreased thyroid function, and malabsorption of vitamin D.

Ok, but what are the elements that make these bad for some people?

Asprey’s suspect list includes antinutrient-rich foods that MIGHT be tolerable for certain individuals, but act like kryptonite in others. These antinutrients come in the form of four chemical classes: lectins, phytates, oxalates, and mold.

Plant lectins come into contact with the gut by way of beans, nuts, grains, and some nightshade vegetables. Sensitivity to lectins varies from individual to individual. The major repercussion of lectins is hormonal resistance (to leptin) that results in the body’s inability to receive the “I’m full” signal.

Phytates are part of some plants’ immune systems, and attach themselves to harmful compounds, inhibiting their uptake. What seems like a good thing is actually harmful in humans, as it decreases mineral absorption and promotes gut disruption. Phytates are found in some whole grains, nuts, and seeds.

Oxalates are best known for their ability to form kidney stones. Oxalates are found in some traditional superfoods: kale, chard, and spinach. When oxalates come into contact with calcium in the body, crystals form and accumulate in less-than-desirable locations, resulting in muscle pain and weakness.

Finally: mold. Mold is the most common antinutrient in the environment. It can be found in a variety of food crops, from corn and wheat to wine. Mold toxicity in small, chronic doses can lead to lack of focus or even cardiomyopathy and brain damage. Asprey recommends that each of these antinutrients be tested against every individual’s own gut to determine whether that person is sensitive.

Here is a further listing of Asprey’s suspect foods:

  1. Vegetables: Nightshades (artichokes, tomatoes, bell peppers, eggplant, cayenne peppers), green beans, garlic, beets, and peas
  2. Fats: extra virgin olive oil, palm and palm kernel oil, unheated nut oils, pastured bacon, grain-fed butter, pastured duck fat, pastured chicken fat
  3. Proteins: pastured poultry, pork, duck, factory-farmed eggs, whey protein isolate, sprouted legumes

My thoughts on this as a nutritionist

The Bulletproof creator has effectively helped to alter the paradigm of nutritional analysis. He has proven through valid and current research that there are elements in our food that are toxic to our bodies. And, for the most part, we are not paying attention to these.

For example, one patient in recent memory came into my office seeking advice about his kidney stones. I analyzed his diet and noticed a very high consumption of almonds. Even though almonds are a great source of vitamin E, they are also among the highest dietary sources of oxalates. He did not want to stop eating them because he had been assured of how HEALTHY they were for him. To him, the almond was “healthy” because of its concentration of “good stuff”; thus, it could have absolutely no “bad stuff”, perfectly illustrating that HEALTHY is relative to everyone.

The point that I would like to emphasize is that all humans are composed differently. We cannot make a hard-and-fast rule that one thing is bad for all humans across the board. It is simply not that black and white. Secondly, I believe a bigger spotlight should be placed on body awareness. Knowing what feels good to your body and knowing what your body needs would be the best medicine. The caveat is that one must know where possible problems may lie, and this is why I think Dave Asprey is doing groundbreaking work.

Continue on to Part 2 of this series.

Mari-Chris Savage is a licensed and registered dietitian with years of experience in nutritional consultation and corporate wellness. As a nutrition specialist, Mari-Chris has consulted individuals on improving their current health status and focusing on preventative methods for a lifetime. Mari-Chris also holds a certification in personal training to promote physical activity guidance and motivation. Her background spurs an extremely active lifestyle filled with running, hiking, barre, and yoga classes. For more from Mari-Chris, check out the first part of this series: “What is the Bulletproof Diet?”

500-year-old sea monster figurehead raised from the depths

The sea monster figurehead from a 15th-century ship believed to be the best-preserved vessel from the late medieval period has been raised from the Baltic Sea near Sweden, approximately 500 years after the Danish warship sank while anchored at a port in Ronneby.

Johan Rönnby, a professor of maritime archaeology at Södertörn University, told BBC News that the creature on the figurehead is “some kind of fantasy animal” which looked like “a dragon with lion ears and crocodile-like mouth. And there seems to be something in his mouth.”

The figurehead belonged to the Gribshunden, a 15th-century warship owned by Danish King Hans; it weighed 660 pounds and was carved from the top of an 11-foot-long beam, according to Discovery News. The vessel reportedly went down in 1495 after it caught fire while en route from Copenhagen, Denmark to Kalmar on the eastern coast of Sweden.

Well-preserved wreckage could reveal much about ancient ships

The Gribshunden, a contemporary of Christopher Columbus’ flagship Santa Maria, is believed to be the best-preserved specimen of a 15th-century shipwreck ever discovered, as experts said that precious few wrecks from that era escape the ravages of sea worms, the website added.

“The ship comes from a time just when Columbus was sailing across the ocean and Vasco da Gama also went to India,” Marcus Sandekjer of the Blekinge Museum, a group involved in the salvage effort, told Reuters. “This is the same period and we can learn very much about how the ships were made, how they were constructed since there are no ships left from this time.”

“It’s unique in the world and I think there are going to be more excavations around here and we’re going to find some more unique objects,” he added. “But this… today is just fantastic.”

Sandekjer, Rönnby, and the rest of their team said they hope to bring more of the wreck of the Gribshunden to the surface in the near future. The location of the discovery is fortuitous, as the ship was found in the Baltic Sea’s brackish waters, which are unattractive to sea worms. Armor and weapons have previously been recovered from the wreckage, ABC News Australia reported.

—–

Feature Image Credit: Södertörn University

Chinese ‘graffiti’ tells tale of 500 years of climate change

Researchers from the University of Cambridge, along with an international team of colleagues, have discovered a unique set of inscriptions on the walls of a cave in central China that tells how drought affected the population living there over the span of five centuries.

As reported in the journal Scientific Reports, the Cambridge-led team used inscriptions from the walls of Dayu Cave in the Qinling Mountains to detail the impact of seven drought events which took place over a 400-year span from 1520 to 1920. That information, combined with an in-depth chemical analysis of stalagmites in the cave, painted a detailed picture of those events.

Co-author Dr. Sebastian Breitenbach of Cambridge’s Department of Earth Sciences explained to redOrbit that people would enter the cave in groups of 100 or more to pray for rain, and to give a written account of their activities on the cave walls. Much to his surprise, his team found that the geochemical reconstruction closely followed the record left behind by the population.

Dr. Breitenbach called the discovery “amazing” and said that it marked “the first time we see such a clear cut link between our geochemical evidence and historical information right in the very same place.” The link was made possible in part due to “the high chronological precision” of the dating process used by the research team as they analyzed stalagmite composition.

Combining historical accounts and chemical factors

“What I take as two of the most important implications from our study,” he said in an email, “is that we have now for the first time a direct link between historic information (the inscriptions) and our palaeoclimatic reconstruction from the same place,” which, he added, “makes the interpretation of the geochemical evidence much more robust and less ambiguous.”

Previous research conducted in caves and lakes in China has found a possible link between global warming and the downfall of several Chinese dynasties, including the Tang, Yuan, and Ming Dynasties, the researchers said in a statement. Some of the inscriptions found in Dayu Cave show that residents prayed for rain during droughts in 1528 and May 1891.

Both of those dates coincide with drought events in China, the study authors said. The lack of rainfall in the 1890s led to severe starvation and local social instability, culminating in a conflict between the citizens and the government in 1900. They also noted that the drought of 1528 led to a lack of food, starvation, and even some reports of cannibalism. Evidence of such drought events was found in the form of specific elemental concentrations in the cave itself.

According to Dr. Breitenbach, his team looked at several different chemical factors, including the ratios of heavy-to-light stable oxygen isotopes (δ18O) and carbon isotopes (δ13C), as well as strontium-to-calcium ratios in the stalagmites. The oxygen and carbon isotopes in stalagmites reflect the isotope composition of the parent drip water from which they formed, he explained, and this drip water itself originated from the rainfall coming in from above the cave.
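Isotope ratios like these are conventionally reported in delta (δ) notation: the per-mil (parts per thousand) deviation of a sample’s heavy-to-light ratio from a reference standard. A minimal sketch of that arithmetic (the sample ratio below is hypothetical; the reference value is the standard VSMOW 18O/16O ratio):

```python
def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Delta notation: deviation of a sample's heavy/light isotope
    ratio from a reference standard, in parts per thousand (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# The 18O/16O ratio of the VSMOW standard is about 0.0020052.
VSMOW = 0.0020052
sample_ratio = 0.0019932  # a hypothetical, isotopically "light" sample

d18O = delta_per_mil(sample_ratio, VSMOW)
print(round(d18O, 2))  # negative: the sample is depleted in heavy 18O
```

Droughts shift these delta values in the drip water, which the stalagmites then record as they grow.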

What the elemental content tells about the cave’s formation

Drip water has a certain isotopic composition which depends on a combination of factors known as its “moisture history,” he explained, which itself depends upon the water’s place of origin, the length of its transport path, the amount of rainfall and other factors. Also, the δ18O signal is only understandable within the context of modern climatic conditions, he noted.

Likewise, the carbon isotope signal in stalagmites is complex, but less dependent upon the cave’s temperature, and changes along with variations in the vegetation cover above the cave, as well as the CO2 composition within the soil. In this particular region of China, however, δ13C is altered in correlation with changes in drip rate within the cave. In short, droughts show up as intervals in which there are less negative (heavy) δ13C values, which is what the research team found.

During dry periods, calcium is removed from the small amount of water that enters the cave, Dr. Breitenbach said, which is why strontium-to-calcium ratios can be associated with drought. When calcium precipitates out before the water reaches the cave, strontium becomes enriched in the remaining water, which then carries it into the cave and, eventually, into the stalagmites.

“The inscriptions are amazing in that they give us direct information on the climate conditions above the cave at the time they were written. Because multiple factors can influence our geochemical proxies, their interpretation in terms of drought or flooding or other environmental changes might be questioned without extra information,” he said. “Here we have a completely independent account on what happened at the time – people tell us about their misery.”

Dr. Breitenbach added that the discovery “is a stark reminder of the influence climate has on us as society, and the vulnerability of civilization to even relatively small changes in climate. That our highly industrialized lifestyle is quite different from pre-industrial society in China is clear, but bearing the drought in California in mind, it is evident that sustained shifts in hydrological pattern can very severely impact large populations – especially so in the developing world.”

—–

Feature Image: This is an inscription from 1891 found in Dayu Cave. It reads: On May 24th, 17th year of the Emperor Guangxu period (June 30th, 1891 CE), Qing Dynasty, the local mayor, Huaizong Zhu led more than 200 people into the cave to get water. A fortuneteller named Zhenrong Ran prayed for rain during a ceremony. (Credit: L. Tan)

Iceland hits geothermal jackpot – will Japan be next?

Geoscientists in Iceland have produced the first realistic simulation of how an energy-rich “magmatic intrusion” is created. The intrusions occur when a rising mass of viscous magma becomes stuck in the bedrock and the water above the intrusion reaches a “supercritical” temperature, so hot that it creates a reservoir capable of producing massive amounts of geothermal electricity.

The new model predicts these natural phenomena may be widespread in highly volcanic areas like Japan and New Zealand. Somewhere out there is a lot of super-efficient energy just waiting to be tapped, and the Icelandic simulation makes finding it a whole lot easier.

Icelanders love their thin crust

Iceland lies on the Mid-Atlantic Ridge where the Eurasian and American continents are drifting apart. The earth’s crust below Iceland is less than four miles thick in places, and magma can easily reach the surface, giving the country its unique volcanic landscape and abundance of geothermal energy.

In 2008, Icelandic researchers from the Iceland Deep Drilling Project (IDDP) found the first magmatic intrusion while drilling a borehole on the Krafla lava field: their drill heads kept getting stuck on something, and directly above the magma chamber was a geothermal reservoir of water at a “supercritical” temperature of 450 degrees Celsius. (Above 374 degrees Celsius, water’s gas and liquid phases become indistinguishable, and the resulting fluid can be as dense as a liquid but still flow as easily as a gas.)

While standard geothermal boreholes only have a capacity of 3 to 5 megawatts, a single borehole in a magmatic intrusion contains enough thermal energy to produce 35 megawatts.

Freak of nature or untapped bounty?

The team wondered if the discovery was just a freak of nature, or if it might mean such reservoirs were widespread. To answer these questions, three ETH Zurich geoscientists simulated this unusual geothermal system using a new computer model. This helped them understand how it occurs and under what conditions. It should also help the search for other similar systems. Their results and the associated modeling have been published in Nature Communications.

“The simulations offer a realistic representation of this reservoir’s behavior, even though we kept the model as simple as possible and built in only a few parameters,” said Thomas Driesner, senior lecturer at ETH Zurich’s Institute of Geochemistry and Petrology.

Driesner said the most satisfying thing about the simulations was that they reproduced what the Icelandic IDDP researchers observed at the original borehole. His doctoral student Samuel Scott used the simulations to show that the formation of such a geothermal reservoir depends on the permeability of the surrounding rock. At 1,000 degrees, the magmatic intrusion is absolutely impermeable to water.

If surrounding rock is highly permeable, water can easily rise through it, taking heat away from the intrusion and cooling the magma more quickly. But if the surrounding rock is only slightly permeable, the water remains trapped above the intrusion and heats up to beyond its critical point. In such a scenario, the magma chamber also stays hot for longer.

Basalt, the perfect incubator

Where temperatures and pressures are high, the host rock can change from brittle fracturing to plastic flow, closing up the cracks and fissures through which water might flow. Different types of rock become “plastic” at different temperatures. Basalt, a typical volcanic rock, becomes plastic and impermeable at around 500 to 800 degrees. But this transformation occurs in more silicon-rich granite at just 350 degrees, so magma intrusions in basaltic rock are more likely to create a geothermal reservoir.

“That’s why we expect this type of heat reservoir to be more common than previously thought in volcanic areas such as Iceland, New Zealand, or Japan,” concluded the researchers.

“The model gives us some idea of the criteria by which these zones develop, and how to recognize them,” Driesner explained.
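As a toy illustration of the thresholds described above (the transition temperatures come from this article; the function itself is only a sketch, not part of the ETH Zurich model):

```python
# Approximate brittle-to-plastic transition onsets (degrees Celsius),
# as quoted in the article; basalt's transition spans roughly 500-800 C.
PLASTIC_ONSET_C = {
    "basalt": 500.0,   # typical volcanic rock, transitions late
    "granite": 350.0,  # silicon-rich granite transitions much earlier
}

def is_impermeable(rock: str, temperature_c: float) -> bool:
    """Toy check: above its transition onset, rock flows plastically,
    closing the cracks and fissures through which water might flow."""
    return temperature_c >= PLASTIC_ONSET_C[rock]

# Near a hot intrusion, granite seals off while basalt is still brittle,
# which is why basaltic settings favor trapped supercritical reservoirs:
print(is_impermeable("granite", 400))  # True
print(is_impermeable("basalt", 400))   # False
```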

The Icelandic researchers will soon drill a second test borehole on the Reykjanes Peninsula and the search for more “jackpot” reservoirs continues.

(Image credit: Thinkstock)

ADD vs. ADHD: What’s the difference?

Despite ADD/ADHD being one of the most common neurodevelopmental disorders in childhood—around 11% of U.S. children age 4-17 have ever received a diagnosis of it, according to a 2011 CDC statistic—few people seem to know one really basic fact about it. So what the crap is the difference between ADD and ADHD?
Turns out, there really is no difference between the two, except for the fact that one of them technically isn’t a thing anymore. There are mentions of what appears to be the disorder as far back as 1798, and it seems no one has been able to settle on a name or criteria since; ADD and ADHD are just the most recent attempts to name and categorize it.
But which one is right, then?
ADD, or attention deficit disorder, was first recognized by that name in 1980 when it was added to the DSM-III (the third edition of the handbook used by healthcare professionals to diagnose mental disorders). The disorder had been in previous DSM versions, but had not been referred to as ADD. Further, in the DSM-III, the criteria for diagnosis had been altered to strike out hyperactivity as a main criterion, noting that those with ADD did not always have hyperactivity, but rather had issues with attention and impulse control.
ADD became ADHD (attention deficit-hyperactivity disorder) only seven years later, in the revised version of the DSM-III. In this version—after more research was conducted on the matter—the criteria for diagnosis changed again. This time, the symptoms of inattention, impulsivity, and hyperactivity were combined into one list of possible symptoms, and diagnosis required a certain score from the list.
The name has remained the same since 1987, so technically, referring to ADHD as ADD is incorrect. The most recent DSM (the fifth edition) uses yet another set of diagnostic criteria, requiring at least five or six inattention symptoms and/or five or six hyperactivity-impulsivity symptoms, depending on age, plus a set of other conditions. The full criteria list can be found on the CDC’s website.
Based on these new criteria, there are three kinds of ADHD one can have: combined, meaning they have five or six each in the inattention and hyperactivity-impulsivity categories; predominantly inattentive, signifying a significant score in inattention but not hyperactivity-impulsivity; and predominantly hyperactive, meaning the opposite.
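The symptom-count logic behind these three presentations can be sketched as a toy function. This only illustrates the counting rule summarized here, not a diagnostic tool; an actual DSM-5 diagnosis involves many further conditions (duration, multiple settings, impairment):

```python
def presentation(inattention: int, hyperactivity: int, age: int) -> str:
    """Toy sketch of the DSM-5 symptom-count rule described above:
    children need at least 6 symptoms in a category, while people 17
    and older need at least 5. Real diagnosis requires more criteria."""
    threshold = 5 if age >= 17 else 6
    inatt = inattention >= threshold
    hyper = hyperactivity >= threshold
    if inatt and hyper:
        return "combined"
    if inatt:
        return "predominantly inattentive"
    if hyper:
        return "predominantly hyperactive-impulsive"
    return "symptom-count criteria not met"

print(presentation(inattention=7, hyperactivity=2, age=10))
# prints "predominantly inattentive"
```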
Often, ADHD fades away as children age, but an estimated 4.4% of adults have ADHD—which is associated with higher levels of unemployment and divorce, as well as drug and alcohol abuse and mental disorders like depression. These problems can be helped though; treatment for adults and children is very similar, including medication, psychotherapy, and treatment of concurrent mental health issues like depression.
(Image credit: Thinkstock)

This telescope warns us of asteroid impacts and saves our butts

The first of two telescopes designed to help astronomers spot potentially dangerous space rocks as part of an asteroid-detection system has been successfully installed at an observatory atop the Haleakala volcano in Maui and is now fully operational.
The instrument is known as the first Asteroid Terrestrial-impact Last Alert System telescope or ATLAS 1, and according to Space.com, it is one of two telescopes developed at the University of Hawaii and funded by NASA that will be used to protect Earth from potential asteroid impact.
ATLAS project representatives confirmed late last month that ATLAS 1 had been mounted and assembled inside the Haleakala Observatory’s ASH dome in what they referred to as “a series of remarkably smooth operations” and that it was “working well and producing useful images.” The telescope was expected to provide full resolution images once a few adjustments were made.
Warning system will provide enough time for evacuations
Once the ATLAS project is completed later this year, it will consist of two telescopes located about 100 miles (160 km) apart. The instruments will automatically complete several scans of the entire sky every night in search of moving objects, serving as an early warning system for nearby objects that could pose a danger to cities, towns, or counties on Earth.
It is hoped that ATLAS will be able to provide a one-day warning when a 30-kiloton asteroid is on a trajectory that could result in an impact with the planet’s surface, a one-week warning for a larger five-megaton asteroid, and three weeks for a massive 100-megaton rock, according to Space.com.
University of Hawaii Institute for Astronomy professor and ATLAS team member John Tonry previously stated, “That’s enough time to evacuate the area of people, take measures to protect buildings and other infrastructure, and be alert to a tsunami danger generated by ocean impacts.”
The second telescope will be placed on Mauna Loa, a volcano located on the main island of Hawaii. ATLAS team officials were expected to meet with officials from the US space agency and South Africa during the International Astronomical Union meeting in Honolulu, the website said. Those talks will reportedly be focused on a possible third telescope in South Africa.
(Image credit: Asteroid Terrestrial-impact Last Alert System Team)

Octopus genome reveals cephalopod secrets

 

Scientists are one step closer to finding the genes responsible for the unusual biology of the octopus – including the cephalopod’s ability to change skin color – after successfully sequencing the genome of a type of the creature commonly found in California.

The genome sequencing effort, which will be detailed in the August 13 edition of the journal Nature, was led by scientists from the University of California, Berkeley, the Okinawa Institute of Science and Technology Graduate University (OIST), and the University of Chicago.

The researchers examined and annotated the genetic code of the common California two-spot octopus (Octopus bimaculoides), and found a series of significant differences between the DNA of this creature and the DNA of other invertebrates, they explained in a statement.

Project co-leader Dr. Daniel Rokhsar, a professor of molecular and cell biology at UC Berkeley, told redOrbit via email that he and his colleagues “found a family of related genes, called reflectins, that have been implicated in skin color change. These are found only in cephalopods. However, we don’t know how the changes in skin color are controlled.”

Genes for complex neural circuit, independent arm movement discovered

In addition, Dr. Rokhsar said that his team also discovered “a large family of genes that, in vertebrates, are known to enable complex neural circuits to form. Invertebrates typically have only a few of these genes. Even though octopus and vertebrates (including humans) both have a great variety of such genes… the way that this gene diversity is set up in cephalopods is completely different from how vertebrates do it. This is an example of convergent evolution.”

Essentially, the genome research revealed that the nervous system of the octopus is organized in a completely different way than that of humans. The octopus’s central brain surrounds its esophagus, a feature commonly found in invertebrates, but the animal also has groups of neurons in its arms that allow them to move relatively independently and autonomously. Better understanding how the brain of the octopus works with each of its eight arms could help engineers develop new flexible, prehensile arms for robots that could outperform jointed ones under water.

“The octopus genome makes studies of cephalopod traits much more tractable, and now represents an important point on the tree of life for comparative evolutionary studies,” co-author Clifton Ragsdale, associate professor in neurobiology, organismal biology, and anatomy at the University of Chicago, explained in a statement. “It is an incredible resource that opens up new questions that could not have been asked before about these remarkable animals.”

Dr. Rokhsar added that the genome of the octopus was “in general… much larger and more scrambled than other invertebrate genomes that have been studied to date.” Among the unusual discoveries was the fact that the ‘hox’ or ‘homeotic’ genes (genes which control the body plan of an embryo along the anterior-posterior axis) of an octopus were “dispersed,” not “organized in a very well-controlled cluster” as in other genomes.
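The “dispersed versus clustered” distinction for Hox genes can be illustrated with a toy check on gene coordinates; the span cutoff and data layout here are illustrative assumptions, not values from the study:

```python
def hox_arrangement(genes, max_span=500_000):
    """Decide whether a set of Hox genes looks clustered or dispersed.

    `genes` maps gene name -> (scaffold, position in bp). The genes are
    called clustered when they all sit on one scaffold within `max_span`
    bases (an illustrative cutoff, not a published one).
    """
    scaffolds = {s for s, _ in genes.values()}
    if len(scaffolds) > 1:
        return "dispersed"
    positions = [p for _, p in genes.values()]
    return "clustered" if max(positions) - min(positions) <= max_span else "dispersed"
```

By this kind of test, vertebrate Hox genes come out clustered, while the octopus genes, scattered across the genome, come out dispersed.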

(Image credit: Thinkstock)

Stripes may not protect animals from predators after all

Contrary to popular belief, stripes may not provide zebras or other animals that typically live in groups with any kind of protection from predators, researchers from the University of Cambridge reported Wednesday in the open-access scientific journal Frontiers in Zoology.

As part of their study, Anna Hughes, a Ph.D. student at Cambridge’s Department of Physiology, Development, and Neuroscience, and her colleagues recruited 60 people to play a video game to test whether or not stripes influenced their perception of moving targets.

The participants were asked to perform a task using a touch screen in which they tried to catch moving targets. One game had just one target on the screen, and another had multiple targets visible at the same time. When single targets were present, targets with horizontal stripes were easier to capture than those with vertical stripes, diagonal stripes, or uniform color.

However, when there were several targets on screen at the same time, all of the striped targets, regardless of the direction of those stripes, were found to be more easily captured than uniform grey-colored targets. The findings appear to contradict the long-held assumption that stripes had evolved as a way to make it harder to capture group animals, the authors explained.

Findings call into question the effectiveness of ‘motion dazzle’

Hughes said that it was “surprising” that the striped targets “were easier to capture than a comparison grey target when the number of targets present was increased, as previous work has suggested that single striped targets are similarly difficult to catch to grey targets.”

“It’s not yet clear why this might be the case,” she told redOrbit via email. “One difference between our single target and multiple target games was that the single target game was time limited, whereas the subjects had as much time as they liked to catch all the targets in the multiple target game. So it could be that it’s only a very ‘split second’ effect. However, we’d obviously need to do more experiments to be able to say anything for certain.”

One might think that striped and other high-contrast animals are more visible to predators, and in the past, researchers wondered if a concept known as “motion dazzle” could be used to explain why these markings evolved. Motion dazzle emphasizes the importance of movement, claiming that patterns cause predators to misread the speed and direction of a creature in motion, and that this phenomenon is strongest when creatures like zebras travel in groups.

“The findings suggest that ‘motion dazzle’ effects may not be stronger and may even be weaker when multiple targets are present, which is surprising, as it is easy to imagine that stripes might be especially confusing and difficult to track in a large group,” Hughes explained. “Of course, this study is preliminary in many ways. Human eyes are different to those of the real predators of zebras, and therefore the effects of stripes on visual perception may also be different.”

“Similarly,” she added, “our model of group movement was very basic, with all the targets moving independently from each other: it may be the case that more complex designs, where the targets try to ‘stick together’ in the manner of a real herd would give a different result. Finally, the evidence for motion dazzle effects of stripes needs to be pieced together with the evidence for the other functions of the zebra’s distinctive pattern, as it is possible that the stripes have evolved for multiple purposes.”

(Image credit: Thinkstock)

Autism, schizophrenia linked to same receptors in brain

 

Researchers from the Salk Institute for Biological Studies may have just found a mechanism in the brain that leads to autism or schizophrenia.

The research focused on a specific brain cell, the parvalbumin-positive interneurons, and a specific receptor on it known as mGluR5, two components of the brain tied to development and neural disorders.

Parvalbumin-positive (PV) interneurons are inhibitory brain cells, meaning they act on other neurons to keep them from firing. They’re thought to be critical in the brain, especially for certain kinds of memory and for brain development in general: when signaling from these cells was disrupted during development in a prior study, the brain’s networks didn’t form correctly.

Meanwhile, the receptor mGluR5 is found on PV neurons, as well as other cells in the brain. This receptor pairs with the neurotransmitter glutamate, allowing glutamate to interact with the cells. In non-PV neurons, the mGluR5 receptor has been found to be important in general cognition and in creating some types of oscillatory wave patterns in the brain. Further, previous studies linked mGluR5 to addiction disorders, anxiety, and Fragile X Syndrome.

Knowing how important the PV cells are for development, the Salk researchers wondered what effects mGluR5 may have on them. After partnering with a team from the Department of Psychiatry at the University of California, San Diego, they studied the effects of deleting the receptor from mice after initial brain formation was complete.

Behavioral deficits similar to schizophrenia

“We found that without this receptor in the parvalbumin [PV] cells, mice have many serious behavioral deficits,” said Terrence Sejnowski, head of Salk’s Computational Neurobiology Laboratory, in a press release. “And a lot of them really mimic closely what we see in schizophrenia.”

The mice with the knocked-out receptors in fact displayed a host of developmental problems, with some indicative of autism as well: obsessive, repetitive grooming behavior and anti-social tendencies, along with patterns of neural activity which resembled those seen in humans suffering from schizophrenia.

“This discovery implies that changes after birth, not just before birth, are affecting the way the network is set up,” says Margarita Behrens, corresponding author and Salk staff scientist.

This would make sense, as people with autism spectrum disorder often go through what is known as a regression, meaning they appear to develop normally for months or years until they suddenly begin to lose previously acquired skills, like speech. Schizophrenia, meanwhile, usually begins to appear between the ages of 16 and 30. In both cases, it could be possible that genes, epigenetics, the environment, or any combination of the three suddenly causes the PV neurons to lose mGluR5 receptors, resulting in the development of the disorders.

Some good news

Sejnowski considers this to be good news, because these changes might be reversible.

“The cells are still alive, and if we can figure out how to go in and change some of these molecular switches, we might actually be able to put the cells back into healthy, functioning states,” he said.

Behrens, however, thought their results should be a warning to those attempting to modulate the effects of mGluR5 in attempts to resolve other disorders the receptor is linked to. “There are a lot of clinical trials ongoing looking at modulating mGluR5 for anxiety and Fragile X Syndrome,” she says. “But our results suggest that if you affect parvalbumin neurons, you might get behavioral changes you weren’t expecting.”

The full study can be found in Molecular Psychiatry.

(Image credit: Thinkstock)

Very thirsty UK butterflies could be extinct by 2050

 

Global warming could wipe out six types of butterflies during the next 35 years, including heat-sensitive species such as the Cabbage White and Speckled Wood, experts from the UK’s Centre for Ecology & Hydrology reported this week in the journal Nature Climate Change.

In fact, even the lowest anticipated levels of warming coupled with habitat loss could decimate populations of these drought-sensitive creatures across the UK, according to BBC News reports. However, they also found that reducing greenhouse gas emissions and the habitat fragmentation of these species could help mitigate the effects of the increasing temperatures.

Lead author Dr. Tom Oliver called the results “worrying” in a statement, adding that he “hadn’t quite realized the magnitude and potential impacts from climate change” prior to the new study. For these butterflies, he said, “widespread population extinctions are expected by 2050,” and the only way to combat them is through both habitat restoration and CO2 emission reduction.

The southeastern parts of England could be the worst affected, the researchers told BBC News, but if warming conditions become less extreme and the habitats become reconnected, the probability of butterfly survival could increase by as much as 50 percent.

Habitat restoration could help mitigate the effects

As part of their research, Dr. Oliver and his colleagues studied how 28 different butterfly species responded to an extreme drought in 1995 (the driest summer in the UK since 1776). They found that six out of the 28 – the green-veined white, ringlet, speckled wood, large skipper, large white, and small white butterflies – suffered dramatic post-drought declines, said Live Science.

The study authors went on to use computer models to examine how the six most-affected species could recover in the four years following a drought, and set up sites to monitor them in semi-natural habitats over a 1.9-mile (3 km) radius, the website added. Species that had access to more habitat tended to recover more quickly than those with less available terrain.

“There’s good news and bad news here,” said co-author Mike Morecroft from Natural England. “The good news is that we can increase the resilience of species to climate change by improving our natural environment, particularly increasing areas of habitat and we are working hard at this. However, this approach will only work if climate change is limited by effective controls on greenhouse gas emissions.”

“If our habitats are very fragmented, the impacts will be much more severe. In places where it isn’t those populations might persist,” Dr. Oliver told BBC News. “It allows us to buy time until we get those global emission cuts in place.”

(Image credit: Thinkstock)

Scientist: I’m going to code a film into DNA

Harvard genetics professor Dr. George Church, known for his recent attempts to restore the woolly mammoth, is reportedly attempting to code a film onto minuscule strands of DNA as a way to preserve the contents for hundreds of generations.

According to the Los Angeles Daily News, Dr. Church is attempting to code the 1902 French silent film “A Trip to the Moon” (believed by many to be the first science-fiction movie ever made) onto an unusual, denser type of genetic material known as “unnatural DNA.”

Unnatural DNA, the publication explains, was designed specifically to store high quantities of data and is different from the genes typically found in living organisms. With the financial support of film industry heavyweight Technicolor, Dr. Church’s lab is taking the hundreds of minuscule pixels that make up each image of a movie and assigning them a code based on color.

These codes are then converted into the chemical bases that comprise DNA – adenine, cytosine, guanine and thymine. Similarly, the film’s audio is broken down into smaller bits of data, given a numerical code and converted into DNA base pairs. Each DNA strand is then carefully labeled with a chemical index that denotes its place in the movie, so that a computer program can place each of the genetic fragments into the proper order and recreate the film.
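The scheme described above (data converted to A/C/G/T, each strand labeled with an index so software can reassemble the movie) can be sketched as a toy two-bits-per-base encoding. The 32-bit index width and the chunking are illustrative assumptions, not details of Dr. Church’s actual method:

```python
BASES = "ACGT"  # two bits of data per base

def encode_fragment(index, payload):
    """Encode one chunk of data as a DNA string.

    The first 16 bases carry the fragment's position in the movie
    (the 'chemical index' described above, 32 bits); the rest carry
    the payload bytes, two bits per base.
    """
    bits = f"{index:032b}" + "".join(f"{b:08b}" for b in payload)
    return "".join(BASES[int(bits[i:i+2], 2)] for i in range(0, len(bits), 2))

def decode_fragments(strands):
    """Sort strands by their embedded index and rebuild the byte stream."""
    def parse(strand):
        bits = "".join(f"{BASES.index(c):02b}" for c in strand)
        index = int(bits[:32], 2)
        data = bytes(int(bits[i:i+8], 2) for i in range(32, len(bits), 8))
        return index, data
    return b"".join(data for _, data in sorted(parse(s) for s in strands))
```

Because each strand carries its own index, the fragments can be sequenced in any order and still be reassembled correctly.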

Building upon previous research involving books

Once the data-filled genetic material is created, it can be easily copied and distributed, and the DNA that contains the film is far smaller than a speck of dust, the Daily News said. Reading the DNA and watching the encoded movie requires a computer and access to a DNA sequencer, a unit that reads the strands so that software can place them in their proper order.

Lest this sound impossible, Dr. Church has already pulled off a similar feat. Back in 2012, the Harvard geneticist encoded copies of his book Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves onto DNA, converting every page into the four chemical bases and even taking copies of the encoded book onto the Comedy Central TV show The Colbert Report.

In reality, though, Dr. Church’s technique is not going to be used by the general population to read any books or watch any movies anytime in the near future. As he explained to the Daily News, this is “a baby technology” currently best suited for data storage and archival purposes. He added that he and his colleagues “don’t want people to get expectations too high.”

—–

Feature Image: Thinkstock

Asteroid mining could be less than 10 years away, experts claim

Sending a spacecraft to a far-off asteroid in order to mine its resources may seem like a long-term proposition, but companies could be transforming water contained on these rocky space objects into rocket fuel within the next 10 years, according to recent reports.

As Space.com explained on Tuesday, Washington-based asteroid mining company Planetary Resources has already deployed its first probe from the International Space Station (ISS), and hopes to launch a series of improved spacecraft over the next few years. At first, the goal is to make propellant from H2O, but ultimately they hope to mine metals from space rocks.

In a statement, Dr. Peter H. Diamandis, co-founder and co-chairman of Planetary Resources, called last month’s deployment of the firm’s Arkyd 3 Reflight (A3R) spacecraft “a significant milestone for Planetary Resources as we forge a path toward prospecting resource-rich asteroids. Our team is developing the technology that will enable humanity to create an off-planet economy that will fundamentally change the way we live on Earth.”

Likewise, Planetary Resources president and chief engineer Chris Lewicki told Space.com that the company had “every expectation” that they would be gathering water and working to create an in-space refueling station within “the next 10 years,” and possibly as early as “the first half of the 2020s… We’re moving very fast… those things will come… sooner than we might think.”

First resource-harvesting missions could happen by 2020

Along with competitor Deep Space Industries, Planetary Resources is looking to harness natural resources found in asteroids. To start with, they will be looking to draw water from a special type of space rock known as a carbonaceous chondrite. The water could be used for drinking, to keep astronauts safe from radiation, and broken down into oxygen and hydrogen for refueling.
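To put the refueling idea in numbers: splitting water follows fixed stoichiometry (2 H2O → 2 H2 + O2), so the propellant yield from a given mass of mined water is easy to estimate. A sketch, with an illustrative function name:

```python
# Molar masses in g/mol
M_H2O, M_H2, M_O2 = 18.015, 2.016, 31.998

def propellant_from_water(water_kg):
    """Mass of hydrogen and oxygen obtainable by fully electrolyzing
    `water_kg` of water (2 H2O -> 2 H2 + O2)."""
    moles = water_kg * 1000 / M_H2O        # moles of water
    h2_kg = moles * M_H2 / 1000            # one H2 per water molecule
    o2_kg = moles / 2 * M_O2 / 1000        # one O2 per two water molecules
    return h2_kg, o2_kg
```

Roughly eight-ninths of the water’s mass ends up as oxygen and one-ninth as hydrogen, which is why water-rich asteroids are attractive as orbital fuel depots.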

Carbonaceous chondrites could also be harvested for minerals such as iron, nickel, and cobalt, the website explained, and ultimately the companies hope they will be able to extract rare metals from the platinum group on asteroids, for use in electronics and other high-end technology.

“Ultimately, what we want to do is create a space-based business that is an economic engine that really opens up space to the rest of the economy,” Lewicki told Space.com. The next step for the company will be to launch the Arkyd-6, a probe that is twice as large as the one launched back in July. By late next year, they hope to have an even bigger version ready for deployment.

Eventually, that model (the Arkyd 100) will be replaced by the 200 and the 300, both of which will be designed to perform up-close inspections of near-Earth asteroids in search of a potential mining target. If all goes well, the company hopes to send the 200 into orbit on a test flight by 2018 and the 300 to a target asteroid still to be identified by late 2018 or early 2019.

Deep Space Industries, on the other hand, is still in the process of designing and building probes, according to Space.com. Company representatives have previously said that they intend to send a resource-collecting mission to a near-Earth asteroid no later than 2020, the website added.

—–

Feature Image: Thinkstock

Have we finally found Nefertiti’s tomb?

A new analysis of ultra-high resolution images of Tutankhamun’s tomb has revealed cracks and fissures in two places on a wall, possibly indicating the presence of two passages that were, at some point, sealed off with plaster and painted over. They also may hide a massive secret.

According to BBC News and Archaeology, the researcher behind the analysis, Nicholas Reeves of the University of Arizona, believes one of the hidden passages leads to a storeroom. The other, however, could open to a corridor and a queen’s burial chamber. In other words: By studying these images, Reeves may have just uncovered the hidden final resting place of Queen Nefertiti.

As Reeves explained to The Economist, there are several unusual characteristics to King Tut’s tomb. For one thing, it is small in comparison to other tombs in the same region, and the objects that it contains appear to be primarily second-hand and hurriedly placed. The young monarch’s funeral mask features pierced ears, and the main axis of the tomb was angled to the right of the entrance shaft – something usually found in the tombs of queens, but not kings.

One of the two hidden passages he found, however, aligns perfectly with both sides of the tomb’s entrance chamber, he explained, meaning that it could hide a corridor continuing along the same axis. This would match the shape and scale of other royal tombs in the same vicinity, he noted, and considering that the decoration and construction were completed in multiple stages, it appears as though the corridor could lead to the burial chamber of a queen.

But is it Nefertiti?

Reeves told BBC News that he believes Nefertiti could lie within the hidden chamber, but some other Egyptologists are not yet convinced. US archaeologist Kent Weeks told The Economist that it was “a fascinating argument and an impressive first step,” and that a radar scan could be used to easily and noninvasively examine the structure and find any hidden chambers.

“I think there are certainly some signs that there might have been some activity around those doorways,” Joyce Tyldesley, an Egyptologist at the University of Manchester, told BBC News. “Whether we can deduce from that that we have actually found the burial site of Nefertiti might be a step too far. But if it was true, it would be absolutely brilliant.”

If Reeves is correct, it would indicate that the tomb used to store Tutankhamun might not have actually been built for him, and that Nefertiti, the wife of Tutankhamun’s father Akhenaten and the boy king’s stepmother, might have been the one who was supposed to wear the burial mask with the pierced ears that wound up being used for her stepson, The Economist said.

“Each piece of evidence on its own is not conclusive, but put it all together and it’s hard to avoid my conclusion,” Reeves said. “If I’m wrong I’m wrong, but if I’m right this is potentially the biggest archaeological discovery ever made” – and, as he told the BBC, “if I’m right… the world will have become a much more interesting place – at least for Egyptologists.”

—–

Feature Image: Thinkstock

The evolution of beer yeast

As any beer snob worth their salt will tell you, a good brew depends largely on the quality of the yeast – but what exactly do we know about the evolution of the fungal organisms frequently used as ingredients in the various ales and lagers enjoyed the world over?

Chris Todd Hittinger, a professor in the University of Wisconsin-Madison’s genetics laboratory, graduate student Emily Clare Baker, and their colleagues set out to discover the origins of hybrid yeast strains used to create lagers, using cutting-edge sequencing techniques in order to complete and assemble a high-quality genome of a newly-described species found in Patagonia.

This new wild species, Saccharomyces eubayanus, was compared to the domesticated hybrids used to brew lager-style beers, allowing researchers to study the complete genomes of both parental yeast species (S. cerevisiae and S. eubayanus) used to craft these popular beverages for the first time. The findings appear in the journal Molecular Biology and Evolution.

Lager hybrids had at least two independent lineages

Oddly enough, Hittinger and Baker’s research revealed that the S. cerevisiae and S. eubayanus hybrids used to brew lager beers arose from two independent origin events: despite significant genetic differences between the two types of yeast, there were at least two distinct hybridization events involving strains of these organisms.

Identified by the authors as the Saaz and Frohberg lineages (based on their areas of origin), the two evolutionary origins involved nearly identical strains of S. eubayanus and relatively more diverse ale strains of S. cerevisiae, the authors explained in a press release. The study clarifies the origins of these two major hybrid yeast lineages, and could direct future research focused on the domestication of lager yeasts.

In addition to discovering that hybrid lager yeasts had originated at least twice, Hittinger told redOrbit that he and Baker discovered that “the domesticated S. eubayanus subgenomes in the hybrids experienced a dramatic increase in their rate of evolution (specifically protein versus neutral changes), as is frequently observed in domesticated plants and animals.”
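The “protein versus neutral changes” comparison Hittinger describes is, in essence, a count of nonsynonymous versus synonymous codon differences. A toy sketch of that classification, with a deliberately tiny translation table covering only the codons used in the example (a real analysis would use the full genetic code and a model-based dN/dS estimate):

```python
# Toy translation table covering only the codons used below.
CODON = {"TTT": "F", "TTC": "F", "CTT": "L", "ATG": "M", "ATA": "I"}

def protein_vs_neutral(seq_a, seq_b):
    """Count nonsynonymous (protein-changing) vs synonymous (neutral)
    codon differences between two aligned coding sequences."""
    nonsyn = syn = 0
    for i in range(0, len(seq_a), 3):
        ca, cb = seq_a[i:i+3], seq_b[i:i+3]
        if ca == cb:
            continue
        if CODON[ca] == CODON[cb]:
            syn += 1   # same amino acid: a neutral change
        else:
            nonsyn += 1  # amino acid changed: affects the protein
    return nonsyn, syn
```

An elevated ratio of protein-changing to neutral differences is the signature of accelerated evolution that the team reports in the domesticated subgenomes.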

“There were two major hypotheses about the origins of the Saaz and Frohberg groups, [and] there was some prior support for each, so both models were plausible,” he added via email. “I was surprised at how clear the results were with over 10x more differentiation between the S. cerevisiae subgenomes.” The findings, Hittinger said, reveal “how little we still know about natural Saccharomyces diversity,” and since only a subset of the species have been utilized by industry, “there is a lot of potential to create novel, custom brewing or biofuel strains.”

(Image credit: Thinkstock)

Lost Roanoke colony mystery may finally be solved

For those of you who don’t remember all of your US history: Roanoke is one of those enduring mysteries of the past, as puzzling as the true identity of Jack the Ripper or the fate of Amelia Earhart. Discoveries announced recently, however, may shed some light on what happened in Roanoke, North Carolina more than 400 years ago.

In 1587, more than 100 English colonists settled on Roanoke Island in an attempt to found the first permanent English settlement in the New World. By August 18, 1590, the entire colony was gone—vanished seemingly into thin air, leaving behind only their abandoned and looted colony and two carvings: “Croatoan” on a post and the letters “CRO” on a tree.

The colony leader, John White, was the one to make the alarming discovery. He had sailed to England in 1587 to fetch desperately needed supplies for the colonists, leaving behind his wife, his daughter, and his granddaughter Virginia Dare—the first non-indigenous child born in the New World.

Despite his best efforts to find them, he never saw them again. The Lost Colony has remained an enormous mystery ever since, with theories abounding. Croatoan was the name of an island south of Roanoke (now Hatteras Island)—so did the colonists go there? Were they killed by Spaniards or Native Americans? Did they try to sail back to England and die on the journey?

New evidence, possible answer

Now, two independent research teams claim they have found archaeological evidence that points to what happened to the vanished 115 men, women, and children: They may have survived, joining two friendly Native American tribes in the area.

The evidence is a collection of goods from two separate sites, including a sword hilt, fragments of English bowls, gun flintlocks, and a fragment of a writing slate still inscribed with the letter M. Even more exciting: The sites are found both on the mainland and on Hatteras Island—also known as Croatoan.

“The evidence is that they assimilated with the Native Americans but kept their goods,” Mark Horton, an archaeologist who heads the excavation on Hatteras, told National Geographic.

These aren’t the first major items to be found on Hatteras Island. In 1998, a 10-karat gold signet ring was discovered, believed to date back to the 16th century. This find prompted further excavations—including the one that unearthed the European items found in July. The bowl fragments are especially exciting, because the style of pottery (Border Ware) stopped being imported into the New World by the early 17th century—decades before the first recorded settler moved into the area in 1655.

However, neither the Hatteras team nor the mainland team claims this evidence is conclusive, especially since exact dating of these objects is difficult.

“You have more work to do,” Ivor Noel Hume, a former Colonial Williamsburg archaeologist who excavated at Roanoke Island in the 1990s, warned both teams at a meeting.

According to the New York Times, one of the teams will make further announcements on the discoveries today. Stay tuned!

—–

Feature Image: John White discovers “Croatoan” inscription in Roanoke. (Credit: Wikipedia)

Whoa: Droplets levitate over blue light cushion

Researchers from the French Alternative Energies and Atomic Energy Commission have found a new way to levitate liquid droplets, a discovery that could lead to a new way to generate freely movable microplasma (and it also comes with a rather colorful side-effect).

As physicist Cedric Poulain and his co-authors explained in the latest edition of Applied Physics Letters, their new technique also resulted in the creation of a tiny light show: the liquid droplet was observed sparkling while it floated above a faint blue glowing gap.

“We were interested in a better understanding of the boiling mechanism,” Poulain told redOrbit via email, “Namely, the formation of bubbles (nucleation), as well as what happens at high heat flux when suddenly all the bubbles coalesce, leading to the well-known film boiling.”

Results similar to the Leidenfrost levitation effect

The floating effect is described as similar to a phenomenon known as Leidenfrost levitation, in which droplets essentially dance on a hot vapor cushion. However, by using electricity instead of heat to create the cushion, the research team found that they were able to ionize the gas, creating plasma that glowed with a soft blue light.

In a press release, Poulain explained that whether a hot plate or an electrode is used to heat a liquid, the differences in temperature result in a flux of electrons. At low voltages, this flux occurs “in a conductive or convective manner,” he said, but at higher voltages, bubbles of gas or vapor are created. Upping this voltage to even higher levels results in the Leidenfrost levitation effect.

The researchers believe that the deformability of a liquid drop could allow them to create a new machine capable of moving the plasma created in this manner along a surface. However, Poulain said in a statement that they weren’t thinking of a device when they first came up with the idea for this experiment; they were focused on exploring the limits of the analogy between boiling and water electrolysis, or the breakup of water into elemental gases using electricity.

“The initial goal was rather fundamental,” he told redOrbit. “We wanted to better understand the boiling phenomenon, and the process of nucleation in general. Our findings emphasize the value of exploring a phenomenon (like bubble production) through two different processes which often are studied by two different communities.”

Sending voltage through hydrochloric acid

The researchers created a system designed to send electricity through conductive droplets while filming the behavior of the droplets at high speed. They took a tiny drop of weak hydrochloric acid (a conductor), suspended it above a metal plate, and applied a voltage across the drop. As the drop touched the plate, current started flowing, causing water in the acid to break down into hydrogen and oxygen gas.
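The amount of gas produced by a current through the drop is governed by Faraday’s law of electrolysis: two electrons per molecule of H2, four per molecule of O2. A sketch (the current and time values are illustrative, not from the paper):

```python
F = 96485.0  # Faraday constant, coulombs per mole of electrons

def gas_from_current(current_a, seconds):
    """Moles of hydrogen and oxygen produced by water electrolysis for a
    given current and duration (2 e- per H2 molecule, 4 e- per O2)."""
    charge = current_a * seconds  # total charge passed, in coulombs
    return charge / (2 * F), charge / (4 * F)
```

Note that hydrogen is always produced at twice the molar rate of oxygen, which is why the team initially suspected a hydrogen cushion before finding the gap was mostly vaporized water.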

At 50 volts, the bottom of the droplet started sparking and levitated, rising over the surface of the plate. This caused a faint blue glow to emanate from the gap, and while the researchers initially believed that the drop might be resting on a cushion of hydrogen gas from the breakup of water, they eventually found that the gaseous cushion was made primarily out of water, which had been vaporized by energy from the electric current.

The research may provide new insights into fundamental questions of physics, and Poulain noted that the method used in the experiment could provide an easy, original, and inexpensive new way to create plasma. Next, they intend to analyze the composition of this plasma layer, as it appears to be a superposition of two different types of plasma.

As for that blue light given off during the experiments, the team said that the small gap between the droplet and the metal plate likely produced the intense electric field required to generate a long-lasting, dense plasma with little energy.

(Image credit: Cedric Poulain, et al/CEA)