Sleep Medications More Common Among Older, Educated Americans

Brett Smith for redOrbit.com – Your Universe Online

A new report from the Centers for Disease Control and Prevention (CDC) indicates that people who use prescription sleep aids are more likely to be older, female or well-educated.

The report found that approximately 4 percent of all American adults say they recently used a sleep medication. That percentage was higher, however, for those between the ages of 50 and 59, as well as those 80 and older.

Study author Yinong Chong says that people in their 50s could have trouble sleeping because of compounded stress from work and family responsibilities.

“It gives the picture of a sandwiched group who has family, not only children but also probably elderly parents but still you’re likely to be in the workforce, so you get squeezed at both ends in terms of family responsibility and job responsibility,” said Chong, an epidemiologist at the CDC. Chong also notes that sleep habits may improve during the 60s, as people retire but before potential chronic health problems usually set in.

In the study, the CDC’s National Center for Health Statistics surveyed adults 20 and older about their use of prescription sleep aids within the last 30 days. Participants were also asked to show interviewers their relevant medication or prescription information.

“You get how many people are actually using them,” Chong said. “This is actual use.”

The researchers found that 5 percent of women surveyed reported recently taking a sleep aid, compared to about 3 percent of the men surveyed. The CDC team found that education was also a factor: Approximately 3 percent of those without a high school diploma reported using sleep aids compared to 4.4 percent of those with a high school diploma or higher.

Critics of prescription sleep aids point out that some patients continue to feel their effects after they wake up – making driving and other activities potentially dangerous. The US Food and Drug Administration (FDA) is starting to take these criticisms seriously and has ordered that dosages of some of the drugs be lowered.

In their report, the study authors noted that prescriptions for sleep medications have tripled over the past 20 years. The study, which examined trends from 2005 to 2010, is the first to look at changes in sleep aid use through the lens of social and demographic groups, say the authors. They also noted that between 50 million and 70 million Americans have sleep issues or are sleep-deprived.

“Prescription sleep aids are one of the treatment options for trouble going into or maintaining sleep,” the authors wrote, adding that their long-term use can be harmful.

According to the nonprofit advocacy group the National Sleep Foundation (NSF), the typical adult needs seven hours of sleep to avoid a higher risk of death or other health issues. The amount of sleep a person needs is determined by two factors: basal sleep – the amount of sleep your body regularly needs – and sleep debt – sleep that is lost due to sickness, environmental factors or other reasons, according to the NSF.

“Two studies suggest that healthy adults have a basal sleep need of seven to eight hours every night, but where things get complicated is the interaction between the basal need and sleep debt,” said a statement on the group’s website. “For instance, you might meet your basal sleep need on any single night or a few nights in a row, but still have an unresolved sleep debt that may make you feel more sleepy and less alert at times.”

“The good news is that some research suggests that the accumulated sleep debt can be worked down or ‘paid off,’” the group said.
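
The NSF’s two-factor description can be sketched as a toy calculation: nightly sleep is weighed against a basal need, and any shortfall accumulates as debt that surplus sleep later pays down. The additive bookkeeping and the numbers below are illustrative assumptions, not the NSF’s own model.

```python
# Toy sketch of the basal-need / sleep-debt idea described above.
# The simple additive bookkeeping and the sample numbers are assumptions
# for illustration, not part of the NSF's statement.

BASAL_NEED_HOURS = 7.5  # midpoint of the seven-to-eight-hour range quoted above

def sleep_debt(nightly_hours):
    """Accumulated sleep debt after a run of nights, floored at zero.

    Each night's shortfall adds to the debt; sleeping past the basal
    need 'pays off' earlier debt, but the debt never goes negative.
    """
    debt = 0.0
    for hours in nightly_hours:
        debt = max(0.0, debt + (BASAL_NEED_HOURS - hours))
    return debt

# Meeting the basal need on most nights can still leave unresolved debt:
week = [7.5, 6.0, 6.5, 7.5, 5.5, 8.0, 7.5]
print(sleep_debt(week))  # 4.0 hours of debt carried into the next week
```

The floor at zero reflects the NSF’s framing that debt can be worked down but surplus sleep is not banked indefinitely.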

200,000 Early Deaths A Year Due To Air Pollution, Says Study

Peter Suciu for redOrbit.com – Your Universe Online

While the 1970 Clean Air Act, administered by the United States Environmental Protection Agency, was put in place to protect public health, a new study by researchers from MIT’s Laboratory for Aviation and the Environment has found that these efforts likely aren’t doing enough.

This new study found that vehicle emissions are the biggest contributor to premature death. According to the study, more than 200,000 early deaths occur in the United States each year due to combustion emissions, and the leading causes are road transportation and power generation.

The findings were published in the journal Atmospheric Environment.

“Combustion emissions adversely impact air quality and human health,” the paper’s abstract noted. “A multiscale air quality model is applied to assess the health impacts of major emissions sectors in United States. Emissions are classified according to six different sources: electric power generation, industry, commercial and residential sources, road transportation, marine transportation and rail transportation. Epidemiological evidence is used to relate long-term population exposure to sector-induced changes in the concentrations of PM2.5 and ozone to incidences of premature death.”

The MIT researchers tracked the different emission sources, and found that road transportation is the most significant contributor, causing 53,000 premature deaths, followed closely by power generation, with 52,000.

The researchers noted that California suffers the worst health impacts from air pollution, in a state-by-state analysis, with about 21,000 early deaths annually, mostly attributed to road transportation and to commercial and residential emissions from heating and cooking. In addition, the researchers sought to map local emissions in 5,695 US cities. They found that the highest emissions-related mortality rate was in Baltimore, where 130 out of every 100,000 residents are likely to die in a given year due to long-term exposure to air pollution.
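
As a quick check on what a rate like Baltimore’s means in absolute terms, the per-100,000 figure can be scaled to a city’s population. The population used below is a hypothetical round number for illustration, not a figure from the MIT study.

```python
# Scale the per-100,000 mortality rate quoted above to a whole population.
# The 620,000 population is a hypothetical round number, not a figure
# taken from the study.

RATE_PER_100K = 130  # emissions-related deaths per 100,000 residents per year

def expected_deaths(population):
    """Expected annual deaths at a fixed per-100,000 mortality rate."""
    return population * RATE_PER_100K / 100_000

print(expected_deaths(620_000))  # 806.0 deaths per year at that rate
```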

“In the past five to 10 years, the evidence linking air-pollution exposure to risk of early death has really solidified and gained scientific and political traction,” said Steven Barrett, an assistant professor of aeronautics and astronautics at MIT. “There’s a realization that air pollution is a major problem in any city, and there’s a desire to do something about it.”

These findings are in line with a new study from NASA that found a strong connection between pollution and population density. That study utilized satellite observations to calculate air pollution’s dependence on population for cities in the United States, Europe, China and India.

The MIT study found that a person who dies from an air pollution-related cause typically dies about a decade earlier than he or she otherwise might have. In looking at the effects of air pollution the team obtained emissions data from the EPA’s National Emissions Inventory, a catalog of emissions sources nationwide, to determine the number of early deaths.

The researchers looked at data collected in 2005, the most recent available at the time of the study. From here the study’s authors divided the data into six emissions sectors: electric power generation; industry; commercial and residential sources; road transportation; marine transportation; and rail transportation. Barrett’s team fed the emissions data from all six sources into an air-quality simulation of the impact of emissions on particles and gases in the atmosphere.

While the study is based on data that is more than seven years old, Barrett said the results are likely representative of today’s pollution-related health risks.

Curiosity Takes Sharpest Images Yet Of Solar Eclipse On Mars

Lee Rannals for redOrbit.com – Your Universe Online

NASA said its Curiosity rover has taken the sharpest image of a solar eclipse ever taken on Mars.

The Martian rover was able to snap an image of Phobos, the largest of Mars’ moons, passing in front of the sun. Curiosity snapped a set of three frames, taken three seconds apart using its Mast Camera (Mastcam).

The rover took a break from driving on August 17, 2013 in order to record the event. The images are the first full-resolution frames of a series that could eventually be combined to create a movie of the eclipse.

“This event occurred near noon at Curiosity’s location, which put Phobos at its closest point to the rover, appearing larger against the sun than it would at other times of day,” Mark Lemmon of Texas A&M University, College Station, a co-investigator for use of Curiosity’s Mastcam, said in a statement. “This is the closest to a total eclipse of the sun that you can have from Mars.”

Curiosity, as well as the still-active Mars rover Opportunity, is helping scientists understand even more about the orbits of the Martian moons. The position of Phobos during the latest eclipse was a mile or two closer to the center of the sun’s position than research anticipated. It is data like this that will enable scientists to home in on the precise orbits of the Martian moons.

“This one is by far the most detailed image of any Martian lunar transit ever taken, and it is especially useful because it is annular. It was even closer to the sun’s center than predicted, so we learned something,” Lemmon said.

Last December, researchers from the Complutense University of Madrid said they developed a method to allow Curiosity to navigate around the Red Planet using Martian eclipses. The scientists said they were able to use previous Curiosity observations to help pinpoint the rover’s position on Mars with an error of just a few feet to a few miles. The latest observations could potentially help scientists sharpen up this method even more in the future.

NASA said earlier this week that it has allowed Curiosity to have a little more freedom, equipping the rover with autonomous navigation for the first time. This software enhancement will allow the rover to traverse the Martian terrain on its own, helping it determine the safest path to take as it makes its way closer to Mount Sharp.

Fukushima Radioactive Ocean Plume Expected To Reach US Shores By 2014

redOrbit Staff & Wire Reports – Your Universe Online

The radioactive ocean plume created as a result of the 2011 Fukushima nuclear power plant disaster is expected to reach North America by next year, according to research appearing in the latest edition of the journal Deep-Sea Research Part I.

Fortunately, the study authors report the plume will be harmless by the time it reaches US shores.

“Observers on the west coast of the United States will be able to see a measurable increase in radioactive material three years after the event,” study author Dr. Erik van Sebille of the Climate Change Research Centre at the University of New South Wales said in a statement.

“However, people on those coastlines should not be concerned as the concentration of radioactive material quickly drops below World Health Organization (WHO) safety levels as soon as it leaves Japanese waters,” he added.

Some degree of atmospheric radiation was detected on the west coast of the US just days following the accident at the Japanese power plant, but Dr. van Sebille’s team explains the actual radioactive particles in the ocean plume take far more time to travel the same distance.

In the study, the authors used a series of ocean simulations to track the path of the radiation. The models indicated the radiation would most likely spend the better part of the next decade travelling through the world’s oceans.

A pair of energetic currents off the Japanese coast (the Kuroshio Current and the Kuroshio Extension) has played a key role in diluting the radioactive material. Thanks to those currents, the radioactivity dropped considerably below WHO safety levels in under four months’ time, and the dilution process has continued since then because of eddies, giant whirlpools, and other currents in the open ocean.

“Although some uncertainties remain around the total amount released and the likely concentrations that would be observed, we have shown unambiguously that the contact with the north-west American coasts will not be identical everywhere,” explained Dr. Vincent Rossi of the Institute for Cross-Disciplinary Physics and Complex Systems (IFISC).

“Shelf waters north of 45°N will experience higher concentrations during a shorter period, when compared to the Californian coast,” he added. “This late but prolonged exposure is due to the three-dimensional pathways of the plume. The plume will be forced down deeper into the ocean toward the subtropics before rising up again along the southern Californian shelf.”

According to the investigative team, the majority of the radioactive material will remain in the North Pacific, with only minute amounts crossing south of the Equator during the first 10 years. However, a measurable yet harmless signature of the radiation will spread into the Indian and South Pacific oceans over the course of several decades.

“Australia and other countries in the Southern Hemisphere will see little if any radioactive material in their coastal waters and certainly not at levels to cause concern,” Dr. van Sebille said.

He added the researchers had developed a website to help people keep track of the path of the radiation. “Using this website,” Dr. van Sebille said, “members of the public can click on an area in the ocean and track the movement of the radiation or any other form of pollution on the ocean surface over the next 10 years.”

FDA Allows Taylor Farms de Mexico To Reopen Following Cyclospora Investigation

Lawrence LeBlond for redOrbit.com – Your Universe Online

A salad-mix packaging facility associated with the Cyclospora outbreak that has so far sickened more than 600 people in 22 US states has been given the green light to return to production following an extensive investigation.

Over more than a week, from August 11-19, the FDA, with the cooperation of the Mexican government, conducted a thorough environmental assessment of the processing facility owned and operated by Taylor Farms de Mexico, S de RL de CV, as well as five farms identified during an initial Cyclospora outbreak traceback investigation.

Following the initial traceback investigation, which was jointly conducted by the FDA, the CDC, and state health departments in Iowa and Nebraska, it was discovered that prepackaged salad mix used at Olive Garden and Red Lobster restaurants was directly related to the Cyclospora outbreaks in Iowa and Nebraska. That salad mix was found to be produced and distributed by Taylor Farms de Mexico.

Following that news, Taylor Farms de Mexico reported to the FDA on August 12 that it had voluntarily ceased production and halted deliveries of its product on August 9. However, the company also noted it would return to production even without FDA approval once it deemed its facilities were safe. Still, the agency conducted a thorough environmental assessment of the produce company, which wrapped up last week.

Upon completion of this assessment, FDA officials found conditions and practices observed at these facilities were in accordance with current food safety protocols. In addition, no illnesses have cropped up in Iowa or Nebraska since July 2 that could be tied to Taylor Farms. Nor, states the FDA, has any other US state currently dealing with a Cyclospora outbreak been linked to this company.

As a result, the FDA has agreed to the firm’s plan to resume its operations. Taylor Farms de Mexico “has committed to a comprehensive Cyclospora sampling program for leafy green and other products from their farms and processing facility in Mexico,” said the FDA. “This will include both sampling of their products and water and continued monitoring of the sanitary conditions of their facilities.”

The company had resumed operations as of August 25.

While the investigation of Taylor Farms is likely over for now, the FDA, CDC and state health departments continue to probe the Cyclospora outbreak that is gripping at least 20 other states in the Union.

As of August 23, the CDC has been notified of 610 cases of cyclosporiasis – the disease associated with the single-celled Cyclospora parasite.

The following states have been affected by this outbreak: Arkansas, California, Connecticut, Florida, Georgia, Illinois, Iowa, Kansas, Louisiana, Minnesota, Missouri, Nebraska, New Jersey, New Hampshire, New York, Ohio, South Dakota, Tennessee, Texas, Virginia, Wisconsin and Wyoming.

Texas continues to report the highest number of cases, with the latest official count at 258 infections. Iowa has seen the second-highest number of cases with 156 infections. Nebraska has reported 86 infections and Florida 31. All other states have seen fewer than 20 cases, most of them fewer than six.

It is still unclear if the cases seen in states besides Iowa and Nebraska are all linked to a similar source, or if they are all part of separate outbreaks. The FDA is continuing its investigation and has not ruled out any possibilities.

According to an Associated Press report, it may continue to be a difficult investigation due to the fact that Cyclospora is not commonly found in the US and there may be many illnesses that have gone unidentified. Testing for Cyclospora must be specifically requested, and many doctors don’t bother because of the rarity of the illness. The CDC also said it doesn’t have the tools to distinguish one strain from another, which makes it even harder to determine whether the outbreaks stem from a single source or from several.

Investigators may well never find the source of infection in the latest round of outbreaks. Still, the probe continues.

Supervolcanic Ash Can Transform Into Lava Miles Away From Eruption

Lawrence LeBlond for redOrbit.com – Your Universe Online

New research has taken a closer look at how ash produced from supervolcanoes can turn back into lava once it falls back to Earth.

Supervolcanoes, such as the Yellowstone caldera, are capable of producing eruptions thousands of times stronger than normal volcanoes. Such a massive eruption can produce an ash cloud that is so hot it has the ability to re-form into lava once it hits the ground much farther away.

This evidence was previously revealed by California State University Bakersfield’s Graham Andrews.

Andrews discovered lava flows produced from an ancient Yellowstone super eruption that occurred some 8 million years ago were the result of ash that had traveled into the atmosphere and formed as lava only after returning to the ground tens of miles from the initial eruption.

During a typical eruption, lava will flow directly from the volcano until it cools enough to solidify. But during a supervolcanic eruption ash can remain superheated long after it leaves the volcano.

Now, Alan Whittington, an associate professor in the University of Missouri department of geological sciences in the College of Arts and Science, and lead author Genevieve Robert and Jiyang Ye, both doctoral students in the geological sciences department, reveal how this is possible.

“During a supervolcano eruption, pyroclastic flows, which are giant clouds of very hot ash and rock, travel away from the volcano at typically a hundred miles an hour,” Robert said. “We determined the ash must have been exceptionally hot so that it could actually turn into lava and flow before it eventually cooled.”

But the team believes another process was involved, as the ash should have cooled too much to turn into lava by the time it landed. They suggest that a process known as “viscous heating” – in which the internal friction of a flowing liquid, a function of its viscosity (the degree to which a liquid resists flow), generates heat – could have a lot to do with this type of event.

In explaining the process of viscous heating, Whittington likens it to stirring a pot of molasses.

“It is very hard to stir a pot of molasses and you have to use a lot of energy and strength to move your spoon around the pot,” Whittington said. “However, once you get the pot stirring, the energy you are using to move the spoon is transferred into the molasses, which actually heats up a little bit. This is viscous heating.

“So when you think about how fast the hot ash is traveling after a massive supervolcano eruption, once it hits the ground that energy is turned into heat, much like the energy from the spoon heating up the molasses. This extra heat created by viscous heating is enough to cause the ash to weld together and actually begin flowing as lava,” Whittington explained.

For this to happen, the team notes the ash produced by the eruption must be at least 1,500 degrees Fahrenheit. Since the ash should have lost some of this heat in the air, viscous heating likely accounted for anywhere between 200 and 400 degrees of additional heating for the ash to transform into lava.
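
For readers who work in Celsius, the Fahrenheit figures above convert as follows. This is plain unit arithmetic, not part of the study; note that a temperature difference converts by the 5/9 factor alone, without the 32-degree offset.

```python
# Convert the Fahrenheit figures quoted above to Celsius. A temperature
# *difference* (like the 200-400 degree viscous-heating contribution)
# converts by the 5/9 factor alone, without the 32-degree offset.

def f_to_c(temp_f):
    """Convert an absolute Fahrenheit temperature to Celsius."""
    return (temp_f - 32) * 5 / 9

def delta_f_to_c(delta_f):
    """Convert a Fahrenheit temperature *difference* to Celsius."""
    return delta_f * 5 / 9

print(round(f_to_c(1500)))  # minimum ash temperature: ~816 C
print(round(delta_f_to_c(200)), round(delta_f_to_c(400)))  # viscous heating: ~111 to ~222 C
```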

A paper on this research is published in the journal Geology. The study was funded by the National Science Foundation.

Space Technology Could Improve Vision Treatment

Brett Smith for redOrbit.com – Your Universe Online

Astronomers have developed groundbreaking optical techniques to bring distant stars into focus, and this same technology is now being adopted by eye care professionals to improve their analytical and corrective methods.

According to a recently published review in the journal Optometry and Vision Science by Indiana University optometrist Larry N. Thibos, the concept of ‘wavefront optics’ is transforming how vision professionals are looking at issues they encounter in their everyday practice.

“Instead of light arriving at a lens from a star, imagine a point source of light reflected from the retina and emerging from the eye’s optical system as a wavefront,” Thibos explained in the review. “If we can measure the slope of the emerging wavefront at many points on the wavefront, then we have all of the information needed to reconstruct the shape of that wavefront, thereby obtaining a comprehensive description of the eye’s optical aberrations.”

Thibos said the traditional method for assessing the eye involves complications that are eliminated by wavefront optics.

“Although it is possible to accomplish the same result by measuring the intersection point of light rays with an image plane near the focus point, that approach is more difficult because the rays overlap and get confused near the focus point, so they must be isolated and measured sequentially,” he said. “By measuring wavefront slope near the eye, where individual rays are well separated, it becomes possible to make many measurements simultaneously.”
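
The slope-to-shape idea Thibos describes can be illustrated numerically: measure a wavefront’s slope at many points, then integrate to recover its shape. The quadratic (defocus-like) test wavefront and the trapezoidal integration below are illustrative choices, not the clinical reconstruction algorithm.

```python
import numpy as np

# 1-D sketch of wavefront reconstruction from slope measurements.
# A quadratic wavefront (pure defocus) is assumed for illustration.

x = np.linspace(-1.0, 1.0, 201)          # pupil coordinate
true_wavefront = 0.5 * x**2              # defocus-like aberration
slope = np.gradient(true_wavefront, x)   # what a slope sensor would measure

# Reconstruct the shape by cumulative trapezoidal integration of the slopes.
segments = 0.5 * (slope[1:] + slope[:-1]) * np.diff(x)
reconstructed = np.concatenate(([0.0], np.cumsum(segments)))
reconstructed += true_wavefront[0]       # pin down the integration constant

# The reconstruction recovers the original shape to high accuracy.
print(float(np.max(np.abs(reconstructed - true_wavefront))) < 1e-3)
```

In a real instrument the slopes come from a lenslet array in two dimensions and the integration is replaced by a least-squares fit, but the principle is the same: slopes at many points determine the shape up to a constant.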

According to Dr. Anthony Adams, Editor-in-Chief of Optometry and Vision Science, an array of ‘higher-order’ abnormalities in the eye can cause problems for both the patient and the eye care professional looking to diagnose any problems.

“In the past two decades, optometry and ophthalmology researchers have borrowed techniques for measuring and correcting these higher-order abnormalities,” Adams said. “Astronomers already used these techniques to enable a clear telescopic view of planets and stars, undistorted by the focusing aberrations resulting from the earth’s atmosphere.”

In his review, Thibos predicted important advances in vision care that would result from pursuing wavefront optics approaches, such as monitoring potential deterioration of the tear film, assessing the outcomes of certain corrective therapies, and tracking visual abnormalities in growing eyes. He added that some cutting-edge methods are already being used.

“Some of these corrections are even finding their way into contact lens and spectacle designs,” he said.

In addition to describing the rewards of pursuing a wavefront optics approach, Thibos’ review includes several diagrams and teaching tools that could be used in an educational setting.

As noted in the journal article, Thibos was recently named the 2012 winner of the American Academy of Optometry’s Charles F. Prentice Medal, which is awarded annually to someone who has contributed significantly to the visual sciences.

“Dr. Thibos is quite unique in his extraordinary ability to relate these advances in optics to the very fundamentals of ophthalmic optics which the ‘Father of Optometry,’ Charles F. Prentice, articulated more than 120 years ago,” Dr. Adams comments. “It is fitting that he was awarded the Charles F. Prentice Medal for his work—the highest Award of the American Academy of Optometry.”

Microneedle Patch Could Be Surefire Way To Diagnose Tuberculosis

Michael Harper for redOrbit.com – Your Universe Online

Engineers from the University of Washington (UW) have developed what they believe to be an accurate and effective skin test for diagnosing tuberculosis.

Current tests are difficult to administer and can often yield inaccurate results. Using a patch with an array of embedded microneedles, doctors and nurses could one day administer a tuberculosis (TB) test as easily as they could place a bandage on a small wound. Doctors, teachers and those who travel to other countries are often given a TB test to ensure they’re free of the respiratory infection. As TB can live dormant in a person’s lungs for years, it’s important that any skin test get accurate results every time.

A recent study shows one-third of the global population may carry latent TB, an inactive form of the disease. This latent TB will turn active in about five to ten percent of carriers, according to the Philippine Star.

According to senior author and UW assistant professor of materials science and engineering Marco Rolandi, the current method of administering a TB skin test requires a steady hand. A small amount of a substance called PPD tuberculin is injected into the dermis of the forearm, the layer just beneath the skin’s surface. If a hard, red bump appears within two to three days of the injection, it’s likely a TB infection is present. The size of the bump can indicate the severity of the infection.

This test isn’t always accurate, however, and those administering the test could inject the PPD in the wrong area. For instance, the PPD could be administered too high and into the skin or too low and past the dermis. Moreover, if the needle isn’t inserted at the right angle, the PPD may not be injected properly.

Rolandi says his team’s microneedle patch eliminates potential error by placing an array of tiny, biodegradable needles on a single piece of material.

“With a microneedle test there’s little room for user error, because the depth of delivery is determined by the microneedle length rather than the needle-insertion angle,” explains Rolandi.

This test is painless and easier to administer than the traditional skin test with a hypodermic needle.

Rolandi’s UW team partnered with the Infectious Disease Research Institute in Seattle to develop the microneedle patch, and the researchers believe this is the first time such a patch has been developed.

“It’s like putting on a bandage,” Rolandi said. “As long as the patch is applied on the skin, the test is always delivered to the same depth underneath the skin.”

The UW team first tested the patch on guinea pigs and found the test gave the same results as a standard and correctly-administered hypodermic needle test. Rolandi also says the patch could be more affordable to those living in areas where health care is expensive.

Microneedles have been used in other medical and therapeutic applications before, delivering drugs to people’s arms or legs. These patches are often made of metals or silicon, but the UW and Infectious Disease Research Institute felt it important to develop a biodegradable option. Their patch, therefore, is made of silk and chitin, a main component in the exoskeleton of crabs, lobsters, shrimp, and cicadas. This material has proven strong enough to break the skin and deliver the PPD tuberculin used in TB tests.

Rolandi and paper co-author Derek Carter say they hope the patch will prove successful enough to one day be a commercially available product.

Study Reveals Language Influences What We See

April Flowers for redOrbit.com – Your Universe Online

People naturally assume the sense of sight takes in the world as it is, simply passing on what the eyes collect from light reflected by the objects around us. The truth is more complicated, however, as the eyes do not work alone. Our vision is not only a function of incoming visual information, but also of how that information is interpreted in light of other visual experiences. A new study from the University of Wisconsin-Madison and Yale University reveals language may influence our sight as well.

University of Wisconsin–Madison cognitive scientist and psychology professor Gary Lupyan, and Emily Ward, a Yale University graduate student, demonstrate that words can play a powerful role in what we see. Their findings were published in a recent issue of Proceedings of the National Academy of Sciences (PNAS).

“Perceptual systems do the best they can with inherently ambiguous inputs by putting them in context of what we know, what we expect,” Lupyan says. “Studies like this are helping us show that language is a powerful tool for shaping perceptual systems, acting as a top-down signal to perceptual processes. In the case of vision, what we consciously perceive seems to be deeply shaped by our knowledge and expectations.”

Lupyan says these expectations can be altered with a single word.

The researchers used a technique called continuous flash suppression to render a series of objects invisible for a group of volunteers. This allowed them to show how deeply words influence perception.

Each participant was shown a picture of a familiar object in one eye. The objects included items such as a chair, a pumpkin or a kangaroo. In their other eye, the participants were shown a series of flashing, “squiggly” lines.

“Essentially, it’s visual noise,” Lupyan says. “Because the noise patterns are high-contrast and constantly moving, they dominate, and the input from the other eye is suppressed.”

Each study participant heard one of three things immediately before seeing the images: the word for the suppressed object (“pumpkin,” when the object was a pumpkin), the word for a different object (“kangaroo,” when the object was actually a pumpkin), or just static.

The participants were asked to indicate whether they saw something or not. The researchers found that when the word they heard matched the object being suppressed by the visual noise, the subjects were more likely to report they did indeed see something than in cases where the wrong word, or no word at all, was paired with the image.

“Hearing the word for the object that was being suppressed boosted that object into their vision,” Lupyan says.

Hearing a word that did not match the suppressed image hurt the participant’s chances of seeing an object.

“With the label, you’re expecting pumpkin-shaped things,” Lupyan says. “When you get a visual input consistent with that expectation, it boosts it into perception. When you get an incorrect label, it further suppresses that.”

Continuous flash suppression has been shown to interrupt sight so thoroughly that the brain receives no signals to suggest the invisible objects are perceived, even implicitly.

“Unless they can tell us they saw it, there’s nothing to suggest the brain was taking it in at all,” Lupyan says. “If language affects performance on a test like this, it indicates that language is influencing vision at a pretty early stage. It’s getting really deep into the visual system.”

The new study reveals a deeper connection between language and simple sensory perception than previously thought. This connection made the researchers wonder about the extent of language’s power. They suggest the influence of language may extend to other senses as well.

“A lot of previous work has focused on vision, and we have neglected to examine the role of knowledge and expectations on other modalities, especially smell and taste,” Lupyan says.

“What I want to see is whether we can really alter threshold abilities,” he says. “Does expecting a particular taste, for example, allow you to detect a substance at a lower concentration?”

For example, Lupyan says if you are drinking a glass of milk, but thinking about orange juice, your thoughts might change the way you experience the milk.

“There’s no point in figuring out what some objective taste is,” Lupyan says. “What’s important is whether the milk is spoiled or not. If you expect it to be orange juice, and it tastes like orange juice, it’s fine. But if you expected it to be milk, you’d think something was wrong.”

Controlling Your Emotions May Be Difficult Even Under Mild Stress

April Flowers for redOrbit.com – Your Universe Online

A team of neuroscientists from New York University (NYU) has found that even mild stress can foil therapeutic measures to control emotions. The findings point to the limits of clinical techniques, while at the same time illuminating the barriers that must be overcome in addressing afflictions such as fear or anxiety.

“We have long suspected that stress can impair our ability to control our emotions, but this is the first study to document how even mild stress can undercut therapies designed to keep our emotions in check,” said Elizabeth Phelps, a professor in NYU’s Department of Psychology and Center for Neural Science. “In other words, what you learn in the clinic may not be as relevant in the real world when you’re stressed.”

The study findings were published in the Proceedings of the National Academy of Sciences (PNAS).

When addressing patients’ emotional maladies, therapists sometimes employ cognitive restructuring techniques, which encourage patients to alter their thoughts about or approach to a situation in order to change their emotional response. These techniques might include focusing on the positive or non-threatening aspects of an event or stimulus that might normally produce fear.

The research team questioned whether these techniques hold up in real world situations when accompanied by the stress of everyday life. To answer this question, they designed a two-day experiment in which the study participants employed techniques like those used in clinics as a way to combat their fears.

On day one, the research team used a “fear conditioning” technique to create a fear response among the participants. The subjects were shown pictures of snakes or spiders. Some of the pictures were accompanied by a mild shock to the wrist, while others were not. The researchers used physiological arousal and self-reporting to document that the participants had developed fear responses to the pictures that had been paired with shocks.

Next, in order to learn to diminish the fears brought on by the experiment, the participants were taught cognitive strategies akin to those prescribed by therapists and collectively titled cognitive-behavioral therapy (CBT).

On day two, the participants were separated into a “control” group and a “stress” group. The stress group participants’ hands were submerged in icy water for three minutes—a standard method for creating a mild stress response in psychological studies. The control group’s hands, in contrast, were submerged in mildly warm water. To determine if the participants in the stress group were indeed stressed, the researchers gauged the salivary cortisol levels of each participant. The human body produces salivary cortisol in response to stress. The control group showed no change, while the stress group showed a significant increase in cortisol.

The participants were given a short rest period, then they were shown the same pictures of snakes or spiders as on day one. This allowed the researchers to determine if stress undermined the utilization of the cognitive techniques taught the previous day.

The control group showed a diminished fear response to the images, suggesting to the researchers that these participants were able to employ the cognitive training from the previous day. The stress group received identical cognitive training; however, they were unable to use these techniques to reduce fear on the second day.

“The use of cognitive techniques to control fear has previously been shown to rely on regions of the prefrontal cortex that are known to be functionally impaired by mild stress,” Phelps observed. “These findings are consistent with the suggestion that the effect of mild stress on the prefrontal cortex may result in a diminished ability to use previously learned techniques to control fear.”

“Our results suggest that even mild stress, such as that encountered in daily life, may impair the ability to use cognitive techniques known to control fear and anxiety,” added Candace Raio, a doctoral student in NYU’s Department of Psychology. “However, with practice or after longer intervals of cognitive training, these strategies may become more habitual and less sensitive to the effects of stress.”

Coffee Could Slow Prostate Cancer Progression And Reduce Recurrence Risk

redOrbit Staff & Wire Reports – Your Universe Online

Coffee consumption could help limit the progression of prostate cancer, as well as helping prevent the disease from recurring, experts from the Fred Hutchinson Cancer Research Center (FHCRC) claim in a new study.

Writing in the journal Cancer Causes & Control, the researchers reported that men who consumed at least four cups of the caffeinated beverage each day experienced a 59 percent decrease in their risk of prostate cancer recurrence and/or progression compared with those who drank no more than one cup each week.

Corresponding author Janet L. Stanford, co-director of the Program in Prostate Cancer Research in the Fred Hutch Public Health Sciences Division, and her colleagues conducted their research in order to determine whether or not the bioactive compounds found in coffee and tea could help prevent recurrence of this typically slow-growing cancer of the walnut-sized gland.

Stanford’s team did not find a link between coffee consumption and a reduced risk of death from prostate cancer, though they said the study included too few men who died of the disease to directly address that issue. The authors were likewise unable to reach any conclusion regarding the potential impact of tea consumption on prostate cancer-related death.

“To our knowledge, our study is the first to investigate the potential association between tea consumption and prostate cancer outcomes,” they wrote. “It is important to note, however, that few patients in our cohort were regular tea drinkers and the highest category of tea consumption was one or more cups per day. The association should be investigated in future studies that have access to larger populations with higher levels of tea consumption.”

Over 1,000 prostate cancer survivors between the ages of 35 and 74, all of whom were diagnosed between 2002 and 2005 and were residents of King County in Washington, participated in the study. They answered a series of questions regarding their food and beverage consumption two years prior to their diagnoses, and were also interviewed regarding their demographics, lifestyle information, family history of cancer, medication use and screening history for prostate cancer. The researchers followed up with each for a period of over five years after diagnosis.

Six hundred thirty of the participants answered questions regarding coffee intake, fit the follow-up criteria and took part in the final analysis, the authors said. Of those men, 61 percent said that they drank at least one cup of coffee each day, while 12 percent said that they consumed four or more on a daily basis.

Stanford’s team reports that their findings are consistent with Harvard’s Health Professionals Follow-up Study (HPFS), which found that men who drank at least six cups of coffee each day had a 60 percent decreased risk of metastatic/lethal prostate cancer versus non-drinkers.

“The researchers emphasize that coffee or specific coffee components cannot be recommended for secondary prevention of prostate cancer before the preventive effect has been demonstrated in a randomized clinical trial,” FHCRC explained. “Further, there’s ongoing debate about which components in coffee are anti-carcinogenic, and additional large, prospective studies are needed to confirm whether coffee intake is beneficial for secondary prevention.”

Earlier this month, experts from National University of Singapore and Duke University published research suggesting that drinking coffee could reduce fatty liver in people with non-alcoholic fatty liver disease (NAFLD), a condition found in 70 percent of people diagnosed with diabetes that can currently only be treated through diet and exercise.

Furthermore, the caffeinated beverage has also recently been linked to a 21 percent increase in overall mortality risk for those drinking over 28 cups a week, and a more than 50 percent increase in both men and women younger than 55 years of age. According to the latest National Coffee Drinking Study from the National Coffee Association, over 60 percent of US adults drink coffee every day, consuming slightly more than three cups a day on average.

Uber Returns the Favor, Buys 2,500 Google Driverless Cars

Michael Harper for redOrbit.com – Your Universe Online

Transportation company Uber announced today they plan to buy 2,500 of Google’s automated cars from the search engine giant. Moreover, Uber also plans to give transportation data to Google to help them improve their routing algorithms. The official news comes after a week of speculation about Google’s interest in Uber. It was discovered late last week that Google invested $250 million in Uber in July. News also broke last week that auto parts maker Continental AG is ready to announce a partnership with Google to build components for their self-driving car. Former Wall Street Journal reporter Jessica Lessin also claimed last week that Google isn’t just planning to build the system to power a self-driving car; they’re expecting to make their own car after talks with a major auto maker fell through.

Uber plans to invest up to $375 million in Google’s GX3200 automobiles. This is the third generation of their fleet, but the first to be cleared for commercial use in the United States. Uber runs an app with which users can hire a car in America’s largest cities to ferry them around without having to hail a cab or worry about payment. Uber handles payment between the driver and passenger. New updates to the app now let multiple riders split up the fare between them.

According to Bloomberg Businessweek, Uber is getting much more than a fleet of self-driving cars with this deal; they’re also getting the political clout of Google to move them forward into a future filled with autonomous vehicles. Google has so far been able to convince the California government to legalize their driverless vehicles as they test them around Silicon Valley. They also spent $18.2 million on lobbying last year, giving them extra pull with the politicians they support.

Uber seems primed to set up a future wherein users can hail their own car with the tap of a smartphone. Yet, instead of a cab service picking the user up, it will be a car devoid of any driver, a car owned and maintained by Uber itself. The company has already experimented with autonomous cars, but they’ve still asked a driver to climb onboard in case something were to go wrong with the automated system.

The company also plans to have their first fleet of driverless cars in place by the end of 2013 and operating in at least one of their markets. Should all go well, Uber says they could have the service ready to go in ten of their markets before 2014.

With a stake in the company and a large order of cars on their register, Google stands to take away quite a bit of good fortune from Uber. Not only do they have a company willing to buy up 2,500 of their first vehicles to roll off the line, they also have a company willing to share data with them. A data-driven company, Google’s main motivation in anything they do is information. This data is then either used to improve their systems or fuel ads. The information received from driverless cars could likely be used to fuel location-aware advertisements, meaning passengers could one day hear about the deal of the day as they pass by their local burger shop.

Last week a German news source claimed auto parts maker Continental AG was in talks to partner with both Google and IBM to enter into the driverless vehicle market. It is expected the company will simply provide parts for Google’s autonomous vehicles. They’re also expected to work with IBM to build a network wherein these self-driving cars will be able to speak with one another.

New Puzzle Pieces In The Genetics Of Schizophrenia

Brett Smith for redOrbit.com – Your Universe Online

Schizophrenia is one of the most complex and devastating of the inherited mental disorders, and remains a significant public health concern. In a new study published in the journal Nature Genetics, an international team of researchers has identified 22 locations in the human genome that are involved in the development of the condition, including 13 identified for the very first time.

“If finding the causes of schizophrenia is like solving a jigsaw puzzle, then these new results give us the corners and some of the pieces on the edges,” said Dr. Patrick F. Sullivan, a geneticist from the University of North Carolina and a study co-author. “We’ve debated this for a century, and we are now zeroing in on answers.”

“This study gives us the clearest picture to date of two different pathways that might be going wrong in people with schizophrenia,” he added. “Now we need to concentrate our research very urgently on these two pathways in our quest to understand what causes this disabling mental illness.”

The study was based on a comprehensive genome-wide association study (GWAS) that included previous similar studies. The international team also looked at data from a Swedish national sample of over 5,000 schizophrenia cases and more than 6,200 controls. The total number of individuals included in the study was over 59,000.

One of the genetic mechanisms identified in the study included the genes CACNA1C and CACNB2, which are critical to the function of nerve cells. Another genetic mechanism, dubbed the ‘micro-RNA 137’ pathway, involves a known regulator of neuronal development.

“What’s really exciting about this is that now we can use standard, off-the-shelf genomic technologies to help us fill in the missing pieces,” Sullivan said. “We now have a clear and obvious path to getting a fairly complete understanding of the genetic part of schizophrenia. That wouldn’t have been possible five years ago.”

Another study, published earlier this month in the journal Neuron, revealed that the psychotic symptoms experienced by those with schizophrenia are caused by a faulty switch in the brain that results in confusion between internal thoughts and objective reality.

“In our daily life, we constantly switch between our inner, private world and the outer, objective world,” Lena Palaniyappan, a Nottingham University psychiatrist who co-led the study, told Reuters. “This switching action is enabled by the connections between the insula and frontal cortex. (But) this switch process appears to be disrupted in patients with schizophrenia.”

In the study, researchers used functional magnetic resonance imaging (fMRI) scans to contrast the brains of 35 healthy participants with those of 38 individuals with schizophrenia. The scientists were able to determine that healthy participants successfully used the connections between the insula and frontal cortex regions of the brain to switch between inner thoughts and outer reality, while the patients with schizophrenia were less able to shift activity to their frontal cortex.

“This could explain why internal thoughts sometime appear as external objective reality, experienced (by schizophrenia patients) as voices or hallucinations,” Palaniyappan said.

Effective Epilepsy Drug Valproate May Cause Birth Defects

Brett Smith for redOrbit.com – Your Universe Online

The anti-seizure drug valproate has been shown to be extremely effective, but a new study has found that it increases the risk of pregnant women giving birth to a child with spina bifida or hypospadias when taken in higher doses.

The finding allows women who suffer from seizures to make a more informed decision about whether they want to have a child.

“For many women on epilepsy medication, the desire to start a family can be fraught with fear that they could have a baby with a range of disabilities or malformations,” said study co-author Terry O’Brien, an epilepsy specialist with The Royal Melbourne Hospital.

“Previous studies have shown a strong relationship between the dose of valproate taken and the risk of the child having a birth defect,” O’Brien added. “However, for many women valproate is the only drug that will help control their seizures.”

Spina bifida is an incurable, debilitating birth defect of the spine and spinal cord, which occurs in the first three months of pregnancy. Hypospadias is a birth defect of the penis that can be treated by corrective surgery.

“Through our research, we now know that by reducing the dose taken in the first trimester of pregnancy, the risk of having a baby with spina bifida or hypospadias will be greatly reduced,” O’Brien said.

The neurologist added that other birth defects, such as cleft palates and heart defects, were prevalent among the children of women who took the anti-seizure drug, regardless of dosage.

The study was based on data from the Australian Pregnancy Register (APR), which included information on more than 1,700 women with epilepsy who have been or are currently pregnant. Based at The Royal Melbourne Hospital, the APR has gathered game-changing epilepsy data since 1999.

According to study author and epilepsy expert Frank Vajda, the study could make a significant difference for women with epilepsy and their families.

“We always knew that epilepsy drugs were responsible for the high level of fetal malformations but we never knew how much dosage played a role until recently,” he said. “The present findings for spina bifida are clinically significant, as 80 percent of all instances of spina bifida in the APR were associated with valproate exposure.”

“However, since the collection of data for the APR started, we have noted that valproate was being taken less often by pregnant women and in lower dosages,” Vajda added. “This evidence now tells us that using valproate in the lowest dose that can control severe seizures may reduce the hazard of one of the most devastating birth defects.”

Despite its effectiveness, valproate has been the subject of a growing number of studies finding negative side effects of its use, particularly for unborn children. A study published by JAMA in April found a connection between a pregnant woman’s use of the drug and a higher risk of her child developing autism.

“There must be a continuous effort to include this information along with all the other risks in discussions with women of childbearing age who are candidates for valproate,” said the study’s lead author Jakob Christensen.

Massive Dolphin Die-Off Could Be From Measles-Like Virus

Lawrence LeBlond for redOrbit.com – Your Universe Online

Researchers from the University of Pennsylvania’s School of Veterinary Medicine have turned a large laboratory designed to treat four-legged animals into a research facility to get to the bottom of one of this summer’s most tragic mysteries.

Some 70 miles away, dolphins are turning up dead along the Jersey shore and other coastal communities and, at this point, the cause still remains largely unknown. More than 200 dolphins have washed ashore since June and many have ended up on UPenn’s New Bolton Center research tables where veterinarians look to find an answer.

The UPenn lab was specifically called upon for this task due to close ties with the Marine Mammal Stranding Center (MMSC) in Brigantine, NJ, which handled many of the deceased creatures that turned up on nearby shorelines.

The New Bolton Center, located in Kennett Square, PA, sits in the southeastern part of the state near the Delaware line. Its board-certified veterinary specialists have performed detailed necropsies on each of the dolphins brought into the lab to hunt for and identify potential abnormalities. Hours upon hours are then spent examining tissues under microscopes, and researchers conduct tests with antibodies, hoping to uncover the cause of death in these intelligent marine mammals.

After painstakingly long processes, some evidence has turned up.

“One of the saddest things to see on these creatures is some have horrible pneumonias and ulcers so you know that they are suffering. And the shark bites are kind of sobering to look at,” Dr. Perry Habecker, chief of large-animal pathology at the New Bolton Center, told USA Today’s Kristi Funderburk.

MORBILLIVIRUS

Habecker said morbillivirus, a measles-like disease that played a role in the massive die-off of 742 bottlenose dolphins along the East Coast in 1987, has been found in some of the dead dolphins washing up on beaches this summer. However, it is still too early to tell if this is the root cause of this year’s die-offs.

Through more extensive analyses, the researchers are looking at the virus more closely, as well as toxins, biotoxins, bacteria, pollutants and any other potential culprit, according to Maggie Mooney-Seus, a communications specialist with the NOAA’s Northeast Fisheries Science Center.

“We haven’t ruled anything out yet because we have had animals from a pretty wide area and we have to look at everything that could be behind this,” she said.

Some have suggested that last year’s Superstorm Sandy or the 2010 Gulf of Mexico oil spill played a role. But according to Robert Schoelkopf, director of the MMSC, neither has anything to do with this.

Schoelkopf said that he immediately knew these creatures were suffering from some sort of lung infection when they began washing ashore earlier this summer. He noted, “My mind shot right back to 25 years ago when I did the same thing.”

During the 1987 dolphin die-off, Schoelkopf personally witnessed the death of 93 of the animals in NJ alone.

He said that what is very startling this time around is that females are washing ashore visibly lactating. “That means there’s a baby out there swimming around without a mother. That baby is going to become shark bait.”

UNUSUAL MORTALITY EVENT

So far this summer, there have been at least 230 dolphin deaths along the East Coast, prompting the NOAA to declare it an unusual mortality event (UME). This clears the way for intense scientific research in order to find a cause of death.

There have been 60 recognized UMEs since 1991, but only 29 have been resolved with a cause.

Because the NOAA acted so quickly, Habecker’s team has been able to get moving on finding a root cause. Cultures taken from the dolphins are being sent to labs in Florida and California. Habecker noted that these facilities have the expertise and technology to find out what may be at play here.

Habecker said tissue work will continue in his own lab to determine if morbillivirus is present in further specimens.

“We know it’s out there. It’s always been out there, but we don’t know why we’re seeing some more of it,” Habecker said of the virus.

Mooney-Seus said that knowing the cause of the illnesses and deaths seen in dolphins will give experts an idea of whether the affliction is naturally occurring in the dolphins or whether it started as a result of some human activity.

“You have to look at everything so you can look for opportunities at remedying it if we can,” she told USA Today. “It shows something is definitely going on in the ecosystem and that’s why we have to look at all those environmental factors as well.”

Habecker said that historically, the most common cause for premature dolphin mortality is pneumonia or a parasitic worm that attacks the brain. He and other colleagues are continuing to look for patterns before making any concrete estimates.

Kim Durham, a biologist with the NEFSC, said these dolphins are likely suffering from a bacterial or viral infection. This virus does resemble measles, she said.

“There’s a lot of skin contact among them,” Durham told CBS News. “They’re constantly rubbing each other, so yeah, the possibility that they’re spreading it among themselves is very large.”

LIFE CHANGING EVENT

Schoelkopf, who has been working with dolphins for decades, left a job at an aquarium to pursue a more important role. Once he found that dolphins were much smarter than they were given credit for, he launched the Stranding Center to save dolphins. He said this changed his life forever.

“I didn’t want to work with captive dolphins anymore,” he said. “It wasn’t right for them to do 13 shows a day and never see sunlight.”

After founding the MMSC, Schoelkopf went on to earn a national reputation for rescuing beached or distressed dolphins and other sea creatures.

This summer’s die-off is not the first such incident and won’t be the last. Schoelkopf said it looks to be a “naturally occurring” event and could occur again in another 25 years. “That’s the idea of doing the extensive tests like this, that they can possibly find somewhere or some way around the problem.”

Considering the massive dolphin die-off in 1987, Schoelkopf believes the deaths should start to taper off near the end of September. What is not known is how many more will turn up dead between now and then.

Schoelkopf told CBS News that when fewer dolphin carcasses are seen along the Jersey Shore, that won’t mean the problem is over. It is likely that more deaths will be seen along the southern states due to the animals’ annual migration routes south.

Mooney-Seus said that more dead dolphins are showing up in the south already. During July and most of August the farthest southerly extent of dolphin deaths was Virginia. Now, some dolphin carcasses are turning up in North Carolina.

Since the UME was declared by the NOAA’s NEFSC, the UPenn facility has taken in 33 dolphins, but Habecker noted that the lab had received some before that point, as well.

Schoelkopf is urging anyone who encounters a dolphin, whether it is in the water or on shore, to not approach it. Sharks have been known to attack the dolphins, most of which die before coming ashore, and those sharks pose a danger to people as well, he said.

Unearthed Ancient Roman Structure Predates Invention Of Mortar

Lawrence LeBlond for redOrbit.com – Your Universe Online

Archaeologists digging at a long-buried city in Italy have unearthed a massive stone monument dating back at least 300 years before the Colosseum and 100 years before the invention of mortar. The new discovery indicates that the ancient Romans had developed architectural skills much earlier than previously believed.

The team of 60 researchers, including 35 undergraduates and 15 graduates, from the University of Michigan and Yale University were on hand this summer to work at the site. The excavation of the city is expected to continue through 2014, but with the new discovery under their belt, the archaeologists are hoping the $2 million U-M Museum of Archaeology-funded project will be extended.

The unearthed ancient structure was found at a site known as Gabii, which sits just east of Rome. The monument, a giant “Lego-like” stone block structure, is about half the size of a football field and dates back to between 350 and 250 BC. Nicola Terrenato, a U-M classics professor and lead scientist on the project, believes it could be the earliest public building ever discovered and said this is the largest American dig in Italy in the past half century.

He said the massive complex, which might also have been a private residence, “holds a stone retaining wall, geometrically patterned floors and two terraces connected by a grand staircase.”

The structure is unlike anything we thought the Romans were capable of building at the time, noted Terrenato, who added that it challenges the ancient stereotype of the Romans as “modest and conservative” at this period in history.

“There are a lot of constructive details that are beautiful to look at and they tell us more about how the Romans were building at that stage,” Terrenato said. “This shows us they were beginning to experiment with modifying their natural environments—cutting back the natural slope and creating a retaining wall, for example— about a quarter of a millennium earlier than we thought.”

While this site was built at least 300 years before the Colosseum, it does represent a critical step in the process that led to later architecture, he said.

Perhaps more interesting is the fact that each of the massive stone blocks used to build the structure weighed thousands of pounds, far larger than the standard sizes of this early period. The choice makes sense, however, because larger stones gave the structure more stability at a time when mortar had not yet been invented.

“This is like Lego construction,” Terrenato said. “They stacked them one on top of each other without any glue binding them together. This is the only technique they had access to and it must have been the desire for this kind of grand construction that drove them to the invention of mortar about 125 years later.”

Many historians have labeled the Romans as a conservative people who only became lavish after soldiers returned from conquering Greece, bringing the extravagance of that culture home with them. But this new monument predates the extravagant lifestyle theory by a long shot.

“Rome conquered Greece in the 140s BCE. Roman historians said the soldiers came back and wanted Greek luxury, which is a way of trying to shift blame,” Terrenato said. “We now know that long before they conquered Greece, the Romans were already thinking big. This tears apart the view of Romans in this period as being very modest and inconspicuous.”

The Gabii excavation site, which sits on a parcel of undeveloped land in modern-day Lazio, was once a major city that waned by the third century as the Roman Empire grew. The Gabii Project is meant to show what a city in this region looked like before the great Roman development period. Because the site itself sits outside Rome, the team is able to explore it on a much deeper scale – something that would not have been possible within city limits due to centuries of continued building atop ancient sites.

The research team said about 60 percent of this massive building has been uncovered.

“Even though we could not observe the complex in its entire extent, we know we are dealing with a monument without parallels in the region, including Rome,” said Marcello Mogetta, project managing director and a U-M doctoral student in classical art and archaeology. “My bet is that this will become a benchmark in future surveys of Roman architecture.”

Andrew Johnson, an assistant professor of classics at Yale University, noted that this find has educational impacts.

“In the longer term, this is a discovery that we expect will radically change our understanding of Roman Republican history and archaeology,” he said. “But more immediately, our students are returning from the field to the classrooms of their home institutions—Michigan, Yale and over a dozen others—with a new set of skills, methodologies, approaches and questions that we hope will enrich and inform their studies in various academic disciplines in manifold ways.”

Terrenato and colleagues previously unearthed a 1,000-pound lead coffin at the Gabii site.

OSHA Proposes New Rules To Limit Workplace Exposure To Silica Dust

redOrbit Staff & Wire Reports – Your Universe Online

Long-anticipated rules that would limit exposure to silica dust in the workplace were proposed by the Occupational Safety and Health Administration (OSHA) on Friday.

According to Neela Banerjee of the Los Angeles Times, the new rules would cut exposure to the tiny particles in half. The proposal was developed “amid mounting evidence that current standards do not protect workers from increased risk of lung cancer and silicosis, a progressive, incurable disease,” she added.

Current silica dust standards for workers in the shipbuilding, railroad and construction industries were established more than four decades ago, Banerjee said. The new standards would establish a legal limit of 50 micrograms per cubic meter of air for workplace silica dust concentrations for all industries, over an eight-hour day.

Previously, separate rules set limits for the construction sector (250 micrograms) and for the general and maritime industries (100 micrograms). However, in light of a domestic oil and gas boom driven largely by the practice of hydraulic fracturing, a 2012 federal study found that a growing number of workers in the energy sector were at risk of exposure to respirable silica dust.

OSHA estimates that the new regulations could save as many as 700 lives each year, while also preventing up to 1,600 cases of silicosis annually, according to Wall Street Journal reporter Melanie Trottman. Dr. David Michaels, an assistant secretary in the Labor Department, told Trottman that the rule “uses common sense measures that will protect workers’ lives and lungs – like keeping the material wet so dust doesn’t become airborne.”

Nonetheless, it has already drawn a backlash from industry groups arguing that the changes are unnecessary. Amanda Wood, director of labor and employment policy at the National Association of Manufacturers, told the Wall Street Journal that companies had already taken steps to contain the particles, while the American Foundry Society claims that the proposal would cost the metalcasting industry nearly $1.5 billion annually.

Dr. Michaels also told Steve Greenhouse of the New York Times that the new regulations, which come after a delay of more than two years, would affect over 530,000 businesses – 90 percent of them in the construction industry. He added that the rules would cost industry an average of $1,242 per company (a total of $640 million) to comply, but that the benefits resulting from the OSHA proposal would exceed $4 billion.
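A quick cross-check of the figures in the paragraph above, treating the reported $1,242 per-company average and 530,000 affected businesses as given (the small mismatch with the quoted $640 million total suggests the reported numbers are rounded):

```python
# Cross-check of the compliance-cost figures reported above, treating the
# $1,242 per-company average and 530,000 affected businesses as given.
avg_cost_per_company = 1_242      # dollars, as reported
affected_businesses = 530_000     # as reported

total_cost = avg_cost_per_company * affected_businesses
print(f"Implied total: ${total_cost:,}")                  # $658,260,000

# The quoted $640 million total implies a slightly lower per-company average:
implied_avg = 640_000_000 / affected_businesses
print(f"Average implied by $640M: ${implied_avg:,.0f}")   # $1,208
```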

“Crystalline silica – tiny particles no more than one-hundredth the size of grains of sand – is created during work with stone, concrete, brick or mortar,” Greenhouse explained. “It can occur during sawing, grinding and drilling and is common in glass manufacturing and sand blasting. One government study found that many workers in hydraulic fracturing, known as fracking, were exposed to 10 times the permissible level of silica.”

Ostrich Egg Contains The Oldest Known Globe To Depict The New World

redOrbit Staff & Wire Reports – Your Universe Online

Researchers from the Washington Map Society report that they have discovered the oldest known globe to depict the New World – an etching on an ostrich egg that appears to originate from the early 16th century.

According to ABC News reporter Alexis Shaw, the egg is about the same size as a grapefruit and dates back to the year 1504. It depicts North America as a series of scattered islands, and also includes South America, Japan, Brazil and Arabia, officials from the Washington Map Society explained last Monday.

The discovery will be detailed in the Fall 2013 edition of The Portolan, a journal of cartography published by the Society, and cartographers believe that it could have been made in Florence, Italy. In fact, according to Sarah Griffiths of the Daily Mail, it might have been crafted in the workshop of famed artist and inventor Leonardo da Vinci.

“This is a major discovery, and we are pleased to be the vehicle for its announcement,” Tom Sander, editor of The Portolan, who has personally inspected the globe, said in a statement. “We undertook a very extensive peer review process to vet the article, which itself was based on more than a year of scientific and documentary research.”

“When I heard of this globe, I was initially skeptical about its date, origin, geography and provenance, but I had to find out for myself,” added author and independent Belgian researcher Dr. Stefaan Missinne. “After all, no one had known of it, and discoveries of this type are extremely rare. I was excited to look into it further, and the more I did so, and the more research that we did, the clearer it became that we had a major find.”

The globe was constructed from the lower halves of two ostrich eggs, and includes the phrase “HIC SVNT DRACONES” (“Here are the Dragons”) over the coast of Southeast Asia, Shaw said. It was discovered at the London Map Fair last year by a dealer who claimed that it had been a part of an “important European collection” for decades, officials from the Map Society said.

“The anonymous owner of the globe… allowed Missinne to investigate the globe. The researcher used carbon dating, computer tomography testing, an ink assessment, as well as a geographical, cartographic, and historical analysis,” according to CBSNews.com. Missinne determined that the globe was made sometime around 1504, and could have been used to cast the New York Public Library’s copper Lenox globe, which has been dated to 1510.

Dr. Missinne told Griffiths that the globe reflects the knowledge of the New World provided by early European explorers such as Amerigo Vespucci, the man for whom the Americas were named. It depicts several different types of ships, monsters and waves, as well as 71 different place names, including three in South America – Mundus novus (New World), Terra de Brazil and Terra Sanctae Crucis (Land of the Holy Cross).

Why Egyptians ‘Nested’ Mummies In Multiple Coffins

Lee Rannals for redOrbit.com – Your Universe Online

In his PhD dissertation, Anders Bettum of the University of Oslo describes why Egyptians buried their dead in multiple coffins.

Egyptian mummies were buried in multiple coffins nested within each other, similar to the iconic Russian matryoshka dolls. The child king Tutankhamun (1334-24 BC), for instance, was buried in as many as eight coffins, an unusually large number compared to other ancient Egyptian elite, who typically had three or four.

Bettum, an Egyptologist at the Department of Culture Studies and Oriental Languages, wrote in his thesis that nested coffins were not only a status symbol for the Egyptian elite, but also played a role in the process that they believed would link the deceased to their ancestors.

The rituals that took place during their seventy-day funerals are symbolically rendered on the coffins. The components of each nest reflect the Egyptian view of the world.

“The decorations, the forms and the choice of materials signify a unification of the two myths about Osiris and Amun-Ra respectively,” Bettum said. “On the outer coffin, the deceased is portrayed as Osiris, with a mummified body, a blue-striped wig and a pale, solemn face. The coffin is painted yellow and varnished, and must have shone like gold. The very richest Egyptians did in fact use gold leaf on their coffins.”

He wrote that the choice of color represents light and its origin in the sun. If the figure of Osiris, the god of the afterlife, is bathed in sunlight, that could only mean one thing, Bettum said.

“The decoration invokes a well known mythical image: when the sun god arrives in the throne hall of Osiris in the 6th hour of the night and the two deities join in mystical union,” the researcher wrote. “According to the Egyptians, this union was the source of all regeneration in nature, and it was here, at the center of this ‘catalyst of life’ that the deceased wanted to be placed for all eternity.”

He said the innermost layers of the coffin nests were decorated to look like living humans in their best outfits. This layer was the most important one because it showed the objective of the afterlife transformation.

“The numerous layers of coffins around the mummy functioned as repeated images of the deceased, but also as protective capsules, similar to the larvae’s pupa before its transformation to a butterfly. Such repeated imagery is a well-known theme in religious art and literature.”

Most coffin nests have been disassembled and scattered to museums all over the world, but Bettum hopes to see more international collaboration to reassemble coffins. He said a project like this could be fascinating to the public and could rekindle interest in ancient Egyptian culture.

“So far, national legislation and interests have unfortunately served as barriers to such cooperation,” he concludes.

Gaia Arrived In French Guiana Today

ESA
ESA’s billion-star surveyor, Gaia, departed yesterday evening from Toulouse and arrived early this morning in French Guiana. Gaia will be launched later this year from Europe’s Spaceport in Kourou on a five-year mission to map the stars of the Milky Way with unprecedented precision.
Built by Astrium in Toulouse, the Gaia spacecraft took off on board an Antonov 124 heavy-lift aircraft at 20.00 yesterday from Toulouse airport with the destination of Cayenne, the capital of French Guiana. The spacecraft will now be transported by truck to Europe’s Spaceport in Kourou, 64 km from Cayenne.
“This is a very exciting day for the Gaia mission and all the teams involved, who have worked for years to get to where we are today,” says Giuseppe Sarri, ESA’s Gaia project manager. “Arriving in Kourou and starting the launch campaign is a great achievement.”
Gaia’s main goal is to create a highly accurate 3D map of our galaxy, the Milky Way, by repeatedly observing a billion stars to determine their precise positions in space and their motions through it.
A billion stars is roughly 1% of all the stars spread across the Milky Way, providing a representative sample from which the properties of the whole galaxy can be measured. Gaia will measure these stars from an orbit around the Sun, near a location known as the L2 Lagrangian point, some 1.5 million km beyond Earth’s orbit.
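The figures above imply the usual order-of-magnitude estimate for the Milky Way's stellar population; a trivial sketch (the Earth-Moon distance below is a standard reference value added for scale, not a number from this article):

```python
# Back-of-envelope numbers from the paragraph above.
sampled_stars = 1_000_000_000     # Gaia's target sample
sample_fraction = 0.01            # "roughly 1%" of the galaxy's stars

implied_total = sampled_stars / sample_fraction
print(f"Implied stars in the Milky Way: {implied_total:.0e}")   # ~1e+11

# The quoted L2 distance, compared with the mean Earth-Moon distance
# (384,400 km, a standard value not taken from the article):
l2_distance_km = 1_500_000
earth_moon_km = 384_400
print(f"L2 sits about {l2_distance_km / earth_moon_km:.1f} times "
      f"farther out than the Moon")
```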
Other measurements will assess the vital physical properties of each star, including its temperature, luminosity and composition.
The resulting census will allow astronomers to determine the origin and the evolution of our galaxy.
Gaia will also uncover tens of thousands of previously unseen objects, including asteroids in our Solar System, planets around nearby stars, and exploding stars – supernovae – in other galaxies.
Sarri, who also flew on the Antonov aircraft with Gaia, said that the flight from Europe to South America went smoothly. “We are now looking forward to the coming weeks of final preparation, which we will undertake with the same care and determination that the teams have shown so far when building the spacecraft.”
On 28 August, a second Antonov 124 aircraft will carry Gaia’s sunshield and most of the ground support equipment from Toulouse to Cayenne. At that point, all the spacecraft parts and equipment will have arrived in French Guiana, leading towards the launch later this year.

NASA Reveals New Images, Video Of Asteroid Redirect Mission

[WATCH VIDEO: Asteroid Redirect Mission Concept Animation]

Lee Rannals for redOrbit.com – Your Universe Online

NASA has released a new animation along with new photos of the space agency’s proposed asteroid redirect mission.

The space agency said it is developing the first-ever mission to identify, approach, capture and redirect a small asteroid into a stable orbit in the lunar vicinity. One part of this mission would be performed by an asteroid capture vehicle, while another would involve a two-person crew rendezvousing with the already-captured asteroid.

Images released by NASA show the asteroid capture vehicle in a “stowed” configuration. The vehicle would release an object that stretches out like a vacuum cleaner tube for the asteroid to go into. Once the asteroid is inside the cylinder-shaped device, the robot collapses the tube-like object around it similar to a net, capturing the asteroid.

NASA is developing a cutting-edge solar-electric propulsion thruster that uses xenon ions for propulsion to help redirect the asteroid. An earlier version of this propulsion engine has been flying on NASA’s Dawn mission to the asteroid belt.

“This mission represents an unprecedented technological feat and allows NASA to affordably pursue the Administration’s goal of visiting an asteroid by 2025,” NASA said. “It raises the bar for human exploration and discovery while taking advantage of the diverse talents at NASA.”

The video depicts a manned mission heading towards a near-Earth asteroid aboard an Orion spacecraft. During the journey, the animation shows the crew relying on a lunar gravity assist in order to gain momentum to rendezvous with the asteroid.

After rendezvousing with the asteroid, crew members would connect NASA’s Orion spacecraft to the robotic asteroid capture vehicle and perform a spacewalk to collect samples to return to Earth. The trip from Earth to the captured asteroid would take Orion and its two-person crew about nine days to complete.

The space agency said it is creating an asteroid mission baseline concept to develop further in 2014 to help engineers establish more details about the mission. NASA scientists will continue to evaluate several alternatives for consideration throughout mission planning.

NASA brought together agency leaders in July for an internal review of the multiple concepts and alternatives proposed for each phase of an asteroid mission. The experts also assessed technical and programmatic aspects of the mission. NASA said it is assessing more than 400 responses it has received from universities and the public on ideas for the potential asteroid mission.

The asteroid initiative capitalizes on activities across the agency’s human exploration, space technology and science efforts. The space agency said it is enhancing its ongoing efforts to identify and characterize near-Earth objects for investigation, and to find potentially hazardous asteroids and targets for capture.

Images Below:

(LEFT) In this conceptual image, the two-person crew uses a translation boom to travel from the Orion spacecraft to the captured asteroid during a spacewalk. Credit: NASA

(RIGHT) This concept image shows an astronaut preparing to take samples from the captured asteroid after it has been relocated to a stable orbit in the Earth-moon system. Hundreds of rings are affixed to the asteroid capture bag, helping the astronaut carefully navigate the surface. Credit: NASA

Grandmothers More Likely To Have Depression When They Raise Grandkids

Brett Smith for redOrbit.com – Your Universe Online

A new study from Case Western Reserve University in Cleveland found grandmothers who are a household’s primary caregiver are more likely to suffer from depression, but are highly receptive to assistance.

In one of the longest-running studies of its kind, researchers focused on grandmothers in a variety of family situations, from being full-time caregivers to having no direct role in the care of their grandchildren.

“Although we expected the primary caregiver grandmothers raising grandchildren would have more strain and depressive symptoms,” said co-author Carol Musil, a professor of nursing at Case Western, “we were surprised at how persistent these were over the years examined in the study.”

According to Census data, 5.3 percent of all American households have a grandparent living in the house. Musil said more than 1 million grandmothers have the responsibility of directly raising grandchildren because the children’s parents do not live in the home.

In the study, which was published in the journal Nursing Outlook, the researchers followed 240 randomly selected grandmothers over six-and-a-half years to see how caring for their grandchildren, aged 16 years and younger, affected their well-being. For the first three years, participants, who averaged almost 58 years old, were surveyed about their physical and mental health annually. An additional two surveys were conducted between 2 and 2.5 years apart.

Participants were divided into three caregiving situations: full-time caregivers for their grandchildren, members of multigenerational households, and non-caregivers. The women came from various backgrounds representing rural, suburban and urban Ohio.

While participants showed signs of depression and stress, researchers found that a full-time caregiver situation had no effect on a grandmother’s resourcefulness, and these women were generally open to receiving a variety of assistance.

Musil said the study showed that grandmothers in a highly stressful situation might be open to resourcefulness training, which has been shown to reduce depressive symptoms in pilot studies.

“They need support from others,” she said, “but the most important thing is to maintain and perhaps develop new cognitive and behavioral skills and approaches for handling some very challenging family issues.”

The Case Western study comes just after another study presented at the Sociological Association’s annual meeting in New York City indicated that a strong grandparent-grandchild bond can benefit both generations.

“We found that an emotionally close grandparent-adult grandchild relationship was associated with fewer symptoms of depression for both generations,” said study co-author Sara M. Moorman, an assistant professor in the Department of Sociology and the Institute on Aging at Boston College. “The greater emotional support grandparents and adult grandchildren received from one another, the better their psychological health.”

“Grandparents who experienced the sharpest increases in depressive symptoms over time received tangible support, but did not give it,” Moorman added. “There’s a saying, ‘It’s better to give than to receive.’ Our results support that folk wisdom — if a grandparent gets help, but can’t give it, he or she feels badly.”

This study was based on tracking the mental health of almost 380 grandparents and 340 grandchildren from 1985 to 2004 using surveys that were taken every few years.

NASA Air Pollution Mission To Fly Over Houston

NASA

A multi-year airborne science mission is on its way to Texas to help scientists better understand how to measure and forecast air quality from space.

Two NASA aircraft equipped with scientific instruments will fly over the Houston area throughout September of 2013. One aircraft will fly as low as 1,000 feet off the ground.

The aircraft are part of NASA’s five-year DISCOVER-AQ study, which stands for Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality.

Researchers are working to improve the ability of satellites to consistently observe air quality in the lowest part of the atmosphere. If scientists could better observe pollution from space, they would be able to make better air quality forecasts and more accurately determine where pollution is coming from and why emissions vary.

A fundamental challenge for space-based instruments monitoring air quality is to distinguish between pollution high in the atmosphere and pollution near the surface where people live. DISCOVER-AQ will make measurements from aircraft in combination with ground-based monitoring sites to help scientists better understand how to observe ground-level pollution from space.

“DISCOVER-AQ is collecting data that will prepare us to make better observations from space, as well as determine the best mix of observations to have at the surface when we have new satellite instruments in geostationary orbit,” said James Crawford, the mission’s principal investigator at NASA’s Langley Research Center in Hampton, Va. “NASA is planning to launch such an instrument, called TEMPO, in 2019.”

Because many countries, including the United States, have large gaps in ground-based networks of air pollution monitors, experts hope satellites can provide a more complete geographic perspective on the distribution of pollutants.

A fleet of Earth-observing satellites, called the Afternoon Constellation or “A-train,” will pass over the DISCOVER-AQ study area daily in the early afternoon. The satellites’ data, especially from NASA’s Aqua and Aura spacecraft, will give scientists the opportunity to compare the view from space with that from the ground and aircraft.

“The A-Train satellites have been useful in giving us a broader view of air pollution than we’ve ever had before,” said Kenneth Pickering, DISCOVER-AQ’s project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Md. “DISCOVER-AQ will help scientists interpret that data to improve air-quality analysis and regional air quality models.”

Flights are scheduled to start Sept. 4 and continue through the month. A four-engine P-3B turboprop plane from NASA’s Wallops Flight Facility in Wallops Island, Va., will carry eight instruments. A two-engine B200 King Air aircraft from NASA Langley will carry two instruments.

Sampling will focus on the Houston metropolitan area ranging from Conroe in the north to Galveston in the south.

The flight path is designed to pass over and complement the air quality information gathered at ground measurement sites operated by the Texas Commission on Environmental Quality and the City of Houston. Many of these sites will be augmented with additional measurements by DISCOVER-AQ and collaborating scientists sponsored by the Texas Air Quality Research Program.

The 117-foot-long P-3B will sample the composition of air outside the aircraft as it spirals between altitudes from 15,000 feet to as low as 1,000 feet over the ground sites. The smaller B200 King Air will collect data looking downward from an altitude of 26,000 feet. The plane’s instruments will look down at the surface, much like a satellite, and measure particulate and gaseous air pollution. The two airplanes will fly from NASA Johnson’s facility at Ellington Field.

The DISCOVER-AQ mission is a partnership with the U.S. Environmental Protection Agency and National Oceanic and Atmospheric Administration. Other academic partners include the National Center for Atmospheric Research; the University of Maryland in College Park and Baltimore County; University of Colorado, Boulder; University of California-Berkeley; Pennsylvania State University, State College; University of Innsbruck in Austria; and Millersville University, Millersville, Penn.

Local partners for the Houston campaign include the Texas Commission on Environmental Quality, the City of Houston, University of Houston, University of Texas, Rice University, Baylor University, and Aerodyne Research Inc.

DISCOVER-AQ is an Earth Venture mission, part of the Earth System Science Pathfinder program managed at Langley for the Earth Science Division of NASA’s Science Mission Directorate in Washington.

For more information on DISCOVER-AQ, visit: http://www.nasa.gov/mission_pages/discover-aq/ and http://discover-aq.larc.nasa.gov/

Omate Bringing Standalone Smartwatch To The Tech Table

Michael Harper for redOrbit.com – Your Universe Online
While tech giants like Apple, Google and Microsoft are only rumored to be releasing wearable computing devices, namely “smartwatches,” smaller companies are already producing and selling these devices.
Now, just weeks before Samsung is expected to release its own “Galaxy Gear” Android smartwatch, another Kickstarter-funded digital wrist companion is looking to move forward in production. The Omate TrueSmart is a water resistant watch that prides itself on being able to operate as a separate, 1.54-inch Android device on your wrist.
The Omate Kickstarter campaign began yesterday and has already earned more than the $100,000 goal. At the time of this writing, 853 backers have donated $165,150 to the campaign, which is slated to run until September 20.
Though none of the smartwatches from the big players in technology have been released or even shown off, many expect these devices to operate in the same way. Like those few options that exist from Pebble, CooKoo and others, smartwatches are generally expected to pair with a smartphone via Bluetooth, display text messages and emails, and even answer or make telephone calls.
The Omate TrueSmart watch can perform each of these tasks, but separates itself by promising to act as a standalone Android device without having to be paired to a smartphone. This means the Android 4.2 device can act as the tiniest of smartphones, handling voice calls, sending text messages and even checking into social networking sites directly from the wrist.
Wearers can navigate through the menus with multi-touch gestures or their voice; a 1.3 GHz dual core processor powers Android apps built for the tiny screen. The Kickstarter video shows users pulling up a tiny version of Facebook for Android, though they don’t show scrolling through a news feed or scanning through pictures of friends.
Like other smartwatches, rumored or realistic, the TrueSmart can pair with smartphones via a Bluetooth 4.0 connection. The thick, waterproof watch also has Wi-Fi radios built in to connect to home networks. Furthering the end goal of building a standalone Android device, Omate also plans to pack this device with a five-megapixel camera.
Smartwatches don’t live in pockets or on desktops; therefore, those building or looking to build smartwatches have had to focus on making their offerings resistant to the elements, namely moisture and water. The Omate TrueSmart has the look of a heavy-duty, water-resistant device and has indeed been built to meet the IP67 water resistance standard. This, says the Omate team, means the TrueSmart can survive thunderstorms, multiple hand washings, and even those pesky “redbull spills.”
Though the device is capable of standing up to the ever present threat of energy drink spills, weather and hand washing, the multi-touch display will not work while the screen is wet.
The Omate team says it is “more ready than other crowd-funding projects you have seen,” with a final design waiting to be produced. With the money raised on Kickstarter, the company hopes to produce the hardware and persuade Android developers to start building apps for the TrueSmart. The campaign will end later next month, and Omate hopes to ship its first round of production units as early as October; the second round is planned to ship in November.

Our Aging Process Is Hastened Due To Maternal Genetics

April Flowers for redOrbit.com – Your Universe Online

There are many reasons our bodies age. The process is determined by an accumulation of various kinds of cell damage that impair the function of bodily organs.

A new study from the Karolinska Institutet and the Max Planck Institute for Biology of Aging shows that the damage occurring in the cell’s power plant – the mitochondrion – is of particular importance in the aging process, and that it is determined as much by the genes we inherit from our mothers as by the damage accumulated over a lifetime. The findings of the study were published in a recent issue of Nature.

“The mitochondrion contains its own DNA, which changes more than the DNA in the nucleus, and this has a significant impact on the aging process,” says Nils-Göran Larsson, professor at Karolinska Institutet and principal investigator at the Max Planck Institute for Biology of Aging. “Many mutations in the mitochondria gradually disable the cell’s energy production.”

The mitochondrion is located in the cell and generates most of the cell’s supply of ATP, which is used as a source of chemical energy.

This study represents the first time researchers have shown the aging process is influenced by DNA inherited from an individual’s mother, as well as the accumulation of mitochondrial DNA damage during a person’s lifetime.

“Surprisingly, we also show that our mother’s mitochondrial DNA seems to influence our own aging,” said Larsson. “If we inherit mDNA with mutations from our mother, we age more quickly.”

DNA, whether normal or damaged, is passed down between generations. The question of whether it is possible to affect the degree of mDNA damage through lifestyle intervention, however, is yet to be investigated. For now, all the researchers know is mild mDNA damage transferred from the mother contributes to the aging process.

“The study also shows that low levels of mutated mDNA can have developmental effects and cause deformities of the brain,” said Jaime Ross, PhD, at the Karolinska Institutet.

“Our findings can shed more light on the aging process and prove that the mitochondria play a key part in aging; they also show that it’s important to reduce the number of mutations,” said Larsson.

“These findings also suggest that therapeutic interventions that target mitochondrial function may influence the time course of aging,” said Barry Hoffer, MD, PhD, from the Department of Neurosurgery at University Hospitals Case Medical Center and Case Western Reserve University School of Medicine. Hoffer is also a visiting professor at the Karolinska Institutet.

“There are various dietary manipulations and drugs that can up-regulate mitochondrial function and/or reduce mitochondrial toxicity. An example would be antioxidants. This mouse model would be a ‘platform’ to test these drugs/diets,” said Dr. Hoffer.

The team plans to continue their research on mice, and expand to using fruit flies, to investigate whether reducing the number of mutations can extend their lifespan.

Data Mining Study Explores Risk Factors Associated With Cardiovascular Disease

redOrbit Staff & Wire Reports – Your Universe Online

In the absence of other contributing factors, men between the ages of 48 and 60 are at moderate to severe risk of suffering a heart attack, according to new research appearing in the International Journal of Biomedical Engineering and Technology.

By comparison, in the absence of other risk factors (such as alcohol consumption, smoking and high blood cholesterol), women over the age of 50 only face a mild cardiac risk, Subhagata Chattopadhyay of the Camellia Institute of Engineering in Kolkata and his colleagues wrote in their paper.

Cholesterol levels, alcohol intake and passive smoking continue to be the most important risk factors when it comes to mild, moderate and severe heart disease risk, the study authors discovered. Those conclusions are the result of a data mining exercise used to construct a risk model for heart attacks.

As part of their research, Chattopadhyay’s team used 300 real-world sample patient cases with varying levels of cardiac risk, culling the results based on twelve predisposing factors: age, gender, alcohol abuse, cholesterol level, smoking (active and passive), physical inactivity, obesity, diabetes, family history, and prior cardiac event.

Determining a person’s risk of experiencing an adverse health event such as a heart attack is difficult, as clinical history, symptoms and warning signs typically do not follow a set path. The risk level tends to vary from patient to patient, and the interpretations of the diagnoses rarely conform to the rules of epidemiology.

Using computational data mining techniques makes it possible for experts to extract useful information from real-world clinical data, the researchers explained. It could also help eliminate the subjectivity of clinical prognosis to some degree, allowing the epidemiology to work more precisely at the patient level.

Chattopadhyay’s study is not the first to apply these types of data mining techniques. Previous studies reportedly had issues because data classification was based on decisions made by doctors, exposing the records to the very subjectivity the researchers hoped to avoid in this study.

“The essence of this work essentially lies in the introduction of clustering techniques instead of purely statistical modeling, where the latter has its own limitations in ‘data-model fitting’ compared to the former that is more flexible,” Chattopadhyay explained.

“The reliability of the data used, should be checked, and this has been done in this work to increase its authenticity,” he added. “I reviewed several papers on epidemiological research, where I’m yet to see these methodologies used.”
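The clustering approach Chattopadhyay describes can be sketched in miniature. Below is a toy k-means pass that groups hypothetical patient feature vectors into three risk tiers; the features, values, and tier count are illustrative assumptions, not the study's actual model or data:

```python
# Toy k-means sketch of clustering patients into risk tiers.
# All features and values below are hypothetical; the study's real
# model used many more factors (age, gender, smoking, cholesterol, etc.).

def kmeans(points, centroids, iterations=10):
    """Assign each point to its nearest centroid, then recompute centroids."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            distances = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[distances.index(min(distances))].append(p)
        centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else c
            for cluster, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# Each point: (normalized age, normalized cholesterol) -- illustrative only.
patients = [(0.2, 0.1), (0.25, 0.15), (0.6, 0.5), (0.65, 0.55), (0.9, 0.9), (0.95, 0.85)]
initial = [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)]  # one seed centroid per risk tier
centroids, clusters = kmeans(patients, initial)
print([len(c) for c in clusters])  # two patients land in each tier
```

The appeal of clustering over purely statistical modeling, as the quote above suggests, is that the data define the risk groups rather than a fitted model imposed in advance.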

Zombie-Like Vortex Gives Final Push To Birth New Star

Brett Smith for redOrbit.com – Your Universe Online

For years, most astronomers have agreed on the basic steps that lead to star formation, except one – how a cloud of swirling gas can slow down enough to concentrate into something capable of nuclear fusion.

A new study from the University of California, Berkeley has found evidence of “zombie vortices” within a gestating star leading to a final push that gives birth to a new star.

According to prevailing theories, stars begin as dense clouds of gas that slowly collapse into clumps. These clumps begin to spin into one or more disks, referred to as protostars. For a protostar to become larger, the disk has to lose some of its spin so that the gas can spiral inward, eventually creating enough mass to ignite through a nuclear reaction.

“After this last step, a star is born,” said Philip Marcus, a professor in the Department of Mechanical Engineering and co-author of the new Berkeley study published in the journal Physical Review Letters.

In the study, the Berkeley researchers focused on how the spinning disk of gas loses its angular momentum in order to kick off this last step. One theory posits that magnetic forces destabilize the disks enough to slow momentum. However, the Berkeley team asserts that the gas needs to be charged to interact with a magnetic field and parts of a protoplanetary disk are too cold to accept a charge.

“Current models show that because the gas in the disk is too cool to interact with magnetic fields, the disk is very stable,” Marcus said. “Many regions are so stable that astronomers call them dead zones – so it has been unclear how disk matter destabilizes and collapses onto the star.”

Unlike the newly developed Berkeley model, prevailing models do not account for changes in a disk’s gas density based upon its height, the researchers said.

“This change in density creates the opening for violent instability,” said co-author Pedram Hassanzadeh, a geophysical fluid dynamics expert currently with Harvard University.

When the Berkeley team accounted for density change in their computer models, vortices emerged within the disk. These vortices spawned more vortices, culminating in the disruption of the disk’s angular momentum.

“Because the vortices arise from these dead zones, and because new generations of giant vortices march across these dead zones, we affectionately refer to them as ‘zombie vortices,’” Marcus said. “Zombie vortices destabilize the orbiting gas, which allows it to fall onto the protostar and complete its formation.”

The researchers noted that these types of vortices are already found throughout nature, from Jupiter’s Great Red Spot to tornadoes spun off by violent storms.

The Berkeley researchers said they plan to apply their findings to more detailed computer models that include velocities, temperatures and densities of known protostar disks.

“Other research teams have uncovered instabilities in protoplanetary disks, but part of the problem is that those instabilities required continual agitations,” said Richard Klein, a theoretical astrophysicist at UC Berkeley and Lawrence Livermore National Laboratory who is working with the study researchers on the next round of models. “The nice thing about the zombie vortices is that they are self-replicating, so even if you start with just a few vortices, they can eventually cover the dead zones in the disk.”

Twitter’s Video Sharing Vine App Passes 40 Million Users

Michael Harper for redOrbit.com – Your Universe Online

Twitter-owned Vine, the app that pioneered the sharing of six-second looping videos, announced yesterday in a tweet that it has passed 40 million users.

Months after Vine's launch, Facebook-owned Instagram released its own video-sharing service, which gave users 15 seconds in which to record and share their clips. Though many worried Vine users would stage a mass exodus to Instagram, the announcement indicates those concerns may not have been warranted.

When asked, Vine company officials said the service has now reached 40 million registered users, not active users. This means that although 40 million people have registered Vine accounts, the number who actively use the app each month is certainly lower. Instagram, on the other hand, has 130 million active users who do not need to download a separate app to begin sharing video; their friends’ videos simply arrive in their familiar feed.

“We’ve said this before and we’ll say it again: this community – now more than 40 million of you – is amazing. Thank you for inspiring us,” reads the Tweet from @vineapp.

Though the 40 million figure applies only to registered users, it doesn’t mean Vine has stalled. Vine launched in January of this year and by June (the same month Instagram launched its video service) had earned some 13 million users. In less than two months, Vine has roughly tripled its registered user count, an impressive feat no matter how one sees it.

When asked, Instagram described its active accounts as those who use the service at least once a month. By using this metric, the video-sharing service claims it has 130 million active users. The social site would not say how many of these millions of accounts use the new video service.

Perhaps more helpful in boosting Vine’s numbers since June was the release of the app on Android. Like Instagram, Vine launched first and exclusively on iPhone. In June, it released the long-awaited Android version of the Vine app, thereby opening up its doors to millions of other users. Also like Instagram and other iOS-first apps, Vine for Android launched with fewer features and some bugs not present in the iPhone version. Since then it has been improving the app to bring it in line with the existing iOS app.

In addition to announcing 40 million users, Vine cofounder Dom Hoffman gave an interview to NPR explaining how the company ultimately settled on six-second videos, a seemingly arbitrary length. After experimenting with various lengths, anywhere from five to ten seconds, the Vine team realized six seconds was the proverbial “just right” amount of time.

“One day we did wake up and say, six seconds,” joked Hoffman, saying it took a while to arrive at this conclusion. The other catchy thing about Vine is the loop. Though videos are only six seconds long, they loop continually until a user stops them or navigates away.

“The next thing that we noticed was that the videos start quickly but they also end very quickly and that felt anti-climactic,” said Hoffman, describing how they decided to add the looping feature to the service.

Cycads Evolved To Grow In Groves With Seed Dispersal By Large Frugivores

April Flowers for redOrbit.com – Your Universe Online

The ancient cycad lineage predates the age of dinosaurs. Cycads also co-existed more recently with large herbivorous mammals, such as the Ice Age megafauna that went extinct only a few tens of thousands of years ago. Modern cycads have large, heavy seeds with a fleshy outer coat, suggesting they rely on large-bodied, fruit-eating animals to disperse their seeds. However, there is little evidence that modern large-bodied animals like emus or elephants are eating and dispersing the seeds.

Researchers John Hall and Gimme Walter of the University of Queensland, Australia, questioned how these plants could still be around today if they are adapted for dispersal by a set of animals that has been missing from Earth’s fauna for tens of thousands of years. Their findings, published in the American Journal of Botany, propose the clumped dispersal mechanism these ancient plants most likely relied upon still serves them well today.

Cycad fossils have been recorded from around 280 million years ago – approximately the time the coniferous forests first arose. The ecological distribution pattern suggests the seed dispersal of many living cycads today is limited and ineffectual. Macrozamia miquelii, for example, is a cycad endemic to Australia that is found in dense, highly clumped populations, where it dominates the understory. Inexplicably, large areas of seemingly suitable habitat separate colonies from one another. Researchers say these patterns suggest few to none of the seeds are dispersed at any distance from the parent plants, contrary to one of the long-standing tenets about the advantages of seed dispersal.

The researchers wanted to determine whether the seed dispersal and seedling distribution pattern of M. miquelii might indicate it is maladapted to its current dispersers, so they proposed a new twist on the functional significance of the megafaunal dispersal syndrome.

“Naturalists are very comfortable with the idea of animals gaining a biological advantage by choosing to live together in high density ‘colonies’—such as ant nests or seabird rookeries—in certain parts of the landscape,” notes Hall. “But when it comes to plants, there is a bit of a subconscious assumption that the purpose of seed dispersal is to simply spread seeds as far and as evenly as possible across the broadest possible area.”

The researchers investigated whether cycads might be a type of plant that forms such colonies. “The main idea behind our research,” Hall clarifies, “is to ask the question: when it comes to the spatial ecology of plants, could it be useful to think of some plant species as also forming and maintaining ‘colonies’ or ‘groves’ in the wider landscape?

“Australian cycads once co-existed with megafauna that could have dispersed their large, heavy seeds—such as giant ground birds, bigger than present-day emus, and Diprotodon, a rhino-sized marsupial quadruped,” explains Hall. “The large, heavy and poisonous seeds, surrounded by a fleshy and non-toxic fruit-like layer, seem well adapted to being occasionally swallowed whole en masse by megafauna, which would then pass the many seeds simultaneously at a new location: the genesis of a new grove.”

DISPERSAL CLUES

Female cycads produce one or two cones bearing multiple large seeds, each covered with a thin, fleshy outer layer called a sarcotesta. To track how many seeds were removed from the parent plant and how far they were dispersed, the researchers tagged ten large seeds from a single cone on each of 12 plants with a small steel bolt.

Within three months of tagging, nearly all the seeds had their sarcotesta eaten – mostly by brushtail possums, which scrape off the flesh and discard the large seeds. The dispersers’ identity was confirmed with camera traps at two fruiting females and with hair traps baited with seeds. However, the researchers found 97 percent of the tagged seeds had moved less than three feet from their parent plant. Only a few were moved farther out, and in all cases, the seeds were found less than 15 feet from the parent.

The team also found that although most of the seeds ended up under the parent cycad, there were almost no seedlings within a five-foot radius of the adults, suggesting most seeds close to their parent plant perish.

Despite the large seed size, the researchers say, the primary dispersers for cycads today are smaller-bodied animals that neither spread the seeds far and wide nor carry them to new, colonizable habitats. Even so, the plants seem to be flourishing, sprouting up near the adults and forming mono-dominant stands.

“Since their potential Australian prehistoric megafaunal dispersers became extinct around 45,000 years ago, why haven’t Australian cycads begun to evolve smaller seeds that would be more readily dispersed by flying birds or possums for example, over the interim?” posits Hall.

“We argue that the answer to this question is that cycads are actually disadvantaged by dispersing as lone individuals that may travel long distances but, in so doing, become isolated from others of their kind,” Hall states.

Hall adds that cycad plants are born either male or female. They rely on host-specific insect pollinators, meaning a cycad dispersed alone, far from others of its kind, would probably be at a reproductive disadvantage.

If, as scientists believe, cycads evolved to be dispersed by large-bodied frugivores, these animals most likely deposited many cycad seeds in their dung at once. The plants thus adapted to grow in groves, which works in their favor today despite the loss of their megafaunal dispersers.

“There’s no doubt that cycad ancestors were contemporary with herbivorous dinosaurs for many hundreds of millions of years, so it’s plausible that cycad seed dispersal ecology and ‘colony forming’ behavior may be extremely ancient, and echo the ecology of dinosaur-plant interaction,” he concludes, “but of course we now enter into the realm of speculation.”

Hall’s research into the spatial ecology of “colony” forming plants continues beyond cycads. He plans to explore these issues in other plants and landscapes, especially in forest understories.

California Health Network Program Doubles Hypertension Control Rates

redOrbit Staff & Wire Reports – Your Universe Online

One California health system’s efforts to help high blood pressure patients through a comprehensive treatment program resulted in a reduction in hypertension-related stroke and heart disease risk, claims research published Tuesday in the Journal of the American Medical Association (JAMA).

Through the program, experts from Kaiser Permanente Northern California (a coalition of 21 hospitals and 73 doctors’ offices) managed to nearly double the rate of blood pressure control among adult patients diagnosed between 2001 and 2009. The rate of hypertension control throughout their network increased from 43.6 percent in 2001 to 80.4 percent eight years later, the authors of the JAMA study explained in a statement.

During that time, the national mean control rate increased from 55.4 percent to 64.1 percent, while the California state control rate increased from 63.4 percent in 2006 (the first year for which the data was available) to 69.4 percent in 2009. Those percentages are according to the Healthcare Effectiveness Data and Information Set quality measurement set by the National Committee for Quality Assurance, the researchers noted.
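For comparison, the gains reported above work out as follows in percentage points (the figures below are taken directly from the article):

```python
# Percentage-point gains in hypertension control, per the figures above.
kaiser = {"start": 43.6, "end": 80.4}      # 2001 -> 2009
national = {"start": 55.4, "end": 64.1}    # 2001 -> 2009
california = {"start": 63.4, "end": 69.4}  # 2006 -> 2009

def gain(rates):
    """Percentage-point improvement from the first to the last measured year."""
    return round(rates["end"] - rates["start"], 1)

print(gain(kaiser), gain(national), gain(california))  # 36.8 8.7 6.0
```

Kaiser's 36.8-point gain is roughly four times the national improvement over the same period, which is the contrast the study's authors emphasize.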

“I think there are many parts of this program that would likely be applicable in other primary care settings,” lead author Dr. Marc Jaffe, an endocrinologist at Kaiser Permanente South San Francisco Medical Center, told Reuters Health. “Since the end of the study, the hypertension control rates (at Kaiser) have continued to improve, and as of 2011, our control rates were as high as 87 percent.”

Elements of the Kaiser Permanente Northern California (KPNC) hypertension program include the creation of a comprehensive hypertension registry, the development and sharing of performance metrics, and the establishment of evidence-based guidelines. Furthermore, the approach calls for medical assistant visits for blood pressure measurement, and single-pill combination pharmacotherapy, in which multiple drugs are combined into one pill.

“Two features likely played a big role in the program’s success,” noted Lindsey Tanner of the Associated Press (AP). The first was single-pill combination therapy, which combined the blood pressure drug lisinopril with a diuretic in a less expensive, easier-to-take form of treatment.

The other part of the program Tanner credits for its success is the fact that, starting in 2007, officials at KPNC “began offering free follow-up visits with medical assistants, rather than doctors, checking blood pressure readings. Besides charging no insurance copayment, these brief visits were available at more flexible times, increasing chances that patients would stick with the program.”

Dr. Jaffe called KPNC’s initiative the “first successful, large-scale program” of its kind to be sustained over a long period of time. He noted the study’s success, and the health system’s continued positive results in recent years, “has huge implications for the health of our members” because it reduces their risk of suffering from a stroke or heart disease. Hypertension affects a reported 65 million American adults (or nearly 30 percent of all US residents over the age of 18) and is a major contributor to cardiovascular disease.

Unemployment Restricts Access To Kidney Transplants

Full-time workers more likely to get transplants

People in end-stage kidney failure in need of a kidney transplant are much less likely to be placed on a waiting list for a new kidney or to actually receive a new kidney once on the list if they are unemployed or work part time, according to new collaborative research from the University of New Hampshire.

“There is a strong negative association between a patient’s unemployment and the likelihood of being placed on a waiting list for a kidney transplant, and once on the waiting list, the likelihood of receiving a transplant,” says Robert Woodward, the Forrest D. McKerley Endowed Chair in Health Economics at the University of New Hampshire.

The researchers found that patients who are retired and/or disabled, working part time, or working full time are much more likely to be placed on a transplant waiting list than unemployed patients. They also are more likely to receive a transplant once placed on the list than unemployed patients. Finally, those who work full time are most likely to be both added to the transplant list and receive a kidney transplant.

The research was conducted by Woodward and researchers at the University of Pittsburgh Medical Center, University of Massachusetts Memorial Medical Center, and the Transplant Institute at Beth Israel Deaconess Medical Center. The new research is presented in the journal Clinical Transplantation in the article “Recipient’s unemployment restricts access to renal transplantation.”

Researchers evaluated transplant waiting list information for nearly 430,000 patients in end-stage renal disease from the U.S. Renal Data System and the United Network for Organ Sharing. Researchers investigated the employment status of these patients in relation to the amount of time between being diagnosed with end-stage renal disease and being listed on the transplant list, and the amount of time from when the patient was listed to when s/he received a transplant.

Of the nearly 430,000 patients evaluated for a kidney transplant, about 54,000 were added to the list for a kidney transplant. And of those listed for a kidney transplant, nearly 22,000 actually received a kidney transplant.
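Those rounded counts imply an approximate funnel from evaluation to listing to transplant, which can be sketched as:

```python
# Approximate conversion rates through the transplant funnel,
# using the rounded counts reported in the article.
evaluated = 430_000
listed = 54_000
transplanted = 22_000

listing_rate = listed / evaluated        # share of evaluated patients listed
transplant_rate = transplanted / listed  # share of listed patients transplanted

print(f"{listing_rate:.1%} of evaluated patients listed, "
      f"{transplant_rate:.1%} of those transplanted")
```

Because the article's counts are rounded, the resulting rates (roughly 13 percent listed and 41 percent of those transplanted) are approximations, not figures from the paper itself.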

Although this analysis did not identify why those who are unemployed face restricted access to kidney transplantation, the authors suspect there may be many factors that contribute to transplant centers’ unwillingness to list and transplant poorly employed patients.

The researchers explain that a number of factors are considered when deciding to add someone to a transplant list and move forward with the transplant procedure, including whether the patient can afford the post-transplant medical care and immunosuppressive medications. This post-transplant medical care plays an important part in the success of the transplant and survival of the patient.

“A lack of employment can lead to limited financial resources for the patient and subsequent inadequate medical care following the transplant. Transplant centers have found that patients with limited financial resources have higher rates of noncompliance with post-transplant medical care. Because noncompliance with post-transplant care is a leading cause of rejection, infection, and death, transplant centers may be more hesitant about providing access to transplants to those with limited financial resources,” Woodward says.

Insurance status also may play a key role in whether access to kidney transplants is restricted for patients who are unemployed or employed part time, many of whom may not have insurance due to their employment status. Woodward says there is significant evidence to suggest that lack of health insurance can significantly contribute to noncompliance with post-transplant medical care requirements, which might be responsible for up to 35 percent of unsuccessful kidney transplants.

Those who rely only on Medicare also may experience restricted access to transplant services. According to the researchers, even though Medicare pays for a large portion of the cost of the required immunosuppressant medications, it does not cover non-immunosuppressant prescription drugs that transplant patients are more likely to require. As a result, many transplant centers require secondary, private insurance before a patient is considered for listing on the transplant list.

Finally, Woodward and his fellow researchers say that transplant centers could view employment status as a marker of mental and physical health status, education level, and perceived compliance by the patient with post-transplant care.

Woodward and his co-authors suggest that patients who are more likely to experience barriers to transplants based on employment status could benefit from increased interaction between patients, social workers, and other medical personnel, including case managers and financial specialists. The focus should be on continued employment and vocational rehabilitation, they said.

Lessons From The Grand Canyon: Dams Destabilize River Food Webs

Cary Institute of Ecosystem Studies

Managing fish in human-altered rivers is a challenge because their food webs are sensitive to environmental disturbance. So reports a new study in the journal Ecological Monographs, based on an exhaustive three-year analysis of the Colorado River in Glen and Grand Canyons.

Food webs are used to map feeding relationships. By describing the structure of these webs, scientists can predict how plants and animals living in an ecosystem will respond to change. Coauthor Dr. Emma Rosi-Marshall, an aquatic ecologist at the Cary Institute of Ecosystem Studies, comments, “Given the degraded state of the world’s rivers, insight into food webs is essential to conserving endangered animals, improving water quality, and managing productive fisheries.”

The project – which relied on a team of more than 10 researchers from the Cary Institute of Ecosystem Studies, Montana State University, Idaho State University, University of Wyoming, U.S. Geological Survey, and Loyola University of Chicago – assessed six sites on the Colorado River, many so remote they required two-week boat trips through the canyon.

Study sites were distributed along a 240-mile stretch downstream of Glen Canyon Dam, which was completed in 1963 for water delivery and hydroelectric power needs. During the three-year study, samples of over 3,600 animal diets and 4,200 invertebrate populations were collected and processed. Among the team’s findings: following an experimental flood, sites near the dam had the most dramatic changes in the structure and function of their food webs.

Lead author Dr. Wyatt Cross of Montana State University comments, “Glen Canyon Dam has transformed the ecology of the Colorado River. Immediately downstream, cold, low-sediment waters have favored exotic plants and animals that haven’t co-evolved with native species. We now see reduced biodiversity and novel species interactions that have led to the instability of these river food webs.”

Near Glen Canyon Dam, the researchers found food webs dominated by invasive New Zealand mud snails and non-native rainbow trout, with large mismatches in the food web and only a small percentage of available invertebrates eaten by fish. In contrast, downstream food webs had more native fish species, including the federally listed endangered humpback chub, and fewer invertebrates, which were consumed more efficiently by fish.

In March of 2008, the Department of Interior conducted an experiment that simulated pre-dam flood conditions, providing an opportunity to see how high flows affected food webs with very different characteristics. Rosi-Marshall explains, “Food web stability increased with distance from Glen Canyon Dam, with downstream sites near tributaries proving the most resistant. At these locations, the flood didn’t cause major changes in the structure of food webs or the productivity of species.”

It was a different picture for sites near the dam. As co-author Dr. Colden Baxter, an aquatic ecologist with Idaho State University, notes, “These energy inefficient, simplified food webs experienced a major restructuring following the experimental flood.” New Zealand mud snails were drastically reduced. And changes in algal communities led to a rise in midges and blackflies – favored foods of trout – resulting in a near tripling of non-native rainbow trout numbers.

Rainbow trout, introduced below Glen Canyon Dam in the 1960s, support a valued recreational fishery. But when trout density increases upstream, and fish move downstream into Grand Canyon, they can compete with native fishes for limited food resources, sometimes preying upon juveniles.

“Understanding how and why high flows affect trout numbers is valuable information that decision makers can use to help manage and protect river resources,” remarks Dr. Theodore Kennedy, project coordinator and a coauthor of the study with the U.S. Geological Survey’s Grand Canyon Monitoring and Research Center.

Dr. Robert Hall, an ecologist at the University of Wyoming, notes, “While downstream food webs proved to be more stable in our study, they are clearly a shadow of pre-dam conditions. Four large native fishes have already been lost from the Grand Canyon reach of the Colorado River. And invertebrates that were once an important part of the food web, such as mayflies and net-spinning caddisflies, are conspicuously absent.”

Today, many ecosystems are like the Colorado River: an amalgam of native and non-native species living in human-altered habitat. The study’s authors demonstrated that large-scale modifications, like dams, can have far-reaching effects on how energy flows through food webs, altering their stability and leading to less resilient ecosystems.

Cross concludes, “Looking to the future, we need to develop predictions about how disturbances spread through ecosystems, affecting the species or services upon which we depend, so we can implement proactive strategies.”

This study is a product of the Glen Canyon Dam Adaptive Management Program, a collaborative initiative that supports scientific research in the Grand Canyon as an aid to decision making and management. The USGS Grand Canyon Monitoring and Research Center provides the scientific research and monitoring that informs the Bureau of Reclamation’s management and operation of Glen Canyon Dam.

Fish Species Losing Homes As Sea Anemones Suffer From Bleaching

Brett Smith for redOrbit.com – Your Universe Online

Marine biologists have recently been warning about the dangers of coral bleaching, and new research from a team of international scientists indicates sea anemones are also susceptible to the color-sapping phenomenon, which is thought to result from the death of the creatures’ symbiotic algae.

In addition to being suspected of causing an increase in sea anemone mortality, bleaching also affects clownfish and 27 other fish species that depend on anemones for shelter from predators, according to the team’s report published in the journal PLoS ONE. Experts suspect bleaching occurs when the surrounding water gets too warm for symbiotic algae.

“Our study showed that at least seven of the ten anemone species suffer from bleaching when water temperatures get too high,” said study researcher Ashley Frisch of the ARC Centre of Excellence for Coral Reef Studies at James Cook University. “Importantly, we found bleaching of anemones occurring wherever we looked – from the Red Sea and Indian Ocean to the Indo-Australian region and the Pacific. Sometimes it was on a massive scale.”

In the study, the marine biologists said bleaching most likely causes increased mortality, based on their observations of the sea floor.

“Anemones are naturally tough and live for many years,” Frisch said. “As a result their rates of reproduction are slow – and when they are hit by a killer bleaching event, it can result in their complete loss from an area over a period of time.”

The Australian scientist said the loss of anemones also has a knock-on effect for those fish that depend on them for shelter, like the clownfish from the popular Disney movie ‘Finding Nemo.’

“Bleaching causes the loss of anemonefish, like Nemos, which have nowhere to hide and without the anemones to protect them are quickly gobbled up by predators,” he said. “Also, because the fish appear to perform useful services for the anemone like protecting them from grazing fish, it may also be that the loss of anemonefishes following a bleaching event means the anemones themselves are much less likely to recover.”

In the study, scientists from the United States, Saudi Arabia and Australia observed almost 14,000 anemones around the globe. While only 4 percent of the observed anemones were bleached overall, bleaching rates ranged from 20 to 100 percent following five major bleaching events.

The team concluded the anemone “population viability will be severely compromised if anemones and their symbionts cannot (acclimate) or adapt to rising sea temperatures” in some areas.

“Anemone bleaching also has negative effects to other species,” the researchers noted, “including reductions in abundance and reproductive output of anemonefishes.

“Therefore, the future of these iconic and commercially valuable coral reef fishes is inextricably linked to the ability of host anemones to cope with rising sea temperatures associated with climate change,” the report said.

“If host anemones (and their symbiotic algae) cannot acclimate or adapt to rising sea temperatures, then populations of host anemones and associated anemonefishes are anticipated to decline significantly,” the report concluded.

Frisch said anemones and the fish they protect also have value with respect to tourism and the aquarium trade. Numerous poor coastal communities rely on the income they generate, he said.

Psychedelic Drugs Don’t Make You Crazy, Says Study

Michael Harper for redOrbit.com – Your Universe Online

LSD and other hallucinogenic drugs have been a controversial topic since their effects were first discovered in the 1940s. The euphoric emotions and experiences felt by those under their influence have led some researchers to believe that using these drugs can lead to mental illness and other negative long-term effects.

A study from the Norwegian University of Science and Technology (NTNU), however, now claims LSD and other psychedelics are not linked to mental illnesses. On the contrary, clinical psychologist Pål-Ørjan Johansen and researcher Teri Krebs say there’s a link between the hallucinogens and fewer mental health issues. Their research is now publicly available in the journal PLOS ONE.

Those who experiment with psychedelics often hail them as life-changing and report having spiritual experiences while under their influence. Apple co-founder Steve Jobs was an outspoken proponent of LSD in his early days and said his experimentation with the drug marked a turning point in his life. On the other hand, Beach Boys songwriter Brian Wilson blamed the drugs for the mental anguish he experienced for many years thereafter, and Syd Barrett, Pink Floyd's legendary guitarist, also reportedly experienced mental issues after taking these drugs.

To conduct their research, Krebs and Johansen borrowed data from the 2001–2004 National Survey on Drug Use and Health conducted in the United States. There, participants were asked about their mental health and any treatments they had received over the past 12 months. Specifically, the NTNU researchers were looking for reported cases of general psychological distress, anxiety disorders, mood disorders and psychosis.

Of those surveyed, 22,000 reported taking psychedelics at least once. After analyzing the data, the researchers discovered that those who had taken the drugs as recently as the previous year were no more likely to be diagnosed with mental illness. Nor was there any link between mental illness and lifetime use of acid, peyote or psilocybin mushrooms. In fact, Krebs and Johansen found, if anything, these hallucinogens were associated with fewer mental health issues: those who continued to use the drugs were less likely to find themselves in a mental health facility or take psychiatric medications.

Though their results showed no direct link between taking psychedelics and developing mental illness, the researchers have not ruled out the possibility of negative side effects.

“We cannot exclude the possibility that use of psychedelics might have a negative effect on mental health for some individuals or groups, perhaps counterbalanced at a population level by a positive effect on mental health in others,” they wrote in their research. On the other hand, notes Johansen, other clinical trials have also been unable to discover a link between the drugs and mental illness, and numerous anecdotal cases continue to report long-lasting benefits and life-changing experiences.

“Other studies have found no evidence of health or social problems among people who had used psychedelics hundreds of times in legally-protected religious ceremonies,” said Johansen.

Previous studies have found some positive benefits associated with taking psychedelic drugs. Because hallucinogens can help people access and relive lost or repressed memories, those who struggle with Post-Traumatic Stress Disorder (PTSD) have been given these drugs experimentally to help them work through the disorder.

One hallucinogen, called ibogaine, has been known to give users a 36-hour trip which allows them to access their emotions and relive experiences. This drug – and the resulting trip – has also been shown to help break people of their addictions to alcohol, cocaine, heroin and methamphetamine.

Australia Floods Dampen Global Sea Level Rise

When enough raindrops fall over land instead of the ocean, they begin to add up.

New research led by the National Center for Atmospheric Research (NCAR) shows that when three atmospheric patterns came together over the Indian and Pacific oceans, they drove so much precipitation over Australia in 2010 and 2011 that the world’s ocean levels dropped measurably. Unlike other continents, the soils and topography of Australia prevent almost all of its precipitation from running off into the ocean.

The 2010-11 event temporarily halted a long-term trend of rising sea levels caused by higher temperatures and melting ice sheets.

Now that the atmospheric patterns have snapped back and more rain is falling over tropical oceans, the seas are rising again. In fact, with Australia in a major drought, they are rising faster than before.

“It’s a beautiful illustration of how complicated our climate system is,” says NCAR scientist John Fasullo, the lead author of the study. “The smallest continent in the world can affect sea level worldwide. Its influence is so strong that it can temporarily overcome the background trend of rising sea levels that we see with climate change.”

The study, with co-authors from NASA’s Jet Propulsion Laboratory and the University of Colorado at Boulder, will be published next month in Geophysical Research Letters. It was funded by the National Science Foundation, which is NCAR’s sponsor, and by NASA.

Consistent rising, interrupted

As the climate warms, the world’s oceans have been rising in recent decades by just more than 3 millimeters (0.1 inches) annually. This is partly because the heat causes water to expand, and partly because runoff from retreating glaciers and ice sheets is making its way into the oceans.

But for an 18-month period beginning in 2010, the oceans mysteriously dropped by about 7 millimeters (about 0.3 inches), more than offsetting the annual rise.
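As a rough sketch of the arithmetic (using only the round figures quoted in this article, not the study's actual altimetry series), the 18-month anomaly represents a swing of more than 11 mm relative to the long-term trend:

```python
# Back-of-envelope arithmetic using the round figures quoted in the
# article; the study's actual altimetry time series is more detailed.
trend_mm_per_yr = 3.0     # long-term rise of "just more than 3 millimeters" per year
period_yr = 1.5           # the 18-month anomaly beginning in 2010
observed_drop_mm = 7.0    # the reported ~7 mm global fall

expected_rise_mm = trend_mm_per_yr * period_yr            # rise had the trend continued
swing_vs_trend_mm = expected_rise_mm + observed_drop_mm   # total departure from trend

print(f"Expected rise over 18 months: {expected_rise_mm:.1f} mm")   # 4.5 mm
print(f"Swing relative to trend: {swing_vs_trend_mm:.1f} mm")       # 11.5 mm
```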

Fasullo and his co-authors published research last year demonstrating that the reason had to do with the increased rainfall over tropical continents. They also showed that the drop coincided with the atmospheric oscillation known as La Niña, which cooled tropical surface waters in the eastern Pacific and suppressed rainfall there while enhancing it over other portions of the tropical Pacific, Africa, South America, and Australia.

But an analysis of the historical record showed that past La Niña events only rarely accompanied such a pronounced drop in sea level.

Using a combination of satellite instruments and other tools, the new study finds that the picture in 2010–11 was uniquely complex. A rare combination of two other semi-cyclic climate modes came together to drive such large amounts of rain over Australia that the continent, on average, received almost one foot (300 millimeters) of rain more than average.

The initial effects of La Niña were to cool surface waters in the eastern Pacific Ocean and push moisture to the west. A climate pattern known as the Southern Annular Mode then coaxed the moisture into Australia’s interior, causing widespread flooding across the continent. Later in the event, high levels of moisture from the Indian Ocean driven by the Indian Ocean Dipole collided with La Niña-borne moisture in the Pacific and pushed even more moisture into the continent’s interior. Together, these influences spurred one of the wettest periods in Australia’s recorded history.

Australia’s vast interior, known as the Outback, is ringed by coastal mountains and is often quite dry. Because of the low-lying nature of the continent’s eastern interior and the lack of river runoff in its dry western environment, most of the heavy rainfall of 2010–11 remained inland rather than flowing into the oceans. While some of it evaporated in the desert sun, much of it sank into the dry, granular soil of the Western Plateau or filled the Lake Eyre basin in the east.

“No other continent has this combination of atmospheric set-up and topography,” Fasullo says. “Only in Australia could the atmosphere carry such heavy tropical rains to such a large area, only to have those rains fail to make their way to the ocean.”

Measuring the difference

To conduct the research, the scientists turned to three cutting-edge observing instrument systems:

- NASA’s Gravity Recovery and Climate Experiment (GRACE) satellites, which make detailed measurements of Earth’s gravity field. The satellites enable scientists to monitor changes in the mass of continents.
- The Argo global array of 3,000 free-drifting floats, which measure the temperature and salinity of the upper 6,000 feet of the world’s oceans.
- Satellite-based altimeters that are continuously calibrated against a network of tide gauges. Scientists subtract seasonal and other variations to closely estimate global sea level changes.

Using these instruments, the researchers found that the mass in Australia and, to a lesser extent, South America began to increase in 2010 as the continents experienced heavy and persistent rain. At the same time, sea levels began to measurably drop.

Since 2011, when the atmospheric patterns shifted out of their unusual combination, sea levels have been rising at a faster pace of about 10 millimeters (0.4 inches) per year.

Scientists are uncertain how often the three atmospheric events come together to cause such heavy rains over Australia. Fasullo believes there may have been a similar event in 1973-74, another time of record flooding on that continent. But modern observing instruments did not exist then, making it impossible to determine what took place in the atmosphere and whether it affected sea level rise.

“Luckily, we’ve got great observations now,” Fasullo says. “We need to maintain these observing platforms to understand what is a complicated climate system.”


Massive Fortifications Unearthed Along Ancient Israel Harbor

Lawrence LeBlond for redOrbit.com – Your Universe Online

Archaeologists digging along the Israeli coastal city of Ashdod have unearthed the remains of a massive ancient fortification built around an Iron Age Assyrian harbor. The fortifications were constructed in the eighth century BCE and formed a crescent-shaped defense for a 17-acre inland region of the harbor.

At the heart of the well-preserved fortification is a mud-brick wall more than 12 feet wide and 15 feet high. The wall is covered in layers of mud and sand that stretch for hundreds of feet on either side.

This discovery comes at the end of the first excavation season at the Ashdod-Yam dig site, located just south of Tel Aviv. The project was led by Dr. Alexander Fantalkin of Tel Aviv University’s (TAU) Department of Archaeology and Ancient Near Eastern Cultures on behalf of the Sonia and Marco Nadler Institute of Archaeology.

“The fortifications appear to protect an artificial harbor,” says Fantalkin. “If so, this would be a discovery of international significance, the first known harbor of this kind in our corner of the Levant.”

At the time the massive fortifications were constructed, the southeastern Mediterranean basin, including regions of Africa and the Middle East, was under Assyrian rule. Assyrian inscriptions found at the site reveal that by the end of the century, Yamani, the rebel king of Ashdod, led a rebellion against Sargon II, the king of the Assyrian Empire. Yamani called upon Hezekiah, the king of Judah, to join the insurrection, which Hezekiah declined to do.

The Yamani rebellion was not taken well by the Assyrians, who eventually destroyed Philistine Ashdod, shifting power to the nearby area of Ashdod-Yam, where the TAU excavation occurred. The team believes the fortifications are linked to these events but is not yet sure exactly how: the walls were constructed either before, during or after the rebellion was put down. It is also not known whether the fortifications were initiated by the citizens of the region or by order of the Assyrians.

“An amazing amount of time and energy was invested in building the wall and glacis [embankments],” said Fantalkin.

The research team previously discovered more recent ruins – from the Hellenistic period (fourth-second centuries BCE) – atop the Iron Age fortifications. These buildings and walls were built long after the fortifications were abandoned and were most likely destroyed by an earthquake in the second half of the second century BCE. The researchers also uncovered ancient artifacts at the site, including coins and weights.

Using photogrammetry, the researchers created a three-dimensional reconstruction of the site, showing all the features of the excavation. Equipment for the reconstruction was provided by the University of Nebraska-Lincoln, and digital surveying was performed by Dr. Philip Sapirstein, a postdoctoral fellow at TAU.

This was the first highly detailed excavation at the site since a series of digs conducted in the mid-1960s by the late Israeli archaeologist Dr. Jacob Kaplan of the Tel Aviv-Jaffa Museum of Antiquities. Kaplan believed the Ashdod rebels built the fortifications in anticipation of an Assyrian attack; Fantalkin, however, believes the construction is too impressive to have been carried out under such circumstances.

Copper Found To Play A Role In Onset, Progression Of Alzheimer’s Disease

redOrbit Staff & Wire Reports – Your Universe Online

An element that is essential to all living organisms could also be one of the primary environmental factors that can lead to the onset of Alzheimer’s disease, according to new research appearing in Monday’s edition of the journal Proceedings of the National Academy of Sciences.

Copper, which is a key constituent of the respiratory enzyme complex cytochrome c oxidase, also appears to enhance the progression of the neurodegenerative condition, lead author Rashid Deane of the University of Rochester Medical Center (URMC) Department of Neurosurgery and his colleagues have discovered.

It does this by preventing the brain from clearing out toxic proteins, causing them to accumulate. When copper builds up in the brain, it can cause the blood brain barrier (which controls what enters and exits the brain) to break down. That causes amyloid beta, a toxic protein produced as a by-product of cellular activity, to accumulate.

“It is clear that, over time, copper’s cumulative effect is to impair the systems by which amyloid beta is removed from the brain,” Deane, who is also a member of the Center for Translational Neuromedicine (CTN), explained in a statement. “This impairment is one of the key factors that cause the protein to accumulate in the brain and form the plaques that are the hallmark of Alzheimer’s disease.”

Exposure to copper is all but unavoidable, the researchers point out. In addition to being found in foods such as red meats, shellfish, nuts and many different types of fruits and vegetables, it can also be found in drinking water carried by copper pipes and in nutritional supplements. The body uses it in nerve conduction, bone growth, connective tissue formation and hormone secretion, Deane’s team noted.

The researchers conducted a series of experiments using both mouse and human brain cells, through which they were able to identify the molecular mechanisms by which copper accelerates the pathology of Alzheimer’s disease. Typically, amyloid beta is cleared out of the brain with the help of a protein known as lipoprotein receptor-related protein 1 (LRP1).

Those proteins bind with the amyloid beta found in brain tissue and lead it into the blood vessels, where it is removed from the brain. During their experiments, however, the research team exposed otherwise normal mice to copper over a period of three months, placing trace amounts of the metal in their drinking water at a concentration one-tenth of the EPA-approved water quality standard for copper.

“These are very low levels of copper, equivalent to what people would consume in a normal diet,” said Deane. Even so, he and his associates found that the copper made its way into the rodents’ circulatory system and accumulated in the blood vessels (specifically, the cellular capillary walls) that carry blood into the brain.

These cells are an essential part of the brain’s defense system, and in this instance the capillaries helped keep the copper from entering the brain. Over time, however, the metal can accumulate. Deane’s team observed that the element disrupted the function of LRP1 through oxidation, which ultimately prevented amyloid beta from being removed. The phenomenon was detected in both mouse and human brain cells.

The study authors then looked at the impact of copper exposure on mouse models of Alzheimer’s disease. They discovered the blood brain barrier cells in those mice had broken down and become “leaky,” which they attribute to a combination of aging and the cumulative impact of copper being allowed to pass through to the brain.

“They observed that the copper stimulated activity in neurons that increased the production of amyloid beta. The copper also interacted with amyloid beta in a manner that caused the proteins to bind together in larger complexes creating logjams of the protein that the brain’s waste disposal system cannot clear,” URMC explained.

“This one-two punch, inhibiting the clearance and stimulating the production of amyloid beta, provides strong evidence that copper is a key player in Alzheimer’s disease,” the release added. “In addition, the researchers observed that copper provoked inflammation of brain tissue, which may further promote the breakdown of the blood brain barrier and the accumulation of Alzheimer’s-related toxins.”

Deane advises approaching the study results with caution, since copper is essential to numerous biological functions. He says it will be important for people to make sure they consume enough of the mineral without being exposed to too much of it. While his team cannot say for sure what the correct level is, he said diet would likely play a key role in regulating this process.

Neanderthal Tools Reveal ‘Cultural’ Differences

Brett Smith for redOrbit.com – Your Universe Online

A new analysis from an archeologist at the University of Southampton in the United Kingdom has revealed distinct cultural differences between two groups of Neanderthals based on the divergent design of stone tools between 115,000 and 35,000 years ago. According to a study by researcher Karen Ruebens, the differences point to a more complex Neanderthal culture than what was previously suspected.

“In Germany and France there appears to be two separate handaxe traditions, with clear boundaries, indicating completely separate, independent developments,” Ruebens said. “The transition zone in Belgium and Northern France indicates contact between the different groups of Neanderthals, which is generally difficult to identify but has been much talked about, especially in relation to later contacts with groups of modern humans.”

“This area can be seen as a melting pot of ideas where mobile groups of Neanderthals, both from the eastern and western tradition, would pass by, influencing each other’s designs and leaving behind a more varied record of bifacial tools,” she added.

The study was published in the Journal of Human Evolution and describes how Neanderthals in the western region of northern Europe fashioned symmetrical, triangular and heart-shaped handaxes. In contrast, Neanderthals in the eastern region tended to make asymmetrical, bifacial knives.

“Distinct ways of making a handaxe were passed on from generation to generation and for long enough to become visible in the archaeological record,” Ruebens said. “This indicates a strong mechanism of social learning within these two groups and says something about the stability and connectivity of the Neanderthal populations.”

“Making stone tools was not merely an opportunistic task,” she added. “A lot of time, effort and tradition were invested and these tools carry a certain amount of socio-cultural information, which does not contribute directly to their function.”

The study also indicated that available raw materials, location, and tool reuse did not influence handaxe design, strengthening the case that these designs were culturally driven.

“Principally, this study presents an archaeological contribution to behavioral concepts such as regionality, culture, social transmission and population dynamics,” Ruebens wrote in her report. “It illustrates the interpretive potential of large-scale lithic studies, and more specifically the presence of regionalized cultural behavior amongst late Neanderthal groups in Western Europe.”

A study published last week revealed another feature of prehistoric European Neanderthals: they used bone tools, similar to ones still in use today, to refine leather. Known as lissoirs, the tools are used to smooth out animal hides and make them water-resistant.

Found at sites in southwestern France, the tools were dated to between 51,000 and 41,000 years ago. To confirm speculation that the first bone fragment was indeed a leatherworking tool, researchers reached out to luxury-goods manufacturer Hermès in Paris.

“(We) showed them a picture, and they recognized it instantly,” study co-author Shannon McPherron, an archeologist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, told Nature Magazine.

Despite the encouraging confirmation, McPherron said his team did not immediately jump to any conclusions.

“You find one, and there’s always some doubt,” he said. “You’re worried that it’s not a pattern — that it’s anecdotal behavior.”

However, the team was able to back up their initial find with similar discoveries at other sites in the region, leading them to believe that the lissoirs were commonly used tools among Neanderthals.

Effects Of Bullying Follow Victims Into Adulthood

Lee Rannals for redOrbit.com – Your Universe Online

A study found that adults who were bullied as children were more likely to face serious illness, struggle to hold down a regular job and have poor social relationships. The results highlight the effects that childhood bullying has on individuals throughout their lives. The researchers also looked into a variety of factors that go beyond health-related outcomes.

“We cannot continue to dismiss bullying as a harmless, almost inevitable, part of growing up,” psychological scientist Dieter Wolke of the University of Warwick said in a statement. “We need to change this mindset and acknowledge this as a serious problem for both the individual and the country as a whole; the effects are long-lasting and significant.”

Wolke and his colleagues investigated the impact of bullying on all those affected by it, including 1,420 participants who were either the victims or the bullies themselves. The “bully-victims” – those who were both bullied and bullied others – were at greatest risk for health problems in adulthood, being over six times more likely to be diagnosed with a serious illness, smoke regularly, or develop a psychiatric disorder.

The team also found that bully-victims often end up turning to bullying themselves because they may lack the emotional control or support required to cope with the psychological harm caused by their own experiences of being bullied.

“In the case of bully-victims, it shows how bullying can spread when left untreated,” Wolke said in the release. “Some interventions are already available in schools but new tools are needed to help health professionals to identify, monitor, and deal with the ill-effects of bullying. The challenge we face now is committing the time and resources to these interventions to try and put an end to bullying.”

Both bullies and bully-victims were more than twice as likely to have difficulties keeping a job compared to those not involved in bullying. The study, however, revealed few ill effects of being the bully, finding that the act of bullying itself didn’t seem to have a negative impact in adulthood.

“Bullies appear to be children with a prevailing antisocial tendency who know how to get under the skin of others, with bully-victims taking the role of their helpers,” Wolke explained in the press release. “It is important to find ways of removing the need for these children to bully others and, in doing so, protect the many children suffering at the hand of bullies — they are the ones who are hindered later in life.”

All of the groups studied showed signs of difficulty forming social relationships, particularly when it came to maintaining long-term friendships or good ties with parents in adulthood.

Back in June, researchers reported in the journal Human Performance that bullying doesn’t just stop in childhood but can carry on into the workplace in adulthood. The team said that unattractive people were more likely to be bullied in the workplace than their better-looking counterparts. Officials at the Workplace Bullying Institute say that about 35 percent of the US workforce admits to having been bullied at work.

Mini Quadcopter Uses Smartphone To Fly On Its Own

Lee Rannals for redOrbit.com – Your Universe Online

Vienna University of Technology (TU Vienna) researchers have developed a quadcopter – a helicopter propelled by four rotors – that can maneuver its way around a room completely on its own using only a smartphone.

The Virtual-Reality-Team at TU Vienna created a quadcopter that does not need a human interface to navigate through a room, and its on-board computer power is a standard smartphone. The quadcopter utilizes the smartphone’s camera to provide visual data and the device’s processor as a control center.

“Proceeding towards robotics and mounting a camera onto a quadcopter was just the logical next step for us,” said Hannes Kaufmann of the Faculty of Informatics at TU Vienna, in a statement.

In order to test the quadcopter’s navigational capabilities, the team attached visual codes to the floor to help it create a virtual map of the environment. The quadcopter was able to pick up the codes, obtain information, navigate the environment and head for a specific known location or go to an unexplored region.

“In the future, the quadcopter should also be able to do without these codes. Instead, we want it to use naturally occurring reference points, which can be obtained from the camera data and also from depth sensors such as the MS Kinect,” says Annette Mossel, chief engineer of the quadcopter project.

The team envisions a variety of uses for their quadcopter. Today, companies like Parrot have built drones like this that can be piloted through a smartphone application, but these quadcopters work more like toys than tools. TU Vienna scientists believe their autonomous quadcopter could be used in emergency situations, such as firemen sending it into a burning building to transmit a 3D picture from inside before they enter.

If the team can develop miniature quadcopters, the devices could be used to help guide people to the right place within large buildings. Less-developed regions of the world could use them to help monitor illegal forest clearance or poaching activity without having to use expensive full-size helicopters.

The team said they were able to build the quadcopters for less than a thousand Euros, or about $1,300 in parts.

In June, researchers at the University of Minnesota reported that they created a quadcopter that can be controlled using brainwaves. The team used an electroencephalography (EEG) interface system so participants could control a flying four-blade robot accurately.

Cyclospora Infections Continue To Climb Despite Halt On Salad Mix Production

Lawrence LeBlond for redOrbit.com – Your Universe Online

US health officials have continued their investigation into the root cause of a Cyclospora outbreak that has so far infected more than 575 people across 19 US states. However, while infections from two states have been traced back to a salad packer in Mexico, more people are continuing to get sick in other states around the country.

According to an August 15 update by the US Centers for Disease Control and Prevention (CDC), cyclosporiasis, an infection caused by the single-celled parasite Cyclospora cayetanensis, has infected 576 people in 19 states. In Iowa and Nebraska, where prepackaged salad mix has been confirmed as a source of infection, no new illnesses had been reported since the beginning of July.

The Food and Drug Administration (FDA) confirmed last week that salad mix packaged by Taylor Farms de Mexico was the source of the infections reported in both Iowa and Nebraska. Both state health departments have since reported that all contaminated product is out of the consumer supply chain.

The FDA has continued to investigate illness outbreaks in other states and has yet to find a reliable source of infection in those outbreaks.

Of all reported cases of infection, only 9 percent (36 individuals) have so far been hospitalized as a result of this stomach bug, which causes flu-like symptoms including nausea, diarrhea, vomiting, body aches and fever.

According to CDC data, the following states have been affected with cyclosporiasis: Arkansas (10 cases), California (1), Connecticut (1), Florida (29), Georgia (4), Illinois (11), Iowa (153), Kansas (4), Louisiana (3), Minnesota (2), Missouri (4), Nebraska (86), New Hampshire (1), New Jersey (2), New York (7), Ohio (2), Texas (240), Virginia (2) and Wisconsin (14).
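The state-by-state counts above can be tallied directly to confirm the reported national total:

```python
# State-by-state cyclosporiasis counts as quoted from the CDC's August 15 update.
cases = {
    "Arkansas": 10, "California": 1, "Connecticut": 1, "Florida": 29,
    "Georgia": 4, "Illinois": 11, "Iowa": 153, "Kansas": 4,
    "Louisiana": 3, "Minnesota": 2, "Missouri": 4, "Nebraska": 86,
    "New Hampshire": 1, "New Jersey": 2, "New York": 7, "Ohio": 2,
    "Texas": 240, "Virginia": 2, "Wisconsin": 14,
}

total = sum(cases.values())
print(total)  # 576, matching the CDC's reported total
```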

According to FoodSafetyNews.com, case counts had increased an additional 20 illnesses by Saturday, with two new reported cases surfacing in Iowa. It is still too early to tell if the new Iowa cases are linked to salad mix or perhaps a similar source that has infected people in other states. Texas has climbed from 240 cases to 258 as of Saturday, according to the news site, citing information taken from both the CDC and state health departments.

Previous outbreaks of Cyclospora have typically been caused by contaminated produce. While the parasite can make people quite ill, it is usually not life-threatening.

“On the infectious disease scale, this ranks well below the more notorious and dangerous ailments like E. coli and salmonella,” Dr. Lewis Marshall Jr., chairman of the outpatient services at Brookdale University Hospital and Medical Center in New York City, said in an interview with HealthDay reporter Dennis Thompson.

“It is unlikely to be fatal, but certainly can make one’s life miserable,” he added. “Symptoms include crampy abdominal pain, watery diarrhea, loss of appetite, bloating, nausea, fatigue, fever, headache and body aches.”

It generally takes about a week after exposure for infected people to become ill; most then experience diarrhea that can last for days. Most healthy people can fight off the infection easily, but some cases can linger for weeks or months without medical intervention.

Marshall noted there could be more cases out there than what is being reported. It is very possible “that most occurrences go unreported, as many people wouldn’t recognize the symptoms as any different than a common stomach bug,” he said.

Dr. Thomas Frieden, director of the CDC, urges anyone who has suffered from diarrhea for more than a few days to be tested for cyclosporiasis.

With most Cyclospora illnesses stemming from produce, there are ways to avoid becoming infected altogether. Washing fruits and vegetables before consumption will help, but people should also wash their hands thoroughly before and after handling food to further reduce the likelihood of infection.

The rule of washing your fruits and vegetables also applies to prepackaged salad mixes, argues Dr. Salvatore Pardo, vice chairman of the emergency department at Long Island Jewish Medical Center in New Hyde Park, New York.

“My hunch is the public does not [wash their] ‘prepackaged’ salad, which is normally purchased for convenience and dumped into the bowl since it tends to be free from particles — dirt, sand, critters — one would normally find in locally picked ingredients,” Pardo told HealthDay.

Nearby Hot Jupiter Analyzed As Part Of Ongoing Exoplanet Search

[ Watch the Video: The Strange Attraction of Hot Jupiters ]

redOrbit Staff & Wire Reports – Your Universe Online

The number of exoplanets discovered by astronomers since the beginning of the Space Age some five decades ago suggests there are possibly over 100 billion such worlds outside our solar system, according to one Caltech astronomer involved with NASA’s Kepler mission.

According to John Johnson, an assistant professor of astronomy at the Pasadena-based university, over the past 50 years the number of known planets beyond our solar system has increased from zero to more than 850.

As if that weren’t enough, there are also thousands of additional exoplanet candidates currently awaiting official confirmation, the US space agency said. The rate at which ground-based telescopes and orbiting observatories such as Kepler have located these extrasolar planets suggests “there are at least 100 billion planets in our galaxy,” Johnson said. “That’s mind-boggling.”

Interestingly, in the earliest days of the search for exoplanets, astronomers focused on finding Earth-like worlds capable of supporting life in distant solar systems. However, NASA said planets as small as our own orbiting stars hundreds of light-years away have proven difficult to find.

The “real haul,” according to the space agency, has been gas giants – especially the massive exoplanets known as “hot Jupiters.” These worlds share many of the characteristics of our solar system’s largest planet, except that they have much higher temperatures due to the close proximity of their orbits to their stars – between approximately 0.015 and 0.5 astronomical units. By comparison, Jupiter orbits the Sun at 5.2 astronomical units.
Since they orbit so close to their parent stars, hot Jupiters tend to block a portion of their star’s light when they transit in front. These events are known as “mini-eclipses” and have resulted in hundreds of discoveries. While interest in these types of exoplanets was initially low, they have begun to attract some attention as of late, NASA explained.
To illustrate the point, officials from the aeronautics administration refer to “HD 189733b,” a planet discovered by researchers working out of the Haute-Provence Observatory in France in 2005. This planet is just 63 light years away and manages to block three percent of the light produced by its orange-dwarf parent star, and NASA reports astronomers have been able to learn a great deal by analyzing it.
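That 3 percent figure hints at the planet's size. For a planet crossing a uniform stellar disk, the fraction of starlight blocked is roughly the square of the planet-to-star radius ratio; the sketch below (a simplification that ignores limb darkening, not something taken from the article) inverts that relation:

```python
import math

def radius_ratio(transit_depth):
    """Planet-to-star radius ratio implied by a transit depth,
    assuming the planet crosses a uniformly bright stellar disk."""
    return math.sqrt(transit_depth)

# HD 189733b dims its star by about 3 percent during transit.
ratio = radius_ratio(0.03)
print(f"R_planet / R_star = {ratio:.2f}")
```

A ratio near 0.17 is noticeably larger than Jupiter's ratio relative to the Sun (about 0.10), consistent with a gas giant transiting a smaller orange-dwarf star.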
“For one thing, it’s blue,” the agency said. “Data obtained by the Hubble Space Telescope suggest that, seen from a distance, the azure disk of HD 189733b would look to the human eye much like Earth. Indeed, some members of the media have taken to calling it ‘the other blue planet.’ It is, however, anything but Earthlike.
“In 2007, Heather Knutson of Caltech made a global temperature map of HD189733b using NASA’s infrared Spitzer Space Telescope. She knew it would be hot because HD189733b orbits its star 13 times closer than Mercury,” added NASA. “Temperatures ranged from 1200 F on the nightside to 1700 F on the dayside. Thermal gradients drive winds as fast as 6000 mph, carrying suffocating heat around the globe.”
Experts believe the planet’s blue color could be caused by the presence of silicate particles in its atmosphere. Those particles would scatter blue wavelengths of light from its parent star, and since silicates are a component of glass, some researchers speculate it could be raining molten glass on HD189733b.
NASA and ESA X-ray observatories recently observed the planet transit its star and detected a drop in X-rays three times deeper than the corresponding decrease in optical light. That would indicate the planet’s outer atmosphere is larger than anyone expected. It could actually be boiling away, according to researchers who have estimated HD189733b is losing 100 million to 600 million kilograms of mass per second.
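Those loss rates sound dramatic, but a rough back-of-envelope check (assuming the planet has about one Jupiter mass, a figure the article does not give) shows the planet is in no immediate danger:

```python
JUPITER_MASS_KG = 1.9e27      # assumed mass for HD 189733b
SECONDS_PER_YEAR = 3.15e7

def years_to_evaporate(loss_rate_kg_per_s):
    """Time to shed one Jupiter mass at a constant loss rate."""
    return JUPITER_MASS_KG / loss_rate_kg_per_s / SECONDS_PER_YEAR

# The researchers' estimated range: 100 to 600 million kg per second.
for rate in (1e8, 6e8):
    print(f"{rate:.0e} kg/s -> ~{years_to_evaporate(rate):.1e} years")
```

Even at the upper rate the timescale works out to roughly a hundred billion years, so the measured outflow matters more as a probe of the planet's inflated outer atmosphere than as a countdown to its destruction.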
Last Thursday, NASA announced it was scrapping attempts to fully restore Kepler, which had two of its four gyroscope-like reaction wheels fail over the past 13 months. The wheels are used to precisely direct the spacecraft and engineers had been hoping to restore at least one of them to working condition.
Those efforts have proven unsuccessful, and now the agency is working on potential new missions Kepler would be able to complete in its current state. In the meantime, Kepler has been returned to its point rest state, a stable configuration in which the telescope uses its thrusters to control its pointing with minimal fuel use.

Researchers Successfully Sneak Malware Through Apple’s App Review Process

redOrbit Staff & Wire Reports – Your Universe Online

iPad and iPhone users, beware: Apple’s mobile app review procedures might not be as good at detecting potentially malicious software as you might think, according to a group of Georgia Tech computer security researchers.

In a paper presented last week at the Usenix Security ’13 conference, authors Tielei Wang, Kangjie Lu, Long Lu, Simon Chung, and Wenke Lee detail how they were able to sneak malware through the Cupertino, California tech giant’s screening process.

According to David Talbot of MIT Technology Review, the app containing the malicious software claimed to offer news from Georgia Tech. In reality, though, it contained dormant pieces of code that would later combine to form malware capable of posting tweets, sending texts or emails, taking pictures, stealing personal information and attacking other programs without the device owner’s knowledge.

The study authors dub these types of malware programs “Jekyll apps” because they keep their true nature hidden until they are installed on a person’s phone or tablet. Once they are on a user’s end device, they rearrange signed code to execute malicious control flows that were nonexistent during the actual software review process.

“We implemented a proof-of-concept Jekyll app and successfully published it in App Store. We remotely launched the attacks on a controlled group of devices that installed the app,” they wrote in their paper, Jekyll on iOS: When Benign Apps Become Evil. “The result shows that, despite running inside the iOS sandbox, Jekyll app can successfully perform many malicious tasks, such as stealthily posting tweets, taking photos, stealing device identity information, sending email and SMS, attacking other apps, and even exploiting kernel vulnerabilities.”

Furthermore, the Georgia Tech researchers included code on their Jekyll app that allowed them to monitor Apple’s review process. They discovered that the app had only been tested for “a few seconds” before it was allowed to go live on the iOS App Store. Lu told AppleInsider that the program was only accessible for a few minutes, and was not installed by any consumers before it was removed from the App Store as a safety precaution.

“The message we want to deliver is that right now, the Apple review process is mostly doing a static analysis of the app, which we say is not sufficient because dynamically generated logic cannot be very easily seen,” Lu explained.

“Apple takes justifiable pride in its iOS security regime. Though the company’s scrutiny of third-party apps often forces developers to do extra work to satisfy its rules, its oversight has kept malware at bay more effectively than the efforts by the company’s competitors,” added Information Week Editor-at-Large Thomas Claburn. “Nonetheless, iOS, like any operating system, has flaws that can be identified and exploited. While Apple tends to address such flaws quickly once it becomes aware of them, it can’t fix problems that it can’t identify.”

Apple spokesman Tom Neumayr told AppleInsider that his company has reviewed the research, and that they had already updated their mobile operating system to address the issues raised by the Georgia Tech researchers. He did not provide specific information about what changes were made, nor did he address the App Store review process itself, the website staff added.

Flavonoids In Celery And Artichokes Kill Cancer Cells

Brett Smith for redOrbit.com – Your Universe Online

Vegetables like celery and artichokes have long been thought to convey numerous health benefits, and a new study from scientists at the University of Illinois at Urbana-Champaign has revealed that these foods contain two flavonoids, apigenin and luteolin, that are capable of killing off cancer cells.

“Apigenin alone induced cell death in two aggressive human pancreatic cancer cell lines. But we received the best results when we pre-treated cancer cells with apigenin for 24 hours, then applied the chemotherapeutic drug gemcitabine for 36 hours,” said study author Elvira de Mejia, a U of I food chemistry and toxicology professor.

The researchers found that using the two flavonoids as a pretreatment was more effective than applying them and the chemotherapeutic drug simultaneously. In fact, administering both at the same time produced a highly undesirable effect.

“Even though the topic is still controversial, our study indicated that taking antioxidant supplements on the same day as chemotherapeutic drugs may negate the effect of those drugs,” said Jodee Johnson, a U of I researcher who worked on the study as a doctoral student in de Mejia’s lab.

“That happens because flavonoids can act as antioxidants,” she added. “One of the ways that chemotherapeutic drugs kill cells is based on their pro-oxidant activity, meaning that flavonoids and chemotherapeutic drugs may compete with each other when they’re introduced at the same time.”

In the study, which was published in the journal Molecular Nutrition and Food Research, the Illinois researchers found that apigenin blocked an enzyme called glycogen synthase kinase-3β (GSK-3β), which led to a drop in the production of anti-apoptotic genes in the pancreatic cancer cells. With fewer of these protective gene products, a cancer cell whose DNA has been damaged becomes more likely to self-destruct.

In one cell line, the percentage of self-destructing cancer cells went from 8.4 percent in cells that had not been dosed with apigenin to almost 44 percent in cells that had been administered a 50-micromolar dose of the flavonoid. Chemotherapy drugs had not been used on either group of cells.

The researchers also found that apigenin modified gene expression.

“Certain genes associated with pro-inflammatory cytokines were highly upregulated,” de Mejia said.

While pancreatic cancer patients would probably not be able to eat enough celery or artichokes to boost flavonoids in the blood to an effective level, drugs or supplements could be used to achieve the desired concentrations, according to de Mejia.

The Illinois food scientist added that everyone should consider adding foods high in flavonoids to their regular diet.

“If you eat a lot of fruits and vegetables throughout your life, you’ll have chronic exposure to these bioactive flavonoids, which would certainly help to reduce the risk of cancer,” she noted.

Researchers have been looking into the anti-cancer properties of flavonoids for years. A study published in 2008 by UCLA researchers found that smokers who ate foods rich in certain flavonoids, including strawberries, Brussels sprouts and apples, may reduce their lung cancer risk.

Soda Might Lead To Aggressive Behavior In Children

Michael Harper for redOrbit.com – Your Universe Online

Much to NYC Mayor Mike Bloomberg’s chagrin, soft drinks are a very popular beverage in the United States, enjoyed by men, women and children of every age.

According to a study from Columbia University’s Mailman School of Public Health and others, Americans buy more soda per capita than any other country. The same study also goes on to warn Americans that soft drinks might be associated with behavioral problems in younger children, though they’re not sure how or how closely related the two may be.

A similar correlation between soda and behavioral problems in adolescents has been observed, but according to Shakira Suglia, ScD, and colleagues from the University of Vermont and Harvard School of Public Health, the same relationship had yet to be studied in young children. This new research is scheduled to be published in an upcoming issue of The Journal of Pediatrics, despite its inconclusive results.

The researchers analyzed data from an ongoing study to find a link between consumption of sugary soda drinks and aggression and withdrawal. Nearly 3,000 five-year-old children are enrolled in this study, which follows mostly single African American and Latino mothers and their children from 20 large US cities. The mother-child pairs in question were first interviewed as a part of the broader study between 1998 and 2000 and will be periodically interviewed going forward. The mothers were asked about their child’s behavior over the previous two months, including periods of aggression or withdrawal. The mothers were also asked questions about their child’s habits, such as how much soda they enjoyed or how many hours of TV they watched a day.

Shakira Suglia and her colleagues pulled the data about aggressive behavior and soft drink consumption from the rest of the pile and found that many children, about 43 percent, had at least one serving of soda a day. A much smaller number of children, about four percent, had more than four servings of soda daily.

The larger study from which Suglia and her team culled their data measured aggression on a 0-100 scale; the higher the number, the more aggressive tendencies the child exhibited. Suglia says the study counted behavior such as a child destroying his or her own belongings or the belongings of others as aggressive. The average score of the nearly 3,000 children participating in the study was below 50. When a score reaches 65 or higher, doctors normally suggest the child be evaluated for a problem.

Suglia’s team found that kids who drank no soda each day typically scored a 56 on the aggression scale. They became interested, however, in the way the numbers changed with every soda consumed each day. Kids who had one soda a day averaged a 57 on the aggression scale; two sodas a day bumped the average to 58 and three servings nudged the average to 59. Four or more sodas translated to an average aggression rating of 62.

Even after taking other factors into account — such as the mothers’ race, how much TV the children watched and their other dietary habits — the link between aggression and soda consumption remained.

Suglia is quick to point out, of course, that this doesn’t act as conclusive proof that drinking soda directly causes aggressive behavior.

“It’s a little hard to interpret it. It’s not quite clinically significant,” said Suglia in an interview with Reuters.

Though they found what may be a link between the two, the researchers also say they’re not sure what’s causing the link. Two main ingredients of soda — corn syrup and caffeine — may be responsible for this behavior.

How Cosmic Turbulence Creates Stars And Black Holes

John P. Millis, PhD for redOrbit.com – Your Universe Online

The mechanism by which stars and black holes are formed in extreme cases of high mass density has puzzled astronomers since Johannes Kepler first laid out his laws of planetary motion some 400 years ago. In the most basic sense, a large cloud of gas condenses into a rotating disk. Mass is then directed towards the center of the cloud where, due to the increasing density, a stellar body is eventually formed.

One challenge to this picture of star formation is that a mechanism is required to funnel matter inwards, since these disks are otherwise stable. “These accretion discs are extremely stable from a hydrodynamic perspective as according to Kepler’s laws of planetary motion angular momentum increases from the center towards the periphery,” explains Helmholtz-Zentrum Dresden-Rossendorf (HZDR) physicist Dr. Frank Stefani.

“In order to explain the growth rates of stars and black holes, there has to exist a mechanism, which acts to destabilize the rotating disc and which at the same time ensures mass is transported towards the center and angular momentum towards the periphery.”
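The stability Stefani describes is a standard textbook result, sketched here for context (the derivation itself is not in the article): Kepler's third law fixes the orbital speed, and the resulting specific angular momentum grows outward, satisfying Rayleigh's hydrodynamic stability criterion.

```latex
% Keplerian orbital speed and specific angular momentum at radius r
v(r) = \sqrt{\frac{GM}{r}}, \qquad
\ell(r) = r\,v(r) = \sqrt{GM\,r}
% Rayleigh's criterion: the flow is hydrodynamically stable when
\frac{d\,\ell^2}{dr} = GM > 0
```

Because the specific angular momentum increases monotonically with radius, purely hydrodynamic perturbations cannot tap the disk's rotational energy, which is why a magnetic instability such as the MRI is needed to drive turbulence and transport angular momentum outward.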

For more than half a century, astronomers have suspected that magnetic fields figure prominently into this picture, supplying the turbulences that would be required to destabilize the flow and allow matter to flow to the center. Yet it took about 32 years for the magnetorotational instability (MRI) theory to be fully modeled in these systems.

Even so, such models predict the rotating gas must have at least a minimum level of electrical conductivity, lest “dead zones” form within the disk. Previous work had indicated such dead zones would quell turbulent motions, slowing or stopping accretion of matter at the disk’s center.

To overcome the dead zones problem, computational models indicate a vertically oriented magnetic field would need to be unusually strong, with rotational speeds in the disk being very high. By adding circular magnetic fields to the vertical component, much lower field strengths are needed. Yet, this too has problems. In this situation, the geometry of the fields would push more to the edges of the disk, contrary to the Keplerian view.

A new study led by Stefani and his colleague Oleg Kirillov has shown that if the magnetic field arises at least partially from within the accretion disk, rather than being imposed externally, the MRI will produce the desired Keplerian rotation profile in the disk.

“This is, in fact, a much more realistic scenario,” notes Stefani. “In the extreme case that there does not exist a vertical field, we’re looking at a problem of what came first – the chicken or the egg. A circular magnetic field acts to destabilize the disc and the resulting turbulence generates components of vertical magnetic fields. They in turn reproduce the circular magnetic field because of the special form of the disc’s rotational movement.”

Additional experimental follow-up will be required to verify the findings, but this new magnetic field model within the disk appears to close the gap between theory and observation.

Stefani and Kirillov presented the results of their study in a paper titled “Extending the Range of Inductionless Magnetorotational Instability” published in the latest edition of the journal Physical Review Letters.

Researchers Link Two Genes To Chronic Mountain Sickness

redOrbit Staff & Wire Reports – Your Universe Online

An international team of researchers has discovered the genetic basis of chronic mountain sickness (CMS), a condition developed during prolonged exposure to high altitudes, according to research appearing in the August 15 edition of the American Journal of Human Genetics.

The condition, which is also known as “Monge’s disease,” is the result of low oxygen levels and can cause severe damage to the body, including heart attacks, stroke and early-onset pulmonary issues.

Now, researchers have studied the DNA of Andean residents in an attempt to discover the genetic mutations associated with CMS. For the study, a total of 20 Andean individuals had their DNA sequenced; 10 of them had CMS and 10 were control subjects who did not suffer from the condition.

Following a high-efficiency survey of the entire spectrum of genetic variations, the investigators discovered 11 regions which demonstrated significant differences in haplotype frequencies consistent with selective sweeps. In those regions, the researchers located two genes – ANP32D and SENP1 – that were found to have significantly increased expression in CMS individuals when compared to their non-CMS counterparts.

Those genes were found to play an essential role in tolerance to hypoxia, a pathological condition in which the entire body or a specific part of it is deprived of adequate oxygen supply. The information will also help experts better understand the mechanisms behind human adaptation to hypoxia, the researchers said.

“They speculated that the increased expression of SENP1 may play a role in the basic pathogenesis of polycythemia in CMS individuals,” the Beijing Genomics Institute (BGI), which was one of the institutions involved in the research, said in a statement. “ANP32D acts as an oncogene, which may alter cellular metabolism in a fashion that is similar to that of cancer cells, especially given that such cells can flourish in low oxygen conditions.”

“We showed that the genes that were identified by the whole-genome scan were actually linked causally to sickness in low-oxygen environments,” added co-senior author Dr. Gabriel Haddad of the University of California, San Diego. “With further study, the two genes we identified and validated may become potential drug targets for treating conditions related to low oxygen levels, such as strokes and heart attacks. In addition, they may also be considered as targets for a potential drug treatment for chronic mountain sickness.”

Going forward, the authors said they will look to complete whole genome sequencing for the nearly 100 remaining patient samples in the hopes they can locate biomarkers that can be used to predict CMS. In addition, they retrieved skin samples from the study participants. Those cells could be reprogrammed into induced pluripotent stem cells that could ultimately be used to test resilience to low oxygen levels.

ER Injuries Most Often Related To Malt Liquor And Strong Beer

Brett Smith for redOrbit.com – Your Universe Online

A preliminary study at a Baltimore hospital found that drinkers of hard liquors and high-alcohol beers like Steel Reserve were more prevalent in the emergency room than drinkers of regular beers.

“Recent studies reveal that nearly a third of injury visits to Level I trauma centers were alcohol-related and frequently a result of heavy drinking,” said lead study author David Jernigan, a director of the Center on Alcohol Marketing and Youth (CAMY) at Johns Hopkins University.

“Understanding the relationship between alcohol brands and their connection to injury may help guide policy makers in considering taxation and physical availability of different types of alcohol given the harms associated with them.”

The study, published in the journal Substance Use and Misuse, was the first to examine how alcohol relates to serious injury by brand, the researchers said.

The team collected their information at the Johns Hopkins Hospital Emergency Department in East Baltimore on weekend nights between April 2010 and June 2011. Of the more than 100 respondents who said they had been drinking alcohol before their injury, 69 percent were male and 69 percent were African American, which reflected the demographics of the hospital’s surrounding neighborhood.

Participants also told researchers about their alcohol consumption by brand. After comparing their results to market data from research company Impact Databank, the researchers found that the proportion of distilled spirits consumed by participants was higher than the market share for distilled spirits. Vodka, gin and cognac were more prevalent in the emergency room than their national market share. The same was true for ‘ready-to-drink’ beverages. Female participants were more likely to report consuming higher amounts of these pre-packaged cocktails.

While hard liquor was over-represented in the ER sample, beer was underrepresented compared to the national market share for the popular beverage. However, male participants were more likely to report consuming large quantities of beer or malt liquors. Almost half of participants said they drank Steel Reserve, Colt 45, Bud Ice and King Cobra before their injury. These four beverages make up only 2.4 percent of beer consumption in the United States.

The researchers noted that these are only preliminary findings from a small sample set in an urban hospital. They said they plan to pursue this research in a larger sample across emergency rooms in multiple cities and hospitals.

They also suggested policy implications that could include requirements for clear labeling of alcohol content on certain beverage containers, limits on availability and graduated taxation of alcohol based on alcohol content to cut consumption of higher-alcohol products.

One type of drink tracked in the study, ready-to-drink beverages, is rapidly growing in popularity. According to data released last year by research firm Technomic’s Adult Beverage Resource Group, Skinny Girl bottled cocktails are the fastest-growing spirits brand in the US. Growing almost 390 percent from 2010 to 2011, the brand was owned by reality television star Bethenny Frankel until 2011, when she sold it to Beam Global for a reported $120 million. The company released four new products earlier this year: Moscato, White Cherry Vodka, Mojito and Grapefruit Margarita.

Scientists Get A Closer Look At What Makes Typhoid Fever Tick

Brett Smith for redOrbit.com – Your Universe Online

In the early 20th century, Typhoid Mary became one of the iconic symbols of disease outbreak when she infected about 50 people with typhoid fever while working as a cook in and around New York City.

Scientists have long wondered how a working-class Irish immigrant who appeared so healthy could have carried the Salmonella bacterium responsible for the disease, S. typhi, for so long without any symptoms.

According to a new study published in the journal Cell Host & Microbe, one reason the bacterium is able to lie dormant is that it manipulates the immune system’s macrophages to suit its own needs and enable its proliferation.

“Between 1 and 6 percent of people infected with S. typhi, the salmonella strain that causes typhoid fever, become chronic, asymptomatic carriers,” said study author Denise Monack, an associate professor of immunology and microbiology at Stanford University. “That is a huge threat to public health.”

In the study, Stanford and University of California, San Francisco researchers developed a mouse model of lasting salmonella infection using S. typhimurium, a related strain that can cause an infection lasting as long as two years — the average mouse lifespan.

Previous research by the same team showed that salmonella bacteria are capable of inhabiting macrophages, the immune system cells that typically engulf and destroy invading pathogens. These aggressive defenders will take on a different posture depending on their environment.

“Early in the course of an infection,” Monack said, “inflammatory substances secreted by other immune cells stir macrophages into an antimicrobial frenzy. If you’re not a good pathogen, you’ll be wiped out after several days of causing symptoms.”

However, after the initial immune response, the immune system changes gears and releases anti-inflammatory factors, since the body cannot tolerate too much inflammation. These factors cause macrophages to enter a more docile phase; the less aggressive macrophages assist in wound healing and other functions instead of engulfing microbes.

The more subdued macrophages are also more hospitable to invading salmonella that may have survived the initial assault, the new study found. The researchers came to this conclusion by checking gut lymph nodes and spleens of their infected mice for the presence of pro- and anti-inflammatory substances. They also recorded the ratio of inflammatory versus anti-inflammatory macrophages.

The researchers found that the ratio of anti-inflammatory substances increased over the course of the infection, corresponding to a predominance of anti-inflammatory macrophages in the gut lymph nodes and spleen. During these later stages of infection, the scientists found that S. typhimurium preferred to inhabit anti-inflammatory macrophages, in which it was better able to replicate.

Researchers also found that salmonella required intracellular factors called peroxisome proliferator-activated receptors, or PPARs. While S. typhimurium initially invaded the mice’s spleens and gut lymph nodes regardless of PPAR-delta status, six weeks later the bacteria were undetectable in mice whose ability to produce PPAR-delta had been artificially disabled. However, the bacteria were still prevalent in the tissues of PPAR-delta-producing mice.

“Salmonella is doing something to activate PPAR-delta,” Monack said. “We suspect it’s releasing some as-yet-unknown PPAR-delta-stimulating virulence factor into the macrophages it infects. If we can figure out what that is, it could lead to some great anti-salmonella therapeutics with relatively fewer side effects.”

Plants Pushed Up The Mountain By Warming Climate

University of Arizona

In a rare opportunity to directly compare today’s plant communities with a survey taken in the same area 50 years ago, a University of Arizona-led research team has provided the first on-the-ground evidence that Southwestern plants are being pushed to higher elevations by an increasingly warmer and drier climate.

The findings confirm previous predictions that mountain communities in the Southwest will be strongly affected by an increasingly warmer and drier climate, and show that the area is already experiencing rapid vegetation change.

To obtain a direct before-and-after comparison, the researchers surveyed current plant communities along the same transect documented in 1963: the Catalina Highway, a road that winds all the way from low-lying desert to the top of Mount Lemmon, the tallest peak in the Santa Catalina Mountains northeast of Tucson.

“Our study provides the first on-the-ground proof of plants being forced significantly upslope due to climate warming in southern Arizona,” said Richard C. Brusca, a research scientist in the UA’s department of ecology and evolutionary biology who led the study together with Wendy Moore, an assistant professor in the UA’s department of entomology. “If climate continues to warm, as the climate models predict, the subalpine mixed conifer forests on the tops of the mountains – and the animals dependent upon them – could be pushed right off the top and disappear.”

The study, published in the journal Ecology and Evolution, was made possible by the existence of a dataset compiled 50 years ago by Robert H. Whittaker, often referred to as the “father of modern plant ecology,” and his colleague, William Niering, who catalogued the plants they encountered along the Catalina Highway.

Focusing on the 27 most abundantly catalogued plant species, Brusca and Moore discovered that three quarters of them have shifted their range significantly upslope, in some cases as much as a thousand feet, or now grow in a narrower elevation range compared to where Whittaker and Niering found them in 1963.

Specifically, Moore and her team found that the lowermost boundaries for 15 of the species studied have moved upslope; eight of those species now first appear more than 800 feet higher than where Whittaker and Niering first encountered them. Sixteen of the studied species are now restricted to a narrower band of elevation, the researchers noticed. As far as the plants’ upper elevation limits were concerned, the researchers observed a mixed trend: They found it to be higher for four species, lower for eight species and unchanged for 15.

For example, in 1963 Whittaker and Niering recorded alligator juniper as a component of upland desert and grassland communities in the Catalina Mountains, beginning at an elevation of just 3,500 feet. Today, one has to drive to the 5,000-foot elevation marker on the Catalina Highway to see the first live alligator juniper trees in upland habitats.

According to the authors, the main point emerging from the study is that plant communities on the mountain were different 50 years ago because plant species do not necessarily move toward higher elevations as a community. Rather, individual species shift their ranges independently, leading to a reshuffling of plant communities.

The multidisciplinary team gathered the data during fieldwork in 2011 and included UA postdoctoral fellows and professors from several programs, including the UA departments of entomology and ecology and evolutionary biology, the Center for Insect Science and the Institute for the Environment, as well as botanists from the Arizona-Sonora Desert Museum.

Based on studies done by other scientists, including UA researchers, the team believes that a “thirstier” atmosphere might be a major driver behind the shifts in plant distribution, possibly even more so than lack of precipitation. As the atmosphere becomes warmer and drier, plants lose more water through their leaf openings and become water-stressed.

According to the authors, the results are consistent with a trend scientists have established for the end of the Pleistocene, a period of repeated glaciations that ended about 12,000 years ago. By studying the distribution of plant seeds and parts preserved in ancient packrat middens, for example, paleo-ecologists have documented that as the climate warmed up, plant communities changed profoundly.

“In southern Arizona, some species moved north to the Colorado Plateau, others moved up mountain slopes, and others didn’t move at all,” said Moore, who has been collecting data on ground-dwelling arthropods, plants, leaf litter, weather, soil, and other ecological factors in the Santa Catalina Mountains for the Arizona Sky Island Arthropod Project based in her lab.

The Sky Islands encompass an “archipelago” of 65 isolated mountain ranges rising from the surrounding low-elevation desert and desert grassland in an area that constitutes the only major gap in the 4,500-mile long North American Cordillera, which runs from northern Alaska to southern Mexico. The Sky Islands, often referred to as the “Madrean Sky Islands,” span this gap in southeastern Arizona, southwestern New Mexico and northeastern Sonora, Mexico. They include the Santa Catalina Mountains, the Pinal Mountains and the Chiricahua Mountains.
