Brain Handles Stress Better When Bodies Are Exercised

April Flowers for redOrbit.com – Your Universe Online

A new study led by Princeton University reveals physical activity reorganizes the brain so that its response to stress is decreased. This makes anxiety less likely to interfere with normal brain function.

The findings, published in The Journal of Neuroscience, show that when regularly exercised mice experienced a stressor, such as exposure to cold water, their brains exhibited a spike in the activity of neurons that shut off excitement. These neurons are located in the ventral hippocampus, a brain region shown to regulate anxiety.

The researchers believe their findings potentially resolve a discrepancy in research on the effect of exercise on the brain. Scientists have not understood how exercise reduces anxiety while also promoting the growth of new neurons in the ventral hippocampus; because these young neurons are typically more excitable, exercise should, in theory, result in more anxiety, not less. However, the team found exercise also strengthens the mechanisms that prevent these brain cells from firing.

Elizabeth Gould, Princeton’s Dorman T. Warren Professor of Psychology, notes the impact of physical activity on the ventral hippocampus has not specifically been explored in previous research. The research team was able to pinpoint brain cells and regions vital to anxiety regulation, which may help scientists better understand and treat anxiety disorders.

The findings also reveal that from an evolutionary standpoint, the brain can be extremely adaptive and tailor its own processes to an organism’s lifestyle or surroundings. Gould said a higher likelihood of anxious behavior might have been an adaptive advantage for less physically fit animals. Because anxiety often manifests as avoidant behavior, avoiding potentially harmful situations would increase the possibility of survival for those less capable of responding with a “fight or flight” reaction.

“Understanding how the brain regulates anxious behavior gives us potential clues about helping people with anxiety disorders. It also tells us something about how the brain modifies itself to respond optimally to its own environment,” said Gould, who also is a professor in the Princeton Neuroscience Institute.

One group of mice was given unlimited access to a running wheel for the experiments, while another group had no running wheel. Mice are natural runners that can dash up to 2.5 miles a night when given access to a running wheel. After six weeks of exercise, or exercise deprivation, the mice were exposed to cold water for a brief time.

The research team noticed different behavior in the brains of active and sedentary mice almost immediately. In the brains of sedentary mice, the cold water spurred an increase in “immediate early genes,” or short-lived genes that are rapidly turned on when a neuron fires. The brains of active mice showed no such spike, suggesting their brain cells did not immediately leap into an excited state in response to the cold water.

The brains of active mice showed every sign of controlling their reaction to an extent not observed in sedentary mice. The researchers observed a boost in the activity of inhibitory neurons, which are known to keep excitable neurons in check. At the same time, more of the neurotransmitter gamma-aminobutyric acid, or GABA, which tamps down neural excitement, was released. The researchers also found higher amounts of the protein that packages GABA into little travel pods known as vesicles for release into the synapses of the runner mice.

When the team blocked the GABA receptor that calms neuron activity in the ventral hippocampus, the anxiety-reducing effect of exercise was cancelled out. To achieve this block, the team used the chemical bicuculline, which is used in medical research to block GABA receptors and simulate the cellular activity underlying epilepsy. When applied to the ventral hippocampus, bicuculline blocked the mollifying effects of GABA in the active mice.

Disease Can Spread Easily When Passing Basketballs

April Flowers for redOrbit.com – Your Universe Online

Basketballs and volleyballs can potentially spread dangerous germs between players, according to a new study by UC Irvine. The researchers hope to bring a new awareness to athletes, coaches, trainers and parents about safe sanitation practices for athletes. The research team presented their findings at the American College of Sports Medicine national conference in May, 2013.

Known for causing staph infections in athletes, Staphylococcus aureus was chosen for the study. Methicillin-resistant Staphylococcus aureus (MRSA) is a type of staph infection that is resistant to many antibiotics, making it a worrisome condition for doctors. Athletes who contract MRSA often have multiple emergency room visits, costly outpatient follow-ups, and time away from games and practice. Because of this, the NCAA has initiated a campaign to identify and prevent the spread of diseases between athletes.

During the study led by Brandon Haghverdian, who graduated with a bachelor’s degree in biological sciences and starts medical school at UC Irvine in the fall, the team analyzed the germ threat on volleyballs and basketballs, the players’ hands and the gym floor. In each trial, two of the three surfaces were sterilized while the third was left in its native state. For the balls and floor tiles, a germicidal ultraviolet C (UVC) light was used for sterilization. Players’ hands were sterilized using antibacterial soap.

First, the team sampled S. aureus cultures from all three surfaces. The players then dribbled and passed the balls in a specified pattern and duration to simulate actual play. In each case, the team found that S. aureus accumulated on the sterilized surfaces during play. They also found the germ was able to survive on the surface of the balls up to 72 hours in storeroom conditions.

“The overwhelming prevalence of Staph. aureus we encountered supports our understanding of the gym environment as a reservoir of germs,” Joshua A. Cotter, a postdoctoral fellow in orthopedic surgery, said. “Institutions, coaches, and athletes should take note of the role the sports ball can play as a vehicle for the transmission of potentially life-threatening germs.”

Cotter, who supervised the study but did not participate in it, added that other dangerous bacteria and viruses may also be spread among athletes.

Genetic Associations Found In Pollen, Dust And Cat Allergies

April Flowers for redOrbit.com – Your Universe Online

Sixteen new genetic associations with pollen, dust-mite and cat allergies have been identified by the largest genome-wide association study ever conducted on common allergies.

23andMe, the leading personal genetics company, and the Avon Longitudinal Study of Parents and Children (ALSPAC) collaborated on the study, which examined data for more than 53,000 individuals.

The study, published in Nature Genetics, also identified eight genetic variations for allergies that have previously been associated with asthma. A series of key pathways in the biological basis of common allergies are highlighted by the genes implicated in the study.

In the industrialized world, allergies and allergic asthma are among the most common diseases. A 2005 survey in the US revealed more than half of the population tested positive for sensitivity to at least one of the ten most common allergens. This shows a considerable increase compared to the results of the same survey performed ten years earlier.

“We’ve seen some substantial increases in prevalence of allergies and asthma,” said David Hinds, PhD, 23andMe principal scientist. “Although environmental factors certainly play a role, our study reinforces the genetic link between common allergens and a person’s susceptibility to experiencing an allergic reaction. Additionally, current estimates of the heritability of allergies are high, which suggests that understanding the genetic factors underlying allergic conditions may be key to understanding who might be most likely to suffer from allergies and how the condition might best be treated.”

Three common self-reported allergy phenotypes — pollen, dust-mite and cat allergies — were selected for the study. Comparable data was available for these phenotypes both in the 23andMe research community and in a cohort from ALSPAC. The data from both were included in a genome-wide association meta-analysis.

“Allergy is an important component of many diseases, including asthma, eczema and hay fever, which together account for a huge burden on patients and the health services,” said Professor John Henderson of ALSPAC. “This is a very exciting time for allergy research. Genetic discoveries have identified specific pathways of allergy development that are not shared with allergic diseases like asthma. Understanding these pathways could lead to eventual development of drugs that cure or prevent allergy rather than just suppressing its symptoms.”

“One of the key features of this work is the demonstration that with a suitably sized study, the analysis of medically relevant questionnaire data alongside genetic variation has the potential to yield important information concerning the underlying biology of a complex outcome,” said Dr. Nic Timpson of ALSPAC. “Indeed, through this collaborative interaction with colleagues from EAGLE where specific tests of allergic sensitization were available, we were able to independently replicate many of the findings made here.”

EAGLE, the Early Genetics and Lifecourse Epidemiology research cohort, conducted a companion study, also published in Nature Genetics. The EAGLE researchers used clinically defined data instead of self-reported data, which provided the opportunity to compare results of self-reported data to study results based on clinically defined data. The results of the two studies were very consistent in general, and highlighted many of the same genes and pathways.

“This coordinated approach to research significantly accelerates the replication and validation processes associated with solidifying new genetic discoveries,” said Hinds.

“Through this collaborative effort, we have identified several genes that are responsible for a considerable proportion of allergy in the population,” said Klaus Bonnelykke, MD, PhD and principal scientist from EAGLE. “This is an important step in allergy research.”

Avoiding Conflict Easier As Couples Get Older

Lee Rannals for redOrbit.com – Your Universe Online

The longer a couple has been together, the better they get at learning how to avoid conflict with one another, according to a new study.

Researchers from San Francisco State University reported in the Journal of Marriage and Family that couples who had been married for longer tended to change the subject or divert attention from the conflict when “toxic” subjects emerged.

The team followed 127 middle-aged and older long-term married couples across 13 years, checking in to see how they communicated about conflicts ranging from housework to finances. They videotaped the couples’ 15-minute discussions, focusing in particular on the types of communication the partners used when talking about contentious topics.

The researchers looked at how couples might change in their use of a common and destructive type of communication known as the demand-withdraw pattern. They found that while most aspects of demand-withdraw communication remained steady over time, both partners increased their tendency to demonstrate avoidance during conflict.

Typically, avoiding a conflict is thought to be damaging. However, the team said that for older couples who have had years to voice their disagreements, this method could be a way to move the conversation away from areas that could be “toxic.”

“This is in line with age-related shifts in socio-emotional goals, wherein individuals tend toward less conflict and greater goal disengagement in later life stages,” said Sarah Holley, San Francisco State assistant professor of psychology who directs the University’s Relationships, Emotion and Health Lab.

She said as people age, they place less importance on arguments and seek more positive experiences, perhaps out of a sense of making the most out of their remaining years. The researchers suggest the age of the partners appears to be driving this communication shift.

“It may not be an either-or question,” Holley said. “It may be that both age and marital duration play a role in increased avoidance.”

The researchers focused on this specific set of communication behaviors because psychologists think the demand-withdraw pattern can be especially destructive for couples. If a husband withdraws in response to his wife’s demands to do the dishes, that withdrawal can lead to an escalation in the wife’s demands.

“This can lead to a polarization between the two partners which can be very difficult to resolve and can take a major toll on relationship satisfaction,” Holley said.

A study last year showed when one partner is angry with another, it may have less to do with the current situation and more to do with the overall happiness of the marriage.

Baylor University scientists found people were most likely to express anger, not in the moments where they felt most angry, but rather in the situations where there was an overall climate of anger in their relationship. They said if couples fall into this “climate of anger,” they tend to continue expressing anger regardless of how they actually feel.

Red Dwarf Stars May Destroy Possibility Of Life In Their Systems

John P. Millis, Ph.D. for redOrbit.com – Your Universe Online

Most of the stars in the Universe are classified as red dwarf stars – stars considerably smaller and cooler than our Sun. About 75 percent of the main sequence stars in our galaxy are classified as such, so they represent an exciting population to study in the search for life beyond Earth.

If it were found that Earth-like planets formed around red dwarf stars, then the door would be opened to the possibility we might soon find life elsewhere in the Universe. But new work is casting some doubt on this possibility.

In order for life to exist on a planet, scientists believe certain conditions must be present. Obviously, the planet would have to have a rocky surface, something on which life could take hold. Also, the planet would need to have sufficient gravity (i.e. be large enough) in order to maintain an atmosphere, but not so large that the gravitational field strength would overwhelm any forming life.

Finally, and crucially, the planet must receive enough energy from its host star that liquid water can exist on the surface. This means the planet must maintain a certain orbital distance from the star, within a region known as the habitable zone. The exact size and location of the habitable zone varies depending on the size and brightness of the star.

For cool red dwarfs, the planets would need to be close in to the star, much closer than the Earth is to our Sun, in order to receive enough heat to maintain the needed temperatures. This is actually great news in the attempt to find planets the size of Earth.
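As a rough illustration of why the habitable zone shrinks inward for dimmer stars, its distance scales roughly with the square root of the star’s luminosity. The snippet below is a minimal sketch of that scaling; the flux-balance relation and the example luminosity are simplifying assumptions for illustration, not figures from the research described here:

```python
import math

def habitable_zone_distance_au(luminosity_solar):
    """Approximate center of the habitable zone in astronomical units (AU).

    Uses the simple flux-balance scaling d ~ sqrt(L / L_sun): a planet at this
    distance receives roughly the same stellar flux Earth gets from the Sun.
    Real habitable-zone models include many additional corrections.
    """
    return math.sqrt(luminosity_solar)

# For the Sun (L = 1 L_sun), this puts the zone near 1 AU, where Earth orbits.
print(habitable_zone_distance_au(1.0))   # ~1.0 AU

# For a hypothetical red dwarf with 1 percent of the Sun's luminosity,
# the zone moves in to roughly 0.1 AU -- only about a quarter of Mercury's
# distance from the Sun -- which is why such planets sit so deep inside the
# star's magnetic and stellar-wind environment.
print(habitable_zone_distance_au(0.01))  # ~0.1 AU
```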

Because the star itself is small, the tug from an Earth-like planet so near would make the star wobble noticeably in response to the planet’s gravity. This wobble would be visible to instruments on Earth, and scientists could then identify “new Earths.” A problem exists, however.

A team from St. Andrews University has announced this week that another property of red dwarfs could potentially squelch the possibility for life around these stars. While virtually all stars possess magnetic fields, red dwarfs are known to maintain fields that are quite strong, particularly early on in their formation.

Such strong magnetic fields can have a dramatic effect on nearby planetary objects. For a world close enough to maintain liquid water, the planetary magnetic field could be compressed by that of the nearby red dwarf; so much so that the field could be practically extinguished altogether.

This is important, because an atmosphere is another element critical for life, and without a magnetic field the atmosphere would have considerably more exposure to cosmic radiation, especially high-energy charged particles. These charged particles can strip away the atmosphere altogether. The chance of this is also increased by the fact that the planet is so close to the host red dwarf, as the greatest source of charged-particle radiation can come from the stellar wind itself.

Hope still exists, however. The team also revealed that the effect is driven by the strength of the red dwarf’s magnetic field, which varies depending on the rotation period of the star, which itself can slow over time.

“Our work suggests that red dwarf stars with rotation periods larger than about one to a few months will have magnetic fields that won’t significantly squash the magnetosphere of an Earth-analogue planet orbiting inside the habitable zone of its host star”, says Aline Vidotto, from the University of St Andrews, and the lead scientist on the research. “Astronomers will have to take this on board in their search for life elsewhere. The conditions for habitability are turning out to be a lot more complex than we thought.”

Motivation Lacking For Many Marijuana Users

Lee Rannals for redOrbit.com – Your Universe Online

Long-time marijuana smokers may lack motivation to work or pursue their normal interests, according to a new study.

Scientists at Imperial College London, University College London (UCL) and King’s College London found long-term marijuana users tend to have lower levels of dopamine in a part of the brain called the striatum. They said this finding could explain why some users appear to have a lack of motivation.

The team used PET brain imaging to look at dopamine levels in the striatum of 19 regular cannabis users and 19 non-users of matching age and sex. The users in the study had all experienced psychotic-like symptoms while smoking the drug, such as strange sensations or bizarre thoughts like feeling as though they are being threatened.

The researchers expected the dopamine levels might be higher in this group because increased levels of this chemical have been linked with psychosis. However, they found the opposite effect.

Cannabis users in the study had their first experience with the drug between the ages of 12 and 18. The team saw a trend toward lower dopamine levels in those who started earlier, and also in those who smoked more cannabis. They said marijuana use may be the cause of the difference in dopamine levels.

The lowest dopamine levels were seen in users who met diagnostic criteria for cannabis abuse or dependence, which could be a marker of addiction severity.

Past studies found marijuana users have a higher risk of mental illnesses that involve repeated episodes of psychosis, such as schizophrenia.

“It has been assumed that cannabis increases the risk of schizophrenia by inducing the same effects on the dopamine system that we see in schizophrenia, but this hasn’t been studied in active cannabis users until now,” said Dr Michael Bloomfield, from the Institute of Clinical Sciences at Imperial, who led the study. “The results weren’t what we expected, but they tie in with previous research on addiction, which has found that substance abusers — people who are dependent on cocaine or amphetamine, for example — have altered dopamine systems.”

He said they only looked at cannabis users who have had psychotic-like experiences with the drug. However, he added that they believe the findings would apply to marijuana users in general, because they didn’t see a stronger effect in subjects who had more psychotic-like symptoms.

“It could also explain the ‘amotivational syndrome’ which has been described in cannabis users, but whether such a syndrome exists is controversial,” Bloomfield said.

In 2011, researchers published in the British Medical Journal that they found a link between marijuana use and psychosis. They found cannabis use almost doubled the risk of incident psychotic symptoms later in life, even after accounting for other factors like age, sex, socioeconomic status, use of other drugs and other psychiatric diagnoses.

Breath-Taking Documentary Explores The Universe In IMAX 3D

[ Watch the Video "Hidden Universe 3D In IMAX" ]

Lee Rannals for redOrbit.com – Your Universe Online
Filmmakers have developed a 3D production for IMAX theaters that displays some of the greatest structures in the universe in high resolution.
Hidden Universe utilizes state-of-the-art telescopes to give the inhabitants of Earth a view of our Universe like it’s never before been seen. Only a handful of people are able to peer into some of the greatest telescope facilities in the world, but this new IMAX experience enables that opportunity for everyone.
The film explores the European Southern Observatory’s (ESO) Very Large Telescope (VLT), as well as the internationally-supported Atacama Large Millimeter/submillimeter Array (ALMA).
“The experience of filming in the Atacama Desert at such world-class observing facilities has been amazing,” said film director Russell Scott. “Some of the otherworldly locations among the Andes mountains almost make you feel like you’re on another planet, and this sensation of nature – beyond what we are used to – is exactly what I want to transmit to the audience.”
Viewers will be able to sit down and enjoy a tour of deep space in the comfort of an IMAX cinema. They will journey deep inside galaxies and nebulae, skim over the surface of the famous Red Planet, and get the closest they will ever be to the Sun.
Filmmakers also created 3D simulations based on data gathered by the VLT, ALMA, and deep space telescopes such as NASA and the European Space Agency’s Hubble Space Telescope.
“Hidden Universe will explore the Sun, our human connection to the cosmos, and amazing views of faraway galaxies in a previously unseen way – giving a fresh insight into the Universe,” explains producer Stephen Amezdroz.
British actress Miranda Richardson, who won a Golden Globe Award for her performance in the art-house hit Enchanted April, will be narrating the film. Richardson also received Academy Award and Golden Globe nominations for her role in Louis Malle’s 1992 film Damage.
December Media is the production company that helped produce the film. It is one of Australia’s most experienced production companies and is best known for its documentaries.
Lars Lindberg Christensen, ESO’s Head of the Education and Public Outreach Department, said they were enthusiastic about the chance to showcase images from ESO telescopes in IMAX.
“Only the IMAX format can really convey the breathtaking experience of seeing humanity’s most advanced telescopes in action,” Christensen said.
In 2010, award-winning actor Leonardo DiCaprio narrated the IMAX documentary Hubble 3D, about the mission to upgrade the Hubble. Astronauts serviced the space observatory during the STS-125 mission in 2009, while an IMAX 3D camera aboard the shuttle filmed the work. NASA and IMAX had first flown 3D cameras in space back in 2001 for the production of Space Station 3D.

Our Immune Defenses May Have Evolutionary Origins In Corals

Brett Smith for redOrbit.com – Your Universe Online

Many biological processes have their roots in the very earliest stages of life’s evolutionary tree, and new research in the journal BMC Genomics indicates that part of our immune system comes from coral-like ancestors.

According to the study, molecular biologists at the ARC Centre of Excellence for Coral Reef Studies (CoECRS) have found genes in Acropora, or staghorn, corals which are responsible for a quick, strong immune response to the presence of bacteria – genetic material that is also found in mammals, including humans.

“It’s early days, but it certainly looks as if key aspects of our ability to resist bacteria are extremely ancient and may have been pioneered by the ancestor of corals – and then passed down to humans in our evolutionary lineage,” said co-author David Miller, CoECRS team leader and a James Cook University professor of molecular biology and biochemistry.

“Corals are constantly attacked by bacteria in their natural environment, and so have perfected very efficient defenses against them,” Miller said. “These defenses apparently work well enough to be preserved in mammals like us, and possibly in plants too.

“Certain animals in between us and coral, like roundworms and flies, seem to have lost these genes, but our line appears to have retained them,” he added.

The genetic material is comprised of three genes, known as GiMAPs, that have long been associated with anti-bacterial response in mammals.

In the study, Miller and a team of Australian colleagues exposed living colonies of Acropora to signature chemicals associated with bacteria and looked for a genetic response from the sea creatures.

“We were quite surprised at how rapidly and strongly these three genes in particular reacted to the presence of bacterial proteins,” Miller said. “It was spectacular.”

The team said their main goal was not to shed light on mammalian evolution, but to gain a better understanding of the fragile creatures’ immune systems, which have been under siege in recent years.

“By better understanding the basis of coral immunity we may first be able to understand what is causing this pandemic of coral diseases and how human activity is connected to it,” Miller said. “And second, this may lead us to better ways of managing our reefs that reduce the impact of disease, and give corals a better chance of survival during a period of major climatic and environmental change.”

Miller has also been involved in another study that aims to identify the molecular processes behind the formation of coral exoskeletons, which make up coral reefs.

“With the world’s oceans becoming more acidic due to man-made carbon dioxide emissions, the whole basis by which corals and other marine organisms form their skeletons and shells – known as calcification – is under threat,” Miller said.

“Many marine scientists fear that if the oceans become more acidic as we redouble fossil fuel use, many of these lifeforms will not be able to cope – and our coral reefs could literally dissolve before our very eyes,” he added.

Miller said his ongoing work is focusing on seeing if and how corals can cope with ocean acidification and other threats on their own.

Mimicking Living Cells: Synthesizing Ribosomes

Synthetic biology technology could lead to new antibiotics, modified protein-generators

Synthetic biology researchers at Northwestern University, working with partners at Harvard Medical School, have for the first time synthesized ribosomes — cell structures responsible for generating all proteins and enzymes in our bodies — from scratch in a test tube.

Others have previously tried to synthesize ribosomes from their constituent parts, but the efforts have yielded poorly functional ribosomes under conditions that do not replicate the environment of a living cell. In addition, attempts to combine ribosome synthesis and assembly in a single process have failed for decades.

Michael C. Jewett, a synthetic biologist at Northwestern, George M. Church, a geneticist at Harvard Medical School, and colleagues recently took another approach: they mimicked the natural synthesis of a ribosome, allowing natural enzymes of a cell to help facilitate the man-made construction.

The technology could lead to the discovery of new antibiotics targeting ribosome assembly; an advanced understanding of how ribosomes form and function; and the creation of tailor-made ribosomes to produce new proteins with exotic functions that would be difficult, if not impossible, to make in living organisms.

“We can mimic nature and create ribosomes the way nature has evolved to do it, where all the processes are co-activated at the same time,” said Jewett, who led the research along with Church. “Our approach is a one-pot synthesis scheme in which we toss genes encoding ribosomal RNA, natural ribosomal proteins, and additional enzymes of an E. coli cell together in a test tube, and this leads to the construction of a ribosome.”

Jewett is an assistant professor of chemical and biological engineering at Northwestern’s McCormick School of Engineering and Applied Science.

The in vitro construction of ribosomes, as demonstrated in this study, is of great interest to the synthetic biology field, which seeks to transform the ability to engineer new or novel life forms and biocatalytic ensembles for useful purposes.

The findings of the four-year research project were published June 25 in the journal Molecular Systems Biology.

Comprising 57 parts — three strands of ribonucleic acid (RNA) and 54 proteins — ribosomes carry out the translation of messenger RNA into proteins, a core process of the cell. The thousands of proteins per cell, in turn, carry out a vast array of functions, from digestion to the creation of antibodies. Cells require ribosomes to live.

Jewett likens a ribosome to a chef. The ribosome takes the recipe, encoded in DNA, and makes the meal, or a protein. “We want to make brand new chefs, or ribosomes,” Jewett said. “Then we can alter ribosomes to do new things for us.”

“The ability to make ribosomes in vitro in a process that mimics the way biology does it opens new avenues for the study of ribosome synthesis and assembly, enabling us to better understand and possibly control the translation process,” he said. “Our technology also may enable us in the future to rapidly engineer modified ribosomes with new behaviors and functions, a potentially significant advance for the synthetic biology field.”

The synthesis process developed by Jewett and Church — termed “integrated synthesis, assembly and translation” (iSAT) technology — mimics nature by enabling ribosome synthesis, assembly and function in a single reaction and in the same compartment.

Working with E. coli cells, the researchers combined natural ribosomal proteins with synthetically made ribosomal RNA, which self-assembled in vitro to create semi-synthetic, functional ribosomes.

They confirmed the ribosomes were active by assessing their ability to carry out translation of luciferase, the protein responsible for allowing a firefly to glow. The researchers then showed the ability of iSAT to make a modified ribosome with a point mutation that mediates resistance to the antibiotic clindamycin.

The researchers next want to synthesize all 57 ribosome parts, including the 54 proteins.

“I’m really excited about where we are,” Jewett said. “This study is an important step along the way to synthesizing a complete ribosome. We will continue to push this work forward.”

Jewett and Church, a professor of genetics at Harvard Medical School, are authors of the paper, titled “In Vitro Integration of Ribosomal RNA Synthesis, Ribosome Assembly, and Translation.” Other authors are Brian R. Fritz and Laura E. Timmerman, graduate students in chemical and biological engineering at Northwestern.

The work was carried out at both Northwestern University and Harvard Medical School.

Could ADHD Drug Abuse For Academic Reasons Lead To Pre-Exam Drug Tests?

redOrbit Staff & Wire Reports – Your Universe Online

At least one UK neuroscientist is recommending students be tested for drugs such as Ritalin before being allowed to take an academic test, in much the same way athletes are forced to undergo screenings for steroids and other performance-enhancing substances.

According to Josie Ensor and Rosa Silverman of The Telegraph, University of Cambridge professor Barbara Sahakian said medications such as the widely-used ADHD treatment are increasingly being used by students who obtain them illegally in order to help their performance on university exams or other essential academic activities.

“Academics say the number of students using the drugs has steadily risen over the last few years as they say the pressure to do well increased during the recession, with some students even faking symptoms of ADHD in order to get prescriptions of Ritalin,” Ensor and Silverman said.

“The British Psychological Society (BPS) has also launched an investigation into the growing prescription levels of the drug. There are fears that funding cuts for treatments such as counseling have led to an over reliance on medication,” they added. “A report by the Academy of Medical Sciences suggested that just a 10 percent improvement in memory could raise students one grade band at A-levels or into a different degree class.”

In a recent survey, it was revealed that 10 percent of all students at Sahakian’s university had taken drugs such as Ritalin, Modafinil and Adderall in order to gain an academic advantage. The Cambridge professor also told The Telegraph that an increasing number of students have been complaining to her and other faculty members that their fellow students have been using these types of substances and gaining an unfair advantage in the classroom.

“Many students have said that they feel it is cheating that some students use ‘smart drugs’ in exams. It is difficult for universities practically to address these issues, but they should have clear policy statements in regard to the use of cognitive enhancing drugs,” the professor told Ensor and Silverman. “Universities are yet to get a grip on the problem. While none seem to encourage its use, none do anything to actively dissuade students from using them. If there were random testing in exam situations, it should act as a deterrent.

“There have been many more people using the drug recently. It is logical that the easier it becomes for people to get hold of these drugs, the higher the number will become of people who take them,” she added. “Students feel under enormous pressure, particularly during exam time and when their coursework is due. Many of them feel they have to turn to the drugs to help them concentrate better and cram for tests.”

The problem is not unique to the UK. Last week, CBS New York reported more than one-third of all American college students had taken prescription medications like Ritalin or Adderall in order to help them study.

In response, New York Senator Chuck Schumer called upon administrators at colleges and universities in his state to enact tough policies restricting students from obtaining the pills without a prescription, said Stephen Adkins of the University Herald. In fact, Schumer estimated that as many as 64,000 students abuse these types of medications in New York City alone.

“This is not the first revelation on study drugs,” Adkins said. “A study conducted by the University of Rhode Island in 2009 found that 60 percent of students knew about classmates who consumed study drugs. In 2011, the US Department of Health and Human Services assessed that 5 percent of people aged between 18 and 25 had taken psycho-therapeutic drugs like Adderall or Ritalin without a prescription.”

In addition to allegedly helping academic performance, Ritalin and other ADHD drugs have also been found to help improve brain function in individuals addicted to cocaine. Not everyone buys into these claims, however, as earlier this month a study from researchers at Princeton, the University of Toronto and Cornell found children and adolescents who used the stimulant actually had worse academic outcomes, according to Lindsay Abrams of The Atlantic.

Ecology Forum Discusses The Declining Fortunes Of Yellowstone Elk

Alan McStravick for redOrbit.com – Your Universe Online

Ecologists have been active in their response to the publication of a new study in the June issue of the Ecological Society of America’s journal Ecology. In it, Arthur Middleton and fellow researchers from the University of Wyoming, the Wyoming Game and Fish Department and the US Geological Survey discussed the Clarks Fork herd residing on crowded winter grounds outside of Cody, Wyoming. Their findings contradict those of a 1988 study that has long been regarded as a seminal work in the understanding of the benefit of migrations for animals that routinely undertake foraging treks.

John Fryxell’s 1988 paper explained why migratory hoofed animals, like the Clarks Fork elk, have outnumbered their non-migratory cousins by as much as an order of magnitude. Used as an approximation, an order of magnitude means one group outnumbers the other by about ten times. His reasoning for the migrants’ higher survival rates compared with the more sedentary herds focused on three key points. First, the migrant herd uses a much larger area. Second, migrants are able to make more efficient use of resources. And finally, they have a diminished vulnerability to predators. He explained that no single factor would necessarily lower mortality on its own, but acting together they all contribute to such an outcome.

More to the point, and as an example, Fryxell contends migration allows animals to take advantage of seasonal vegetation while, at the same time, sheltering themselves from predators and the elements. The migrants of the Clarks Fork herd, in seeking the new vegetation located on higher terrain, are able to leave their predators behind. In the spring, predators are less mobile because they must attend to the needs of their newborns.

The Clarks Fork herd comprises approximately 4,000 elk that, each spring, leave their winter grounds for the snowmelt-fed greening grass located in the highlands of the Absaroka Mountains. Their journey is becoming something of a rarity for modern migratory animals, as their route is unimpeded by roads, fences, metropolitan areas and other human-built barriers. Middleton’s findings, which suggest the costs of this migration now outweigh its benefits, focus on the end-point of the migration, which happens to lie inside the border of Yellowstone National Park.

The team contends the elk herd has, over the last few decades, been returning to the winter grounds outside Cody with fewer and fewer calves. This is in stark contrast to the herds that stay year round in the Cody area. According to Middleton’s study, the decline in the number of surviving calves of the migratory elk must be attributed to both climate change and a marked increase in the number of predators that prey on newborn calves.

Marco Festa-Bianchet of the University of Sherbrooke recently edited a forum of five working groups of ecologists. These groups, while praising the work of Middleton and his team, commented on the data, at times challenging its interpretation. Fryxell, mentioned above, was a contributing member of the forum.

Middleton and his team are not the first ecologists to report on new challenges facing migratory creatures and how those challenges are tied directly to changes in habitat resulting from human development and climate change. However, they do believe theirs to be a novel case study due to a confluence of habitat and public policy.

Middleton’s study addresses how drought, in combination with the return of predators to Yellowstone, is directly responsible for the low pregnancy and calf survival rates of these elk. One predator, the grizzly bear, is cited specifically.

In response to criticism offered by the forum, Middleton stated, “Many of the forum commentaries discuss the implications of our work for management and conservation of large carnivores and their prey in Yellowstone, especially wolves. However, a persistent focus on the impact of re-introduced wolves among scientists, wildlife managers, and the public misses key roles of grizzly bears and severe drought in limiting elk populations.”

The study claims the summer months have been hotter and drier in the summer range of the migratory Clarks Fork elk. The researchers state satellite imagery has shown that the window of time over which spring vegetation “greens up” has shortened by almost a full month over the last 21 years. The spring feeding is a critical time for the female elk to gain the amount of fat needed for reproduction. Over the same 21 years, wolves were reintroduced into Yellowstone National Park. The team claims wolf and bear populations have been growing. A US Fish and Wildlife survey has shown an increase in the number of bears in Yellowstone. However, in a recent article published on redOrbit, the methodology of the survey was called into question. Dan Doak and Kerry Cutler claim the flawed methodology may have led to an overestimation of the growth of the grizzly bear population.

Granting the premise that predator populations have been increasing, the study claims this increased ecological pressure is a direct result of human choices. That conclusion encompasses both the human manipulation of predator numbers and the harsh drought conditions linked, in study after study, to human-caused climate change.

Forum members Chris Wilmers and Taal Levi, in addressing the field irrigation within the Sunlight Basin Wildlife Habitat Management Area, state the non-migratory elk avail themselves of the grazing land more heavily during spans of pronounced drought. Due to their migratory behavior, the other elk herds are unable to take advantage of those hydrated lands.

“I think Middleton has an intriguing idea, and it might be what’s happening. We offer another hypothesis that also fits the data that they have. He says it’s climate change on the summer range and more predators on the summer range. I think it’s because there is irrigation that provides the sedentary elk with food. And I think it’s also that there is predator control outside the park,” said Wilmers.

Wilmers states the population boom within Yellowstone Park has coincided with an intense increase in predator control measures outside the park. “My hypothesis is that in that crucial winter period, the migrants are coming down to range that the resident elk have already been feeding on all summer, and now they are competing for in the winter,” Wilmers says. Of course, to discern between the subtleties of these competing hypotheses would require real-world testing.

According to Wilmers, this could be achieved through direct experimentation. He offers, as examples, stopping irrigation in the resident rangelands or irrigating the mountain highlands where the migratory elk travel. He also claims the introduction of predator control within Yellowstone National Park, or the cessation of control measures outside the park, would be important variables to study. The concept, however, of ‘world as laboratory’ can run into opposition on both the public policy and public opinion fronts. Wilmers’ ecologically invasive manipulations would likely draw ire from legislators, wildlife advocacy organizations and special interest groups.

John Fryxell and Robert Holt, in their forum submission, addressed the difficulty ecologists face in isolating the underlying causes of observed changes in pattern. In their model, they formalized the factors affecting the Yellowstone elk. As they state, the innate instinct to migrate reflects a delicate balance of environment, population saturation, predation and genetic predisposition. If food scarcity is behind the decline in calf numbers, they claim, it may soon be impossible for the elk to live in the park. However, if increased predation during migration is to blame, the elk may eventually split into two resident populations, one inside and one outside the park. While Fryxell and Holt reserve judgment on the causes of the pattern change, they claim that, as a result of rapid worldwide changes in the next few decades, shifting migration patterns will become more and more common.

Another forum contributor, Atle Mysterud of the University of Oslo, chose to look at how a changing climate causes seasonal events and animal life cycles to fall out of sync. He states the migratory behavior of the elk is malleable in relation to the timing of the spring green-up. However, the time period necessary for calf gestation and birthing is relatively fixed. Mysterud also commented on the potential non-lethal effects of predators on the elk’s behavior, along with the “human shield” against predators enjoyed by the resident elk that graze on lands near agricultural and urban areas outside Yellowstone National Park.

Another forum contribution by ecologists from Oxford University was more critical of Middleton and his team. Their paper, “Will central Wyoming elk stop migrating to Yellowstone, and should we care?” might well be regarded as contentious, if only for the title alone. In their paper, they claim the trends in both vegetation and predator differential across the park boundary are compelling. However, they go on to say Middleton’s data cannot confirm that it was these two factors that led to the change in the demographic of the elk population.

“We don’t wish to sound critical of the huge effort they have put in. Nonetheless, despite their hard work, their data on elk condition and pregnancy rates come from a relatively small number of animals collected over only a relatively short time period. Given this, they are restricted to conducting a few piecemeal analyses and telling some compelling stories. But the problem with this approach is that it is easy to construct very many compelling stories. When that happens, the scientific literature can become opinionated and sometimes adversarial.”

To this point, the Oxford ecologists cited studies of the effects of predators on other elk herds in Yellowstone National Park. They found the existing data, sensitive to minimal environmental variation, are complicated. Additionally, they state wolves play little role in the survival of calves. Ultimately, they advocated for increased investment in broader, long-term, large-scale data collection initiatives.

The call for a longer-term data collection period was echoed by Jean-Michel Gaillard of the University of Lyon. He believes deriving data of consistent timescale and quality is necessary to garner a better understanding of the migratory tactics of female elk. Gaillard called for lifetime monitoring of individuals within the elk population.

Jack Massey of the Oxford group ended their forum submission cautioning that the Middleton paper, like many before that have mentioned both wolves and elk, might be co-opted by groups aiming to forward their own political agenda. For instance, those that advocate on behalf of large elk herds might claim this paper is evidence the elk population is being needlessly destroyed by wolves. Massey and his peers claim the causes of elk population decline are not so clear or so simple.

“As ever with such debate, whether we should care all depends on one’s view on what our wilderness should look like,” concluded Massey and his colleagues. Therefore, the answer to the Oxford group’s title question is not one that can be arrived at through science. Rather, it will require an examination of an individual community’s values.

Stunning New Solar Atmosphere Images Could Help Solve Longstanding Mysteries

[ Watch the Video: Stunning New Solar Atmosphere Images Could Help Solve Longstanding Mysteries ]

redOrbit Staff & Wire Reports – Your Universe Online

Thanks to an innovative new camera on board a sounding rocket, an international team of scientists has managed to capture pictures of the sun’s outer atmosphere that are five times sharper than any previous images.

The images, which were scheduled to be presented by University of Central Lancashire professor Robert Walsh at the Royal Astronomical Society (RAS) National Astronomy Meeting on Monday, were captured using the NASA High Resolution Coronal Imager (Hi-C).

Walsh and colleagues from the UK, US and Russia used a sounding rocket to launch Hi-C from the White Sands Missile Range in New Mexico, RAS officials said. During its brief flight, the Hi-C team was able to secure the high-quality solar corona images, acquiring roughly one picture every five seconds. They were also reportedly able to discover “fast-track ‘highways’ and intriguing ‘sparkles’” that may help answer a long-standing solar mystery.

According to the researchers, those dynamic bright spots switch on and off. They last approximately 25 seconds, are roughly 680 kilometers (423 miles) across and release at least 10,000 times the annual energy consumption of the entire UK population during each event.

As such, these sparkles clearly demonstrate that massive amounts of energy are being added into the corona, and could then be released violently in order to heat the plasma. By discovering them in the Hi-C images, the authors believe they could help explain why the sun’s corona is about 400 times hotter than the solar surface.

Also in the new images, the researchers report the discovery of “small clumps of electrified gas (plasma) at a temperature of about one million degrees Celsius.” These clumps, they note, “are seen racing along highways shaped by the Sun’s magnetic field,” traveling at approximately 80 kilometers per second (50 miles per second) — or the equivalent of 235 times the speed of sound on Earth.

Each highway is 450 kilometers (280 miles) across, or just about the entire length of the country of Ireland from its northernmost point to its southernmost one, the Society said. These flows are inside a region of dense plasma known as a solar filament. These areas can erupt outward from the sun in phenomena known as coronal mass ejections (CMEs), which carry billions of tons of plasma into space.
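As a quick back-of-the-envelope check of the figures quoted above (a sketch only; it assumes a sea-level speed of sound of about 340 m/s and the standard kilometer-to-mile conversion, neither of which is specified in the article):

```python
KM_PER_MILE = 1.609344          # kilometers in one statute mile
SOUND_SPEED_M_PER_S = 340.0     # approximate speed of sound in air at sea level

# Plasma clumps racing along the magnetic "highways"
plasma_speed_km_s = 80.0
print(plasma_speed_km_s / KM_PER_MILE)                  # ~49.7 miles per second
print(plasma_speed_km_s * 1000 / SOUND_SPEED_M_PER_S)   # ~235 times the speed of sound

# Quoted widths of the "sparkles" and the highways
print(680.0 / KM_PER_MILE)   # ~423 miles across
print(450.0 / KM_PER_MILE)   # ~280 miles across
```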

“If a CME travels in the right direction it can interact with the Earth, disturbing the terrestrial magnetic field in a ‘space weather’ event that can have a range of destructive consequences from damaging satellite electronics to overloading power grids on the ground,” the researchers said. By discovering and learning more about the nature of these so-called solar highways, scientists might be able “to better understand the driving force for these eruptions and help predict with greater accuracy when CMEs might take place.”

“I’m incredibly proud of the work of my colleagues in developing Hi-C,” Walsh said. “The camera is effectively a microscope that lets us view small scale events on the Sun in unprecedented detail. For the first time we can unpick the detailed nature of the solar corona, helping us to predict when outbursts from this region might head towards the Earth.”

Animal Sanctuary, 3D Printing Firm Prepping Prosthetic Leg For Duckling

redOrbit Staff & Wire Reports – Your Universe Online

A Tennessee duckling that was born with a backwards foot last year will be receiving a prosthetic foot, thanks to the efforts of a 3D printing company and the folks at a nearby animal sanctuary.

According to the Huffington Post, Buttercup the duckling has been unable to walk properly since being born with a deformed, partially-developed foot last November. He was just “hobbling around on it,” explained officials at Feathered Angels Waterfowl Sanctuary in Arlington, Tennessee.

“Caregivers at the school did their best to keep him happy, but the duckling was constantly in pain,” said New York Daily News reporter Carol Kuruvilla. Buttercup wound up being adopted by Feathered Angels, where the difficult decision to amputate the duckling’s foot was made.

Sanctuary owner Mike Garey explained that there was no alternative, telling Lexy Gross of The Tennessean that the leg would begin to bleed whenever the duckling walked outside. Had the foot not been amputated, it would have been too painful and prone to infections, and it was unlikely that Buttercup would have survived, Garey told Time’s Doug Aamoth.

However, in addition to being the sanctuary owner, Garey was a trained software engineer who began brainstorming ways to help Buttercup, Kuruvilla said. He contacted NovaCopy, a Nashville-based 3D printing company, and then used the left foot of Buttercup’s sister Minnie to design a plastic model of a replacement foot for the injured duckling — a process which reportedly took 13 hours to complete.

Knowing that the plastic would be too cumbersome, Garey and NovaCopy opted instead to create an appendage made from silicone that slips onto Buttercup’s stump using a stretchy silicone sheath, said Kuruvilla.

The materials needed to create the mold were reportedly received by Feathered Angels on Thursday, the mold was poured on Friday, and the final prosthetic should be completed by Sunday afternoon, Gross said — and it’s certainly worth noting that NovaCopy has agreed to cover the costs of the procedure.

FDA To Detain Pomegranate Seed Imports Linked To Hepatitis A Outbreak

redOrbit Staff & Wire Reports – Your Universe Online

The US Food and Drug Administration (FDA) has announced that it will be detaining some international pomegranate seed shipments because the product could contain the Hepatitis A virus.

The pomegranate seeds in question originate from Goknur Gida Maddeleri Ithalat Ihracat Tic [Goknur Foodstuffs Import Export Trading] of Turkey, and it is believed that they played a role in the ongoing multi-state outbreak of Hepatitis A illnesses that has infected over 120 individuals to date, the federal public health department announced on Saturday.

The decision to block the import of the pomegranate seeds follows a joint investigation by the FDA, the Centers for Disease Control and Prevention (CDC), and state and local health authorities into the outbreak, which has been associated with Townsend Farms Organic Antioxidant Blend – a frozen fruit blend sold at Costco which contains pomegranate seed mix originating from Turkey.

“By combining information gained from the FDA’s traceback and traceforward investigations and the CDC’s epidemiological investigation, the FDA and CDC have determined that the most likely vehicle for the Hepatitis A virus appears to be a common shipment of pomegranate seeds from Goknur used by Townsend Farms to make the Townsend Farms and Harris Teeter Organic Antioxidant Blends that were recalled in June,” the agency said.

Those seeds were also used by Gresham, Oregon-based Scenic Fruit Company, which, according to Ryan Jaslow of CBS News, issued a voluntary recall of more than 61,000 bags of Woodstock Frozen Organic Pomegranate Kernels earlier this week due to health concerns related to Hepatitis A.

“No illnesses are currently associated with Scenic Fruit’s products, and product testing so far has found no presence of the liver-damaging virus on its pomegranate seeds, which were imported from Turkey,” Jaslow said. “The recall, however, is being conducted out of an abundance of caution because of an ongoing hepatitis A outbreak linked to frozen pomegranate seeds of Turkish origin.”

“This outbreak highlights the food safety challenge posed by today’s global food system,” FDA deputy commissioner for foods and veterinary medicine Michael R. Taylor said in a statement. “The presence in a single product of multiple ingredients from multiple countries compounds the difficulty of finding the cause of an illness outbreak. The Hepatitis A outbreak shows how we have improved our ability to investigate and respond to outbreaks, but also why we are working to build a food safety system that more effectively prevents them.”

The agency noted that a review of records showed that the Goknur Foodstuffs Import Export Trading pomegranate seeds were the only ingredient common to all brands and varieties of the recalled Townsend Farms and Harris Teeter Organic Antioxidant Blend. The FDA also said that it would be working with all companies that have distributed seeds from this shipment to make sure that all parties that received the product are notified.

According to the CDC, as of June 27, a total of 127 men and women had been exposed to the disease from ingesting Townsend Farms Organic Antioxidant Blend. Illnesses have been reported in Arizona, California, Colorado, Hawaii, Nevada, New Mexico, Utah, and Wisconsin (though this individual was exposed to the product in California). In addition, the CDC reported that the outbreak strain was found in the clinical specimens of 56 people in seven states.

Premeditated Killers Differ Mentally From Impulsive Murderers

redOrbit Staff & Wire Reports – Your Universe Online
The minds of premeditated murderers are quite different than those who kill impulsively, according to a new study appearing in the latest edition of the journal Criminal Justice and Behavior.
“Impulsive murderers were much more mentally impaired, particularly cognitively impaired, in terms of both their intelligence and other cognitive functions,” said Dr. Robert Hanlon, senior author of the study and an associate professor of clinical psychiatry and clinical neurology at the Northwestern University Feinberg School of Medicine.
Murderers whose acts are predatory and premeditated in nature did not typically demonstrate any type of cognitive or intellectual dysfunction, but were more likely to have psychiatric disorders, Dr. Hanlon added.
As part of his research, he studied 77 murderers incarcerated either in the state of Illinois or the state of Missouri. Each of them took standardized intelligence and neuropsychological memory tests.
Compared to impulsive killers, premeditated murderers were nearly twice as likely (61 percent versus 34 percent) to have a history of psychotic or mood disorders, while impulsive murderers were more likely (59 percent versus 36 percent) to have developmental disabilities and cognitive/intellectual impairments, the study claims.
Nearly all of the impulsive murderers had a history of alcohol or drug abuse and/or were intoxicated at the time of the crime — 93 percent versus 76 percent of those who strategized about their crimes.
The study is “the first to examine the neuropsychological and intellectual differences of murderers who kill impulsively and those who kill as the result of a premeditated strategic plan,” said Sarah Griffiths of the Daily Mail.
According to The Telegraph’s Radhika Sanghani, Dr. Hanlon is calling on lawmakers to take intelligence and the mindset of murderers into consideration both in terms of prosecution and crime prevention.
“It’s important to try to learn as much as we can about the thought patterns and the psychopathology, neuropathology and mental disorders that tend to characterize the types of people committing these crimes,” the study author explained.
“Ultimately, we may be able to increase our rates of prevention and also assist the courts, particularly helping judges and juries be more informed about the minds and the mental abnormalities of the people who commit these violent crimes,” he added.

If Carbon Emissions Aren’t Reduced Now, Coral Reefs Will Die

Lee Rannals for redOrbit.com – Your Universe Online

A new study found that in order to prevent coral reefs from dying off, nations need to make deep cuts in carbon dioxide emissions.

Researchers wrote in the journal Environmental Research Letters that all existing coral reefs will die from inhospitable ocean chemistry conditions by the end of the century if civilization continues on its current path.

Coral reefs are havens for marine biodiversity and underpin the economies of many coastal communities. They are very sensitive to changes in ocean chemistry as a result of greenhouse gas emissions, as well as pollution, warming waters, overdevelopment and overfishing.

The researchers calculated the ocean chemical conditions that would occur under different future scenarios. They used computer models to help them determine whether these chemical conditions could sustain coral reef growth.

“Our results show that if we continue on our current emissions path, by the end of the century there will be no water left in the ocean with the chemical properties that have supported coral reef growth in the past. We can’t say with 100 percent certainty that all shallow-water coral reefs will die, but it is a pretty good bet,” said Katharine Ricke, from the Carnegie Institution for Science.

Coral reefs use a naturally occurring form of calcium carbonate to make their skeletons. When carbon dioxide from the atmosphere is absorbed by the ocean, it forms carbonic acid, making the ocean more acidic and decreasing its pH. This increased acidity makes it difficult for marine organisms to grow their shells and skeletons, which threatens coral reefs around the world.

The team said that deep cuts in emissions are necessary in order to save even a fraction of existing coral reefs. Chemical conditions that can support coral reef growth can be sustained only with very aggressive cuts in carbon dioxide emissions.

“To save coral reefs, we need to transform our energy system into one that does not use the atmosphere and oceans as waste dumps for carbon dioxide pollution. The decisions we make in the next years and decades are likely to determine whether or not coral reefs survive the rest of this century,” said study coauthor Ken Caldeira of Carnegie.

Scientists wrote in May that coral reefs are definitely on the decline, but their collapse can still be avoided through local and global action.

“People benefit by reefs’ having a complex structure — a little like a Manhattan skyline, but underwater,” said Peter Mumby of The University of Queensland. “Structurally complex reefs provide nooks and crannies for thousands of species and provide the habitat needed to sustain productive reef fisheries. They’re also great fun to visit as a snorkeler or diver. If we carry on the way we have been, the ability of reefs to provide benefits to people will seriously decline.”

MIT’s Wi-Vi Technology Uses Wi-Fi To See Through Walls

Michael Harper for redOrbit.com – Your Universe Online

Using technology that nearly all of us have in our homes, MIT researchers have developed a sort of x-ray vision device that can detect motion behind walls or other large objects. It’s called Wi-Vi and it uses low power Wi-Fi antennas to bounce signals off any moving object, similar to the way radar and sonar work.

Because Wi-Vi requires very little power and uses common technology, the team at MIT’s Computer Science and Artificial Intelligence Laboratory say they can build Wi-Vi into small devices like smartphones and use them for law enforcement or search-and-rescue operations. Though the idea of tracking far away objects and movement isn’t a new one, the way it’s being done with Wi-Vi could place this technology in the hands of the general public, and it will almost certainly raise new questions about privacy in the modern age.

Wi-Vi was created by Dina Katabi, a professor in MIT’s Department of Electrical Engineering and Computer Science, and her graduate student Fadel Adib.

When Wi-Fi signals are broadcast from the antennas, only a portion of the signals can make it through walls or other solid objects. Understanding this, Katabi and Adib developed a system which sends out additional signals and uses the reflections from static objects like the wall to establish a baseline. Wi-Vi is built with two Wi-Fi antennas, a single receiver and some clever coding to interpret the incoming signals. The two antennas send out nearly identical signals, one the inverse of the other. Because the signals are inverted, their reflections from a stationary object such as a wall cancel each other out at the receiver. However, if something is moving on the other side of the wall, the returning reflections differ from one another and no longer cancel.

“So, if the person moves behind the wall, all reflections from static objects are cancelled out, and the only thing registered by the device is the moving human,” explained Adib in a statement.
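
The following minimal Python sketch illustrates that cancellation idea with made-up numbers: two inverted transmissions, a static reflection that sums to zero, and a moving reflector that leaves a residual. It is only an illustration of the principle, not MIT’s actual signal processing.

import numpy as np

# Two antennas transmit inverse copies of the same signal. A static wall
# reflects both equally, so their sum cancels at the receiver. A moving
# reflector changes over time and leaves a residual. Values are invented.
n_samples = 200
tx = np.sin(2 * np.pi * 0.05 * np.arange(n_samples))   # signal from antenna 1
tx_inv = -tx                                            # inverted copy from antenna 2

wall_gain = 0.6                                         # static reflection strength
received_static = wall_gain * tx + wall_gain * tx_inv   # cancels to ~zero

# A person walking behind the wall modulates one reflection path over time.
person_gain = 0.1 * np.sin(2 * np.pi * 0.01 * np.arange(n_samples))
received_moving = received_static + person_gain * tx

print("static-only residual power:", np.mean(received_static ** 2))
print("with moving target power:  ", np.mean(received_moving ** 2))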

Though the thought of being able to track motions from another room may give some pause, the current setup isn’t capable of displaying outlines of the human body. In fact, Wi-Vi only displays motion as a solid line on a graph, registering as a negative signal when the person moves away from the device and a positive signal when they move closer. Even if the person is only making small motions, such as writing on a white board, the Wi-Vi registers it on the display.

It’s the idea and the implementation of Wi-Vi that make it so different. Radar and sonar have been used for many years, and Wi-Fi can be found in nearly any building. What sets Wi-Vi apart is the way the Wi-Fi signal is used and the code that inverts the signals and displays the result as motion. It’s an arrangement the MIT team hopes will make Wi-Vi devices accessible and affordable to the general public as well as law enforcement and search-and-rescue teams. This technology could even fit in a smartphone without draining the battery, says Katabi, meaning volunteers could help locate people trapped in a collapsed building following an earthquake. When installed in a mobile device, Wi-Vi could even turn a smartphone into a personal safety device.

“If you are walking at night and you have the feeling that someone is following you, then you could use it to check if there is someone behind the fence or behind a corner,” said Katabi.

Yet giving this tracking power to the general public could also create some new concerns about personal privacy.

Hanni Fakhoury, a staff attorney at the Electronic Frontier Foundation, told Computerworld the law simply has not addressed issues like the ones Wi-Vi could create.

“Your location is something that’s worthy of privacy. We know that, even within your house, where you go can reveal a lot about yourself.”

Katabi believes this technology could also be used to protect one’s privacy, noting a caretaker could use Wi-Vi to monitor the motions of an elderly grandparent without installing cameras that may make them uncomfortable.

Katabi and Adib will present Wi-Vi at the SIGCOMM conference in Hong Kong this August.

UK Green Lights Controversial IVF Threesome

Brett Smith for redOrbit.com – Your Universe Online

A controversial yet lifesaving treatment that essentially engineers three-parent children using in vitro fertilization (IVF) and third-party mitochondrial DNA will move ahead with clinical trials in the UK, according to British officials.

The procedure involves taking dysfunctional mitochondrial DNA from a prospective mother and swapping it out for healthy mitochondrial DNA from a female donor. The technique, called mitochondrial transfer, is designed to prevent the passing on of debilitating genetic disorders from mother to child via mitochondrial DNA.

The UK government announced it would begin to allow doctors to apply for permission to perform the procedure, pending approval of regulations from parliament.

Mitochondria are described as the “power plant” of the cell and their dysfunction can result in disorders that affect the entire body – including the heart and brain. About 1 in 6,500 people is born with a mitochondria-related condition. The mitochondria are the only parts of human cells outside of the nucleus where DNA is found. Unlike nuclear DNA, which is inherited more or less equally from the mother and father, mitochondrial DNA is inherited exclusively from the mother.

“It’s only right that we look to introduce this life-saving treatment as soon as we can,” said Dame Sally Davies, the chief medical officer in England.

A report from the Human Fertilisation and Embryology Authority (HFEA) published in March said the UK public generally approves of the procedure, although some groups have contested the procedure because it includes the destruction of embryos.

The method could also have unknown generational consequences as it results in the passing on of modified genetic material. Mitochondrial transfer has been shown to work in animals but has never been tried on humans. Davies noted any babies born from the procedure need to be monitored closely.

“This is not a decision to take lightly,” she said, adding that mitochondrial diseases have a “devastating impact” on individuals and those around them.

“People who have it live with debilitating illness, and women who are affected face passing it on to their children,” she said.

Doctors have developed two different kinds of mitochondrial transfer. Both involve genetic material from the parents being injected into a donor egg containing healthy mitochondria that has had its nucleus removed. The resulting embryo carries nuclear DNA from the parents, but donor mitochondria – amounting to only about 37 of the genome’s 23,000 genes.

Current measures to battle mitochondria-related disease for high-risk mothers include in-vitro fertilization with donor eggs and adoption.

Davies said women who donate mitochondria would remain anonymous and untraceable.

“(The announcement) is excellent news for families with mitochondrial disease,” said Doug Turnbull, who led the Newcastle University team that developed the procedure. “This will give women who carry the diseased genes more reproductive choice and the opportunity to have children free of mitochondrial disease.”

Helen Watt of the Christian Anscombe Bioethics Center spoke out against the announcement, saying that the embryo is not a source of “spare parts.”

“Parenthood is about unconditional welcome of children,” she told The Guardian. “It is not about manufacture and control.

“Couples who do not want to take the risk of passing on mitochondrial disease might want to consider ethical alternatives like adoption, which are far preferable to pursuing dangerous techniques of genetic engineering which exploit both embryos and egg donors.”

Using Space Software To Identify Alzheimer’s

ESA

Software for processing satellite pictures taken from space is now helping medical researchers to establish a simple method for wide-scale screening for Alzheimer’s disease.

Used in analyzing magnetic resonance images (MRIs), the AlzTools 3D Slicer tool was produced by computer scientists at Spain’s Elecnor Deimos, who drew on years of experience developing software for ESA’s Envisat satellite to create a program that adapted the space routines to analyze human brain scans.

“If you have a space image and you have to select part of an image – a field or crops – you need special routines to extract the information,” explained Carlos Fernandez de la Pena of Deimos. “Is this pixel a field, or a road?”

Working for ESA, the team gained experience in processing raw satellite image data by using sophisticated software routines, then homing in on and identifying specific elements.

“Looking at and analysing satellite images can be compared to what medical doctors have to do to understand scans like MRIs,” explained Mr Fernandez de la Pena.

“They also need to identify features indicating malfunctions according to specific characteristics.”

Adapting the techniques for analyzing complicated space images to an application for medical scientists researching Alzheimer’s disease required close collaboration between Deimos and specialists from the Technical University of Madrid.

The tool is now used for Alzheimer’s research at the Medicine Faculty at the University of Castilla La Mancha in Albacete in Spain.

Space helping medical research

“We work closely with Spanish industry and also with Elecnor Deimos through ProEspacio, the Spanish Association of Space Sector Companies, to support the spin-off of space technologies like this one,” said Richard Seddon from Tecnalia, the technology broker for Spain for ESA’s Technology Transfer Programme.

“Even if being developed for specific applications, we often see that space technologies turn out to provide innovative and intelligent solutions to problems in non-space sectors, such as this one.

“It is incredible to see that the experience and technologies gained from analysing satellite images can help doctors to understand Alzheimer’s disease.”

Using AlzTools, Deimos scientists work with raw data from a brain scan rather than satellite images. Instead of a field or a road in a satellite image, they look at brain areas like the hippocampus, where atrophy is associated with Alzheimer’s.
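
As a loose illustration of that shared pixel-classification idea, the minimal Python sketch below labels pixels of a toy intensity image by simple thresholding; the image, thresholds and function names are invented for the example and are not the AlzTools routines.

import numpy as np

# Label each pixel of a 2D intensity image as "region of interest" or
# background by thresholding. The same basic question applies to a satellite
# scene ("is this pixel a field or a road?") and to a brain scan ("does this
# pixel belong to the hippocampus?"). All values here are synthetic.
def segment_by_intensity(image: np.ndarray, low: float, high: float) -> np.ndarray:
    """Return a boolean mask of pixels whose intensity falls in [low, high]."""
    return (image >= low) & (image <= high)

# Fake 8x8 "scan" with a brighter structure in the middle.
scan = np.full((8, 8), 40.0)
scan[3:6, 3:6] = 120.0

mask = segment_by_intensity(scan, low=100.0, high=200.0)
print("pixels in region of interest:", int(mask.sum()))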

In both cases, notes Mr Fernandez de la Pena, “You have a tonne of data you have to make sense of.”


NASA Celebrates Anniversary Of Seasat Mission

April Flowers for redOrbit.com – Your Universe Online
Even if they don’t last long, history tends to look fondly upon trailblazers.
Thirty-five years ago this week, NASA’s Jet Propulsion Laboratory (JPL) launched an experimental satellite called Seasat, with the mission to study Earth and its seas. An unexpected malfunction ended the mission after just 106 days, leading some to look on the satellite as a failure. Seasat is still in orbit, however, shining in the night sky at magnitude 4.0 and continuing to live on through the many Earth and space observation missions it spawned.
In 1969, a group of engineers and scientists from many institutions came together at a conference in Williamstown, Mass., to study how satellites could be used to improve our knowledge of the oceans. NASA began planning for Seasat three years later as the first multi-sensor spacecraft dedicated to observing Earth’s ocean. A broad user group, numerous NASA centers and industry partners worked together, culminating in a launch on June 26, 1978.
Seasat collected more information about ocean physics in its brief life than had been collected in the previous 100 years of shipboard research. The spacecraft established satellite oceanography and proved the viability of several radar sensors, including imaging radar, for studying our planet. The Seasat mission spawned many subsequent Earth remote-sensing satellites that track changes in Earth’s ocean, land and ice, including many currently in orbit or in development. The advances gained through Seasat have been applied to missions studying other planets as well.
Stan Wilson, post-Seasat NASA program manager, said Seasat demonstrated the potential value of ocean microwave observations. “As a result, at least 50 satellites have been launched by more than a dozen space agencies to carry microwave instruments to observe the ocean. In addition, we have two continuing records of critical climate change in the ocean that are impacting society today: diminishing ice cover in the Arctic and rising global sea level. What greater legacy could a mission have?”
“Seasat flew long enough to fully demonstrate its groundbreaking remote sensing technologies, and its early death permitted the limited available resources to be marshaled toward processing and analyzing its approximately 100-day data set,” said Bill Townsend, Seasat radar altimeter experiment manager. “This led to other systems, both nationally and internationally, that continued Seasat’s legacy, enabling Seasat technologies to be used to better understand climate change.”
A RICH HERITAGE
The experimental instruments aboard Seasat included a synthetic aperture radar (SAR), which provided the first-ever highly detailed radar images of ocean and land surfaces from space; a radar scatterometer, which measured near-surface wind speed and direction; a radar altimeter, which measured ocean surface height, wind speed and wave heights; and a scanning multichannel microwave radiometer that measured atmospheric and ocean data, including wind speeds, sea ice cover, atmospheric water vapor and precipitation, and sea surface temperatures in both clear and cloudy conditions.
On June 28, 2013, the Alaska Satellite Facility plans to release newly processed digital SAR imagery from Seasat, which will be available for download from the facility. Scientists will be able to travel back in time to research the ocean, sea ice, volcanoes, forests, land cover, glaciers and more. Prior to this release, only about 20 percent of Seasat SAR data had been digitally processed.
Seasat blazed trails in other areas as well. In oceanography, it provided our first global view of ocean circulation, waves and winds. This created new insights into the links between the ocean and atmosphere that drive our climate. The state of the entire ocean could be seen all at once for the first time ever. Using pulses of microwave radiation to measure the distance from the satellite to the ocean surface precisely, Seasat’s altimeter mapped ocean surface topography. This allowed scientists to demonstrate how sea surface conditions could be used to determine ocean circulation and heat storage. New information about Earth’s gravity field and the topography of the ocean floor was also revealed.
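The altimetry principle itself reduces to round-trip timing: the range to the surface is the pulse’s travel time multiplied by the speed of light and divided by two. A minimal Python sketch, using an illustrative travel time rather than real Seasat telemetry:

# Range from the round-trip travel time of a microwave pulse. The travel
# time below is invented; it is not Seasat data.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A pulse returning after ~5.336 milliseconds corresponds to roughly 800 km,
# a typical low-Earth-orbit altitude.
print(f"{range_from_round_trip(5.336e-3) / 1000:.1f} km")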
“The short 100-day Seasat mission provided a moment of epiphany to remind people that the vast ocean is best accessed from space,” said Lee-Lueng Fu, JPL senior research scientist and project scientist for the NASA/French Space Agency Jason-1 satellite and NASA’s planned Surface Water and Ocean Topography mission.
An entire generation of scientists took inspiration from Seasat’s short mission. “I decided to take a job offer at JPL fresh out of graduate school because I was told that the future of oceanography is in satellite oceanography and the future of satellite oceanography will begin with Seasat at JPL,” said JPL oceanographer Tim Liu. “I did not plan to stay forever, but I have now been here more than three decades.”
Precise measurements of sea surface height used to study climate phenomena such as El Nino and La Nina have been made since Seasat using advanced ocean altimeters on the NASA/European Topex/Poseidon and Jason missions. Jason-3, the latest Jason mission, is scheduled to launch in 2015 to continue the 20-plus-year climate data record. Weather and climate models, ship routing, marine mammal studies, fisheries management and offshore operations have all been improved using satellite altimetry.
The scatterometer instrument – a microwave radar sensor used to measure the reflection or scattering effect produced while scanning the surface of Earth from an aircraft or a satellite – onboard Seasat gave us our first real-time global map of the speed and direction of ocean winds that drive waves and currents and are the major link between the ocean and atmosphere. Other missions, such as JPL’s NASA Scatterometer, Quikscat spacecraft, SeaWinds instrument on Japan’s Midori 2 spacecraft and the OSCAT instrument on India’s Oceansat-2, also used the technology first tested on Seasat. Data from these instruments help to forecast hurricanes, tropical storms and El Nino events.
Another trailblazing instrument on Seasat was the microwave radiometer, which subsequently flew on NASA’s Nimbus-7 satellite. This instrument, which measures particular wavelengths of microwave energy, led to numerous successful radiometer instruments and missions used for oceanography, weather and climate research. The descendants of Seasat’s radiometer include the Special Sensor Microwave Imager instruments launched on United States Air Force Defense Meteorological Satellite Program satellites, the joint NASA/Japanese Aerospace Exploration Agency (JAXA) Tropical Rainfall Measuring Mission microwave imager, the Advanced Microwave Scanning Radiometer (AMSR)-E that flew aboard NASA’s Aqua spacecraft, JAXA’s current AMSR-2 instrument, and numerous other radiometers launched by Europe, China and India. Seasat’s legacy continues in the radiometer, scatterometer and SAR for NASA’s Soil Moisture Active Passive mission to measure global soil moisture, launching in 2014.
Seasat demonstrated the benefit of using radiometer measurements of water vapor to correct altimeter measurements of sea surface height by simultaneously flying a radiometer with a radar altimeter. The accuracy of altimeter readings is affected by water vapor, which delays the time it takes for the altimeter’s signals to make their round trip to the ocean surface and back. All subsequent NASA/European satellite altimetry missions have used this technique.
Sea ice, and its role in controlling Earth’s climate, was also part of Seasat’s oceanographic mission. The SAR instrument provided the first high-resolution images of sea ice, measuring its movement, deformation and age. The SAR also monitored the global surface wave field and revealed many ocean- and atmosphere-related phenomena, from current boundaries to eddies and internal waves. Currently, SAR and scatterometers are used to monitor Earth’s ice from space.
“It’s hard to imagine where we would be without the radiometer pioneered on Seasat, but certainly much further behind in critical Earth observations than we are now,” said Gary Lagerloef of Earth & Space Research, Seattle, principal investigator of NASA’s Aquarius mission to map ocean surface salinity. The Aquarius radiometer and scatterometer instruments also trace their heritage back to Seasat.
BEYOND THE OCEAN
Seasat’s SAR also provided spectacular images of Earth’s land surfaces and geology. Datasets from Seasat were used to pioneer radar interferometry, which uses microwave energy pulses sent from sensors on satellites or aircraft to the ground to detect land surface changes such as those created by earthquakes, and to measure land surface topography. In the 1980s and 1990s, three JPL Shuttle Imaging Radar experiments flew on the Space Shuttle. JPL’s Shuttle Radar Topography Mission used the technology in 2000 to create the world’s most detailed topographic measurements of more than 80 percent of Earth’s land surface. The technology is currently used by JPL’s Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR), an airborne imaging radar system employed for a wide variety of Earth studies. Many international SAR missions owe a debt to Seasat as well, including the Japanese Earth Resources Satellite 1 and Advanced Land Observing System 1, the Canadian/U.S. Radarsat 1 and the European Space Agency’s Remote Sensing satellites. NASA’s Surface Water and Ocean Topography mission, planned to launch in 2020, will also use the technology.
Seasat’s demonstration of spaceborne repeat-pass radar interferometry to measure minute Earth surface motions has led to a new field of space geodetic imaging, said Paul Rosen, JPL project scientist for a future NASA L-band SAR spacecraft currently under study, and it forms the basis for his mission.
“Together with international L-band SAR sensors, we have the opportunity in the next five years to create a 40-year observation record of land-use change where overlapping observations exist,” Rosen said. “These time-lapse images of change will provide fascinating insights into urban growth, agricultural patterns and other signs of human-induced changes over decades and climate change in the polar regions.”
JPL’s Magellan mission, which mapped 99 percent of the previously hidden surface of Venus, and the Titan radar onboard the JPL-built and -managed Cassini orbiter to Saturn both used Seasat technology.

Making Drinking Water From Sea Water Simply And Cheaply

[ Watch the Video: Desalting the Ocean ]

Michael Harper for redOrbit.com – Your Universe Online

Researchers at the University of Texas at Austin (UT) and the University of Marburg in Germany have created a simple and efficient way to desalinate ocean water. Once completed, the water can be used as drinking water or to irrigate crops. Current desalination methods take the salt from seawater using a membrane as a filter. The new method, called electrochemically mediated seawater desalination, uses a small electronic chip filled with seawater. This chip is so efficient at removing salt from that water that it only needs the power of a small, store-bought battery.

The need for fresh drinking water is a pressing one. A 2011 study from Yale University and the University of Notre Dame found that desalination will play an important role as water supplies become limited. The teams in Germany and Texas have described their process in the journal Angewandte Chemie (Applied Chemistry) and are now using the patented technology to create their own startup company called Okeanos Technologies.

“The availability of water for drinking and crop irrigation is one of the most basic requirements for maintaining and improving human health,” said Richard Crooks with the University of Texas.

“Seawater desalination is one way to address this need, but most current methods for desalinating water rely on expensive and easily contaminated membranes. The membrane-free method we’ve developed still needs to be refined and scaled up, but if we can succeed at that, then one day it might be possible to provide fresh water on a massive scale using a simple, even portable, system.”

The new plastic chip works by separating the salt from the water and directing it along a different path. Only a small amount of voltage (3.0 volts) is required to power this separation. As saltwater passes through the chip, the applied voltage neutralizes some of the chloride ions in the salt water. This creates what the team calls an “ion depletion zone,” which increases the local electric field at that spot. Salt then separates from the water when it approaches this depletion zone, sending salt along one channel and fresh water along another.

At its present stage, the chip is very small and the team has only been able to achieve a 25-percent desalination rate in their tests. However, they believe that they’ll soon be able to achieve 99 percent desalination, the necessary amount to create drinking water, when they ramp up the scale.

“This was a proof of principle,” said Kyle Knust, a graduate student in Crooks’ lab and co-author of the paper.

“We’ve made comparable performance improvements while developing other applications based on the formation of an ion depletion zone. That suggests that 99 percent desalination is not beyond our reach.”

Tony Frudakis, the founder of Okeanos Technologies, also believes the team will be able to achieve 99 percent desalination with their new invention, thereby delivering essential freshwater to areas that desperately need it. Frudakis also said this technology could scale very well and work in applications as small as a soda machine or something large enough to provide disaster relief.

German Robotic Ape Shows Off Sophisticated Engineering Design

Michael Harper for redOrbit.com – Your Universe Online
The German Research Center for Artificial Intelligence (DFKI) is following a growing line of robots influenced by nature and has developed a walking and balancing robotic gorilla. Rather than build a bot with a purpose in mind (like the robotic pack mule being developed in the States by DARPA), the DFKI’s “iStruct” ape seems to be built as a demonstration of sophisticated new engineering components.
The iStruct’s feet, for example, are built with sensors to keep it moving in sync and in a more natural way than other robotic animals. The new robot ape also boasts an actuated spine that is capable of movement, a departure from the solid steel backbones found in most other animal-like robots.
In an official statement, DFKI explained the goal behind the iStruct, saying the parts they’ve developed here can be used on other platforms to “improve the locomotion and mobility characteristics” of future bots.
“The intelligent structures to be developed contain a variety of functions which can not only extend the already existing locomotion behaviors of robots, but also permit further relevant applications like the contemporaneous use as carrier and sensor system. This way, different functionalities are united in one construction unit.”
In a video, the iStruct is seen casually loping around in slow strides, moving forward at first before engaging a point turn. According to Discovery News, these seemingly simple actions are anything but easy from an engineering perspective. DFKI has packed the iStruct with 43 force and torque sensors, some of which reside in the heel of the machine. The robotic ape walks with a heel-toe step that is more akin to a human’s gait. As the robot walks, sensors in the heel are constantly measuring the distance to the ground to keep it balanced and upright.
An array of accelerometers also works in concert to keep iStruct together and moving in one direction, while temperature sensors make sure the robot’s components don’t overheat and fail. The iStruct is also autonomous, capable of moving on its own free of wires or tethers; a self-contained battery weighing about 40 pounds is packed on board.
In a second video on the website, the iStruct is seen perched on all fours on a table. A researcher begins tilting the table like a seesaw to try to shake the iStruct from its position. The aforementioned sensors and accelerometers worked to let the bot move along with the table rather than fall off. This balancing capability, as well as its ability to walk smoothly and naturally, will allow the DFKI’s new primate robot to climb hills and tackle terrain other robots might shy away from.
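A generic balance loop of this kind can be sketched in Python as a proportional-derivative correction on measured tilt; the example below is purely illustrative, with invented gains and dynamics, and is not DFKI’s actual iStruct controller.

# Generic balance feedback sketch: a sensor reports how far the body has
# tilted, and a proportional-derivative controller commands a correction
# that opposes both the tilt and its rate of change. All values invented.
def pd_correction(tilt_deg, tilt_rate_deg_s, kp=20.0, kd=8.0):
    """Corrective angular acceleration opposing tilt and tilt rate."""
    return -(kp * tilt_deg + kd * tilt_rate_deg_s)

tilt, rate, dt = 5.0, 0.0, 0.05   # start tilted 5 degrees, at rest
for step in range(5):
    accel = pd_correction(tilt, rate)
    rate += accel * dt            # crude integration of the correction
    tilt += rate * dt
    print(f"step {step}: tilt = {tilt:.2f} deg")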
DFKI says they received their funding from the Space Agency of the German Aerospace Center and the Federal Ministry of Economics and Technology (BMWi). Although they didn’t build iStruct with a specific task in mind, it seems likely that parts of this technology could one day wind up moving robots along in outer space.
And this isn’t the first robotic primate we’ve seen, of course. A team of researchers at Carnegie Mellon University’s National Robotics Engineering Center (NREC) developed the CHIMP (short for CMU Highly Intelligent Mobile Platform) earlier this year to compete in DARPA’s Robotics Challenge.
Unlike iStruct, the CHIMP gets around via tank-like treads on its feet and forearms. It’s also built with claws for hands and was built to aid in disaster relief.

Vitamin D Supplements Decrease Depression And Blood Pressure In Diabetic Women

Rebekah Eliason for redOrbit.com – Your Universe Online

According to a new study, vitamin D supplements can reduce blood pressure and improve moods in women with type 2 diabetes who are suffering from depression. It also helped some women to lose small amounts of weight.

Lead researcher Sue M. Penckofer, PhD, RN, said “Vitamin D supplementation potentially is an easy and cost-effective therapy with minimal side effects. Larger, randomized controlled trials are needed to determine the impact of vitamin D supplementation on depression and major cardiovascular risk factors among women with Type 2 diabetes.”

A new study is planned to investigate 180 women with type 2 diabetes who have insufficient vitamin D levels and exhibit symptoms of depression. Participants will either be assigned a dose of vitamin D or receive a placebo for six months.

Currently, about one in ten Americans is diagnosed with diabetes, and that number is projected to increase to one in four by 2050. Women typically have worse outcomes from diabetes than men, which may be because depression accompanies diabetes in some 25 percent of the women who develop it. Depression undermines a patient’s ability to manage diabetes because it impairs motivation and makes it harder to follow a diet, exercise and take medications.

Many Americans struggle with low levels of vitamin D, but people with diabetes have an increased risk for vitamin D insufficiency or deficiency. Contributing factors may include limited intake of foods high in vitamin D, obesity, lack of sun exposure and genetic variations.

The pilot study included 46 female participants with an average age of 55 who had had diabetes for an average of eight years and also had a vitamin D deficiency. The recommended dietary intake of vitamin D is 600 International Units (IUs) per day, but participants took a weekly dose of 50,000 IUs.
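
For context, the weekly dose works out as follows (simple arithmetic based on the figures above):

\[
\frac{50{,}000\ \text{IU/week}}{7\ \text{days/week}} \approx 7{,}143\ \text{IU/day} \approx 12 \times 600\ \text{IU/day}
\]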

After six months it was found that participants’ blood vitamin D levels had returned to acceptable levels along with a significant improvement in mood. Blood pressure was also improved and their average weight dropped from 226.1 pounds to 223.6 pounds.

This was a pilot study performed at Loyola University Chicago Niehoff School of Nursing and was presented at the American Diabetes Association 73rd Scientific Sessions in Chicago.

Northern Biomass Of Earth Has Been Mapped And Measured

ESA
The biomass of the northern hemisphere’s forests has been mapped with greater precision than ever before thanks to satellites, improving our understanding of the carbon cycle and our prediction of Earth’s future climate.
Accurately measuring forest biomass and how it varies are key elements for taking stock of forests and vegetation. Since forests assist in removing carbon dioxide from the atmosphere, mapping forest biomass is also important for understanding the global carbon cycle.
In particular, northern forests – including forest soil – store about a third more carbon per hectare than tropical forests, making them one of the most significant carbon stores in the world.
The boreal forest ecosystem – exclusive to the northern hemisphere – spans Russia, northern Europe, Canada and Alaska, with interrelated habitats of forests, lakes, wetlands, rivers and tundra.
With processing software drawing in stacks of radar images from ESA’s Envisat satellite, scientists have created a map of the whole northern hemisphere’s forest biomass in higher resolution than ever before – each pixel represents 1 km on the ground.
“Single Envisat radar images taken at a wavelength of approximately 5 cm cannot provide the sensitivity needed to map the composition of forests with high density,” said Maurizio Santoro from Gamma Remote Sensing.
“Combining a large number of radar datasets, however, yields a greater sensitivity and gives more accurate information on what’s below the forest canopy.”
About 70,000 Envisat radar images from October 2009 to February 2011 were fed into this new ‘hyper-temporal’ approach to create the pan-boreal map for 2010.
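The gain from stacking can be illustrated with a minimal Python sketch: averaging many noisy backscatter observations of the same pixel drives down the error relative to a single image. The numbers are synthetic, and the real Biomasar processing is far more involved.

import numpy as np

# Averaging many noisy radar observations of the same pixel reduces speckle
# and noise, improving sensitivity to the underlying forest signal.
rng = np.random.default_rng(42)
true_backscatter = -8.5                       # dB, hypothetical forest pixel
n_images = 70                                 # number of repeat observations

observations = true_backscatter + rng.normal(0.0, 2.0, size=n_images)

single_image_error = abs(observations[0] - true_backscatter)
stacked_error = abs(observations.mean() - true_backscatter)

print(f"error using one image:        {single_image_error:.2f} dB")
print(f"error using {n_images}-image stack: {stacked_error:.2f} dB")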
This is the first radar-derived output on biomass for the whole northern zone using a single approach – and it is just one of the products from the Biomasar-II project.
Sponsored by ESA, the project also exploited Envisat archives to generate regional maps for 2005.
The future Sentinel-1 mission will ensure the continuity of this kind of radar data, while the dedicated Biomass satellite was recently selected to become ESA’s seventh Earth Explorer mission. That mission is set to provide an easier and more accurate way to monitor this precious resource regularly.
The Biomass satellite will complement the Biomasar results, especially for tropical regions.
“Even our new, hyper-temporal approach is not able to penetrate dense multistorey canopies of rainforests with Sentinel-1 or Envisat’s radars. Here, longer wavelengths are indeed needed,” says Prof. Christiane Schmullius from the University of Jena, Biomasar coordinator.
The Biomass satellite will deliver, for the first time from space, radar measurements at a wavelength of around 70 cm to delve below the treetops. It will also monitor forest disturbance and regrowth.


Conversations About Healthy Eating Could Be Counterproductive

Lee Rannals for redOrbit.com – Your Universe Online

A new study published in JAMA Pediatrics found that having conversations with your teenager about weight increases the chances of them developing unhealthy eating behaviors.

Researchers from the University of Minnesota, Minneapolis, discovered that overweight or obese adolescents whose mothers engaged in conversations about healthful eating behaviors were less likely to diet and use unhealthy weight-control behaviors.

“Because adolescence is a time when more youths engage in disordered eating behaviors, it is important for parents to understand what types of conversations may be helpful or harmful in regard to disordered eating behaviors and how to have these conversations with their adolescents,” Jerica M. Berge, Ph.D., M.P.H., L.M.F.T., of the University of Minnesota Medical School, and colleagues write in the study background.

The researchers used data from two population-based studies, including surveys completed by adolescents and parents. The final sample consisted of 2,348 adolescents and 3,528 parents.

Overweight adolescents whose mothers engaged in healthful eating conversations were 13 percent less likely to be dieting than those whose mothers did not. Obese children who had conversations with their parents were also 13 percent more likely to engage in unhealthy weight-control behaviors, according to the study.

The researchers saw that weight conversations from one parent or from both were associated with a significantly higher prevalence of dieting relative to parents who engaged in only healthful eating conversations. They also found that adolescents whose fathers engaged in weight conversations were significantly more likely to engage in dieting and unhealthy weight-control behaviors than adolescents whose fathers did not.

“Finally, for parents who may wonder whether talking with their adolescent child about eating habits and weight is useful or detrimental, results from this study indicate that they may want to focus on discussing and promoting healthful eating behaviors rather than discussing weight and size, regardless of whether their child is nonoverweight or overweight,” the authors conclude.

Childhood obesity is becoming a growing problem in many nations. A study released two weeks ago found that the number of children admitted to a hospital for problems related to obesity in England quadrupled between 2000 and 2009. This survey found that nearly three-quarters of these admissions were to deal with problems complicated by obesity, such as asthma, breathing difficulties during sleep, and complications of pregnancy.

National surveys in England found that around 30 percent of children aged two to 15 are overweight and 14 to 20 percent are obese.

Obesity has also been linked to a higher chance of hearing loss. A study published in The Laryngoscope found that obese adolescents have increased hearing loss across all frequencies and were nearly twice as likely to have unilateral low-frequency hearing loss.

New Scanning Laser Can Tell What An Object Is Made Of

Brett Smith for redOrbit.com – Your Universe Online

Using commercially-available telecommunications technology, engineers from the University of Michigan have developed a laser capable of detecting a substance’s composition from high up in the Earth’s atmosphere, according to a new report in the journal Optics Letters.

Project researchers said the technology could have a wide range of applications, from military warning systems to airport security.

“For the defense and intelligence communities, this could add a new set of eyes,” said report co-author Mohammed Islam, an electrical engineering professor at the University of Michigan.

Most lasers emit one wavelength, or color, of light. The new system, however, emits a broadband beam of infrared energy that has columns of light across a spectrum of wavelengths. The beam is in the infrared region, making it invisible to human eyes.

The engineers said infrared light contains the “spectral fingerprinting range,” or frequencies that represent the vibrations of the molecules within a solid substance. The so-called spectral fingerprint is determined by which wavelengths of light are absorbed or reflected, since various substances interact with infrared light differently.

“A grey structure looks grey in visible light, but in the infrared, you can see not only the shape, but also what’s inside it,” Islam said.
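
A minimal Python sketch of the fingerprint-matching idea follows, comparing a measured absorption spectrum against a small reference library by correlation; the wavelength band, spectra and substance names are all invented for illustration.

import numpy as np

# Compare a measured infrared spectrum against a reference library and
# report the closest match. Spectra are synthetic Gaussian absorption peaks;
# real libraries and instruments are far richer.
wavelengths = np.linspace(1.0, 5.0, 50)          # microns, illustrative band

def gaussian_peak(center, width=0.2):
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

reference_library = {
    "substance_A": gaussian_peak(2.0) + 0.5 * gaussian_peak(3.4),
    "substance_B": gaussian_peak(2.7),
}

measured = (gaussian_peak(2.0) + 0.5 * gaussian_peak(3.4)
            + np.random.default_rng(1).normal(0.0, 0.05, wavelengths.size))

def best_match(spectrum, library):
    """Return the library entry with the highest correlation to the spectrum."""
    scores = {name: np.corrcoef(spectrum, ref)[0, 1] for name, ref in library.items()}
    return max(scores, key=scores.get), scores

name, scores = best_match(measured, reference_library)
print("best match:", name, scores)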

The US military already uses spectral fingerprinting to identify potential targets, but the technology currently in use needs sunlight, complicating matters on a cloudy day or at night. Similar chemical sensors are also in use, but these tend to only work at close range.

To develop the new system, the Michigan researchers had to develop a relatively powerful broadband laser, Islam explained. The team began with a 5-watt prototype, moved up to a 25.7-watt version, and is currently working with a 50-watt prototype, which is slated to be field tested later this year.

Last year, the team traveled to Wright Patterson Air Force Base in Ohio to field test their smallest prototype. Observers from the Air Force Research Labs, the University of Michigan, and several other groups were there as the engineers placed the experimental laser in a 12-story tower and directed it to targets about one mile away on a runway. The team used scientific cameras and other instruments to record the laser beam quality and signal level.

The engineers say the technology could give an airborne military aircraft the capacity to illuminate an area with the same magnitude as natural sunlight and then scan the target region.

Islam says that the new technology could also improve TSA airport screening technologies that many feel are intrusive or degrading.

“Those are imaging devices looking for bumps where there shouldn’t be bumps,” Islam said. “They’re looking for shapes that are odd or different. But they can’t see the chemicals in the shapes. That’s why you have to take your shoes off. But our laser can detect the chemical composition.”

The engineers said they were able to build the device by applying a patented technique to commercially available telecom fiber optic technology. They added that their laser takes advantage of the fiber’s natural physics to create the light burst.

Managing Future Of Colorado River Flows

University of Washington

The Colorado River provides water for more than 30 million people, including those in the fast-growing cities of Las Vegas, Phoenix and Los Angeles. Increasing demand for that water combined with reduced flow and the looming threat of climate change have prompted concern about how to manage the basin’s water in coming decades.

In the past five years, scientific studies estimated declines of future flows ranging from 6 percent to 45 percent by 2050. A paper by University of Washington researchers and co-authors at eight institutions across the West aims to explain this wide range, and provide policymakers and the public with a framework for comparison. The study is published this week in the Bulletin of the American Meteorological Society.

“The different estimates have led to a lot of frustration,” said lead author Julie Vano, who recently earned a UW doctorate in civil and environmental engineering. “This paper puts all the studies in a single framework and identifies how they are connected.”

Besides analyzing the uncertainty, the authors establish what is known about the river’s future. Warmer temperatures will lead to more evaporation and thus less flow. Changes to precipitation are less certain, since the headwaters are at the northern edge of a band of projected drying, but climate change will likely decrease the rain and snow that drains into the Colorado basin.

It also turns out that the early 20th century, which is the basis for water allocation in the basin, was a period of unusually high flow. The tree ring record suggests that the Colorado has experienced severe droughts in the past and will do so again, even without any human-caused climate change.

“The Colorado River is kind of ground zero for drying in the southwestern U.S.,” said co-author Dennis Lettenmaier, a UW professor of civil and environmental engineering. “We hope this paper sheds some light on how to interpret results from the new generation of climate models, and why there’s an expectation that there will be a range of values, even when analyzing output from the same models.”

The authors include leaders in Western water issues, ranging from specialists in atmospheric sciences to hydrology to paleoclimate. Other co-authors are Bradley Udall at the University of Colorado in Boulder; Daniel Cayan, Tapash Das and Hugo Hidalgo at the University of California, San Diego; Jonathan Overpeck, Holly Hartmann and Kiyomi Morino at the University of Arizona in Tucson; Levi Brekke at the federal Bureau of Reclamation; Gregory McCabe at the U.S. Geological Survey in Denver; Robert Webb and Martin Hoerling at the National Oceanographic and Atmospheric Administration in Boulder; and Kevin Werner at the National Weather Service in Salt Lake City.

The authors compared the array of flow projections for the Colorado River and came up with four main reasons for the differences. In decreasing order of importance, predictions of future flows vary because of:

  • Which climate models and future emissions scenarios were used to generate the estimates.
  • The models’ spatial resolution, which is important for capturing topography and its effect on the distribution of snow in the Colorado River’s mountainous headwaters.
  • Representation of land surface hydrology, which determines how precipitation and temperature changes will affect the land’s ability to absorb, evaporate or transport water.
  • Methods used to downscale from the roughly 200-kilometer resolution used by global climate models to the 10- to 20-kilometer resolution used by regional hydrology models (a simple interpolation sketch follows this list).
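
As a rough illustration of that last downscaling step, the minimal Python sketch below interpolates a coarse model field onto a finer grid; the grids and values are invented, and real downscaling methods (statistical or dynamical) are considerably more sophisticated than simple interpolation.

import numpy as np

# Interpolate a coarse (~200 km spacing) model field onto a finer (~10 km)
# grid of the kind used by regional hydrology models. All values synthetic.
coarse_km = np.arange(0, 1001, 200)                       # coarse grid positions
coarse_temp = np.array([10.0, 9.0, 7.5, 6.0, 5.5, 5.0])   # e.g. temperature, deg C

fine_km = np.arange(0, 1001, 10)                          # target fine grid
fine_temp = np.interp(fine_km, coarse_km, coarse_temp)

print(f"{len(coarse_km)} coarse cells -> {len(fine_km)} fine cells")
print("fine-grid sample:", fine_temp[:5])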

While the paper does not determine a new estimate for future flows, it provides context for evaluating the current numbers. The 6 percent reduction estimate, for example, did not include some of the fourth-generation climate model runs that tend to predict a drier West. And the 45 percent decrease estimate relied on models with a coarse spatial resolution that could not capture the effects of topography in the headwater regions. The analysis thus supports more moderate estimates of changes in future flows.

“Drought and climate change are a one-two punch for our water supply,” said Overpeck, a professor of geosciences and of atmospheric sciences at the University of Arizona.

The new paper is intended to be used by scientists, policymakers and stakeholders to judge future estimates.

“I hope people will be able to look at this paper and say, ‘OK, here’s the context in which this new study is claiming these new results,’” Vano said.

The research was funded by NOAA through its Regional Integrated Sciences and Assessments program and its National Integrated Drought Information System.


Labeling Food Sizes Affects Perceptions Of Portions

Brett Smith for redOrbit.com – Your Universe Online

The idea of portion control was at the center of New York City Mayor Michael Bloomberg’s so-called soda ban, and a new study from a pair of Cornell University researchers has found that the way a portion is described can affect eating habits in much the same way that smaller portions can reduce food intake.

According to the study published in the journal Health Economics, labeling something as “Regular” or “Double Size” impacts how much people eat and how much customers are willing to pay, regardless of the actual portion size.

The Cornell researchers began by serving two different portion sizes of lunch items in either one cup (small) or two cups (large). For some participants, the small and large portions were labeled “Half-Size” and “Regular.” For the others, the same portions were labeled “Regular” and “Double-Size.” The labeling for the first group indicated that the two-cup serving was the normal size, while it was suggested to the second group the one-cup portion was the regular size.

The scientists found the varying “Regular” portions significantly impacted how much people ate. When served identical large portions, participants ate more when it was labeled “Regular” than when it was labeled “Double-Size.” The perceived difference was so great that “Double-size” portion-eaters left 10 times as much food on their plates, the researchers said.

To see how much patrons will pay for differently labeled portion sizes, the researchers set up a kind of food auction. When an auctioned item was labeled “Half-Size,” study volunteers only wanted to spend half as much as when the same portion was labeled “Regular.” Only the labels themselves, not the visual appearance of each serving, were used to indicate the amount of food on each plate compared to a supposed normal serving.

Taken together, the two experiments show people are not only willing to pay more for a bigger sounding portion, they will also eat more of a larger portion if it is arbitrarily considered “Regular.”

For those concerned about the health impacts of larger portion sizes, both consumers and producers could benefit from standardization of food size-labeling, the Cornell researchers said in a statement. By clearly defining what a “small” or a “large” is, customers would be able to know just how much food they are ordering no matter where they are.

The Centers for Disease Control and Prevention (CDC) has posted several portion control steps on its official website that people can take when ordering food at a restaurant, including splitting an entrée with a friend or asking for a to-go box when the dinner arrives. The CDC also suggests “spoiling your dinner” with a healthy snack before going out to eat.

The Mayo Clinic in Minnesota also has serving size recommendations on its website. The clinic suggests a serving of pasta should be similar in size to a hockey puck, and protein portions should be about the size of a deck of cards.

Some companies are even seeing a business opportunity in portion control. A recent blog post on Al.com mentioned $3 “portion control plates” available at CVS.

The Nerd Stereotype Deters Women From Computer Science Careers

redOrbit Staff & Wire Reports – Your Universe Online

Stereotypical images of “computer nerds” commonly put forward by television and movies can have a chilling effect on women entering the field of computer science, according to new research from the University of Washington.

However, when these images are downplayed in the print media, women express more interest in further education in this field, the researchers said.

Despite years of effort, it has long been difficult to recruit women into many fields that are perceived to be masculine and male-dominated, such as computer science. The prevalent image of the lone computer scientist focused only on technology stands in stark contrast to a more people-oriented or traditionally feminine image, the researchers said.

Sapna Cheryan, lead researcher of the current study, said understanding what prevents women from entering computer science is key to achieving gender parity in science, technology, engineering and mathematics. Cheryan and team investigated this shortage of women in computer science and other scientific fields, seeking to prove it is not only due to a lack of interest in the subject matter on the part of women.

The investigators completed two studies. In the first, 293 college students from two West Coast universities were asked to provide descriptions of computer science majors. The researchers wanted to discover what the stereotypical computer scientist looked like in students’ minds.

Both women and men spontaneously offered an image of computer scientists as technology-oriented, intensely focused on computers, intelligent and socially unskilled. These characteristics contrast with the female gender role and are inconsistent with how many women see themselves, the researchers said.

The researchers conducted a second study to examine whether the way a social group is portrayed in the media also influences how people think about that group and their relation to it. In that study, the researchers used fabricated newspaper articles to manipulate the students’ images of a computer scientist to investigate the influence this had on women’s interest in entering the field.

A total of 54 students read articles about computer science majors that described these students as either matching or not matching the common stereotypes. The students were then asked to rate their interest in computer science.

The results showed exposure to a newspaper article claiming computer science majors no longer fit current preconceived notions increased women’s interest in majoring in computer science, compared with exposure to a newspaper article that portrayed computer scientists as reflecting current stereotypes.

The male participants were unaffected by how computer science majors were represented in the articles.

“Broadening the image of the people in the field using media representations may help to recruit more women into male-dominated fields such as computer science. Moreover, the media may be a powerful transmitter of stereotypes, and prevent many women from entering these fields,” the researchers concluded.

The research is published online in the current issue of the journal Sex Roles.

Your Eyes May Give You Away And Reveal What Gives You Pleasure

April Flowers for redOrbit.com – Your Universe Online

A new study led by Drexel University has found a common, low-cost ophthalmological tool can measure the brain’s pleasure response to tasting food through the eyes. If the results are validated, this method could have applications for research and clinical work in food addiction and obesity prevention.

The study, which tested the use of electroretinography (ERG) to detect increases in the neurotransmitter dopamine in the retina, was led by Dr. Jennifer Nasser, an associate professor in the department of Nutrition Sciences in Drexel University’s College of Nursing and Health Professions. The results of this study were published in the journal Obesity.

ERG measures the electrical responses of the different cell types present in the retina, including rods, cones, inner retinal cells, and the ganglion cells. Researchers placed electrodes on the cornea and the skin near the eye, and then exposed the patients’ eyes to stimuli.

A variety of pleasure-related effects in the brain are associated with dopamine, including the expectation of reward. Dopamine is released in the retina of the eye when the optical nerve activates in response to light exposure.

High spikes in the retina’s electrical signals were observed in response to a flash of light when a food stimulus (a small piece of chocolate brownie) was placed in participants’ mouths. The spike was as large as that seen when the participants were given the stimulant drug methylphenidate to induce a strong dopamine response. Both responses, to drug and food stimuli, were greater than the response to light observed when the participants ingested a control substance, water.

“What makes this so exciting is that the eye’s dopamine system was considered separate from the rest of the brain’s dopamine system,” Nasser said. “So most people – and indeed many retinography experts told me this – would say that tasting a food that stimulates the brain’s dopamine system wouldn’t have an effect on the eye’s dopamine system.”

Nasser and her colleagues conducted the study on a very small scale with only nine participants. The majority of the participants were overweight, but none had eating disorders. All nine underwent a four-hour fast before food stimulus testing. Larger studies are needed to validate the technique; such validation would allow researchers to use ERG in studies of food addiction and food science.

“My research takes a pharmacology approach to the brain’s response to food,” Nasser said. “Food is both a nutrient delivery system and a pleasure delivery system, and a ‘side effect’ is excess calories. I want to maximize the pleasure and nutritional value of food but minimize the side effects. We need more user-friendly tools to do that.”

Nasser says the low-cost and ease of performing ERG make it an appealing method for research. For example, Medicare reimbursement for clinical use of ERG is about $150 per session. An ERG session generates approximately 200 scans in just two minutes. A PET scan, on the other hand, costs about $2,000 per session and takes more than an hour to generate a single scan.
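
For a rough sense of what those figures imply per scan, the arithmetic below uses only the numbers quoted in the paragraph above; treating each ERG session as yielding roughly 200 usable scans is our own simplification.

# Back-of-the-envelope cost-per-scan comparison using the figures quoted above.
erg_cost_per_session = 150.0    # approximate Medicare reimbursement, USD
erg_scans_per_session = 200     # scans generated in about two minutes
pet_cost_per_session = 2000.0   # approximate cost, USD
pet_scans_per_session = 1       # more than an hour for a single scan

print(f"ERG: ~${erg_cost_per_session / erg_scans_per_session:.2f} per scan")   # ~$0.75
print(f"PET: ~${pet_cost_per_session / pet_scans_per_session:.2f} per scan")   # ~$2000.00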

New Robot Is Future Farmer’s Helper

Lee Rannals for redOrbit.com – Your Universe Online

A robot inspired by a hamster may be able to help out farmers by monitoring soil conditions on their land.

The spherical robot works similarly to the way a hamster makes a wheel move: it relocates its center of gravity to roll itself over. This locomotion allows ROSPHERE to travel on non-compacted surfaces, such as dirt and sand.

ROSPHERE features an embedded computer that processes all sensor information in order to define control actions in both degrees of freedom (DOFs) to reach predefined trajectories. It is able to move backward and forward, as well as make turns.
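
The article does not describe ROSPHERE’s control software, so the sketch below is only a hypothetical illustration of the kind of two-degree-of-freedom loop the paragraph implies: one command shifts the internal mass forward or backward to roll, the other shifts it sideways to steer toward a waypoint. The gains, function name and waypoint logic are illustrative assumptions, not the actual ROSPHERE code.

import math

def control_step(x, y, heading, waypoint, k_speed=0.5, k_turn=1.5):
    # One illustrative control update for a pendulum-driven sphere.
    # Returns two hypothetical actuator commands:
    #   pitch_cmd - shift of the internal mass forward/backward (rolling DOF)
    #   steer_cmd - shift of the internal mass sideways (turning DOF)
    dx, dy = waypoint[0] - x, waypoint[1] - y
    distance = math.hypot(dx, dy)

    # Heading error wrapped to [-pi, pi]
    desired = math.atan2(dy, dx)
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi

    # Simple proportional commands, clipped to a normalized +/-1 range
    pitch_cmd = max(-1.0, min(1.0, k_speed * distance))
    steer_cmd = max(-1.0, min(1.0, k_turn * error))
    return pitch_cmd, steer_cmd

# Example: robot at the origin facing east, waypoint 3 m to the north-east
print(control_step(0.0, 0.0, 0.0, (3.0, 3.0)))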

The team from the Universidad Politécnica de Madrid developed the robot to find a method of locomotion that would not be thwarted by uneven or difficult terrain. Wheeled robots can struggle on shifting ground or in places strewn with large and small objects. However, ROSPHERE’s unique locomotion allows it to move past these hurdles.

Researchers hope ROSPHERE will eventually be able to travel around fields to monitor conditions and tell farmers the best time to water or tend to their crops. The robot will do this using its wireless communication system and sensors, which report on moisture levels and temperature. The onboard computer processes information from sources such as a 9-DOF Inertial Measurement Unit (IMU) and GPS.

ROSPHERE works much like Sphero, the small robotic ball created by Orbotix. Sphero is controlled through an iPhone and moves in a similar way to ROSPHERE. However, Sphero is designed as more of a toy, while ROSPHERE was created as a tool.

The scientists first used the robot to measure environmental variables in situ along rows of crops, where its shape is well suited to rolling and gathering information. They are working on a second prototype to improve the robot’s mechanical design and to add external units carrying additional sensors.

Another technology being developed to help robots move through sand is known as “terradynamics.”

“We now have the tools to understand the movement of legged vehicles over loose sand in the same way that scientists and engineers have had tools to understand aerodynamics and hydrodynamics,” said Daniel Goldman, a professor in the School of Physics at the Georgia Institute of Technology. “We are at the beginning of tools that will allow us to do the design and simulation of legged robots to not only predict their performance, but also to optimize designs and allow us to create new concepts.”

Aerosols May Have Helped Suppress Hurricanes During 20th Century

Lee Rannals for redOrbit.com – Your Universe Online

Researchers wrote in the journal Nature Geoscience that aerosols could have played a role in helping to suppress the number of Atlantic hurricanes over the 20th Century.

The scientists found that aerosols make clouds brighter, causing them to reflect more of the sun’s energy back into space. This affects ocean temperatures and tropical circulation patterns, making conditions less favorable for hurricanes.

“Industrial emissions from America and Europe over the 20th Century have cooled the North Atlantic relative to other regions of the ocean. Our research suggests that this alters tropical atmosphere circulation – making it less likely that hurricanes will form,” said Dr. Nick Dunstone, a Met Office climate prediction scientist and lead author of the research.

He said that since the introduction of clean air acts in the 1980s, concentrations of aerosols over the North Atlantic have fallen, which has helped increase hurricane activity.

“On the other hand, the reduction in aerosols has been beneficial for human health and has been linked to the recovery of Sahel rains since the devastating drought in the 1980s,” Dunstone said.

Dr. Doug Smith, a Met Office research fellow and co-author of the study, said there was a relatively quiet hurricane period between 1900 and 1920, and then again from 1970 to 1980. Active periods have 40 percent more hurricane activity than these quiet periods.

The researchers were able to use changes in man-made aerosol emissions in the Met Office Hadley Center model to reproduce the decade-to-decade Atlantic hurricane activity.

“This study, together with work we published last year, suggests that there may be a greater role than previously thought for man-made influence on regional climate changes that have profound impacts on society,” said Dr. Ben Booth, a Met Office climate processes scientist and another co-author of the study.

The scientists say this study will help future international research because modeling the impact of aerosols is one of the largest uncertainties in climate science. This study suggests the number of Atlantic hurricanes over the next couple of decades will depend on future aerosol emissions and how they interact with natural cycles in the North Atlantic.

The National Oceanic and Atmospheric Administration (NOAA) is predicting the 2013 hurricane season will be very active. NOAA said seven to 11 hurricanes could form in the Atlantic, three to six of which could turn into major storms with winds of 111 mph or higher.

Experts say higher than average water temperatures in the Atlantic and Caribbean Sea are one of the major factors for their 2013 prediction.

Third-Party Cookie Blocking Moves Forward For Mozilla’s Firefox Browser

Enid Burns for redOrbit.com – Your Universe Online

After a period of review, Mozilla decided to go ahead with plans to block third-party cookies in an upcoming version of its Firefox web browser. The non-profit organization will work with Aleecia McDonald of the Center for Internet and Society and the Cookie Clearinghouse (CCH) to manage lists of blocked and allowed cookies.

Mozilla plans to pair its third-party cookie blocking with an exception mechanism designed to eliminate false positives and false negatives. Mozilla co-founder and chief technical officer Brendan Eich explained the logic of the company’s decision on his blog:

— We want a third-party cookie policy that better protects privacy and encourages transparency.

— Naïve visited-based blocking results in significant false negatives and false positive errors.

— We need an exception management mechanism to refine the visited-based blocking verdicts.

— This exception mechanism cannot rely solely on the user in the loop, managing exceptions by hand. (When Safari users run into a false positive, they are advised to disable the block, and apparently many do so, permanently.)

— The only credible alternative is a centralized block-list (to cure false negatives) and allow-list (for false positives) service.

“I’m very pleased that Aleecia McDonald of the Center for Internet and Society at Stanford has launched just such a list-based exception mechanism, the Cookie Clearinghouse (CCH),” Eich said in his blog post. “Today Mozilla is committing to work with Aleecia and the CCH Advisory Board, whose members include Opera Software, to develop the CCH so that browsers can use its lists to manage exceptions to a visited-based third-party block.”

Mozilla is seeking feedback on the proposed methods, which the CCH will deploy in a Firefox Aurora patch that is still in development.

Firefox Aurora will make use of the CCH list to allow or deny third-party cookies. Apple’s Safari browser already blocks third-party cookies.
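
As a rough illustration of the policy Eich outlines, visited-based blocking refined by centralized allow and block lists, the decision for a single third-party cookie might be sketched as follows. This is only a reading of the blog post, not Mozilla’s actual Firefox code, and every name in it is hypothetical.

def third_party_cookie_allowed(request_host, top_level_host,
                               visited_first_party, allow_list, block_list):
    # Illustrative sketch of visited-based blocking refined by CCH-style lists.
    if request_host == top_level_host:
        return True                      # first-party cookies are unaffected
    if request_host in block_list:
        return False                     # block-list cures false negatives
    if request_host in allow_list:
        return True                      # allow-list cures false positives
    # Naive visited-based heuristic: allow only if the user has visited
    # the third party directly as a first-party site.
    return request_host in visited_first_party

# Example usage with made-up hosts
print(third_party_cookie_allowed(
    "ads.example.net", "news.example.com",
    visited_first_party={"news.example.com", "shop.example.org"},
    allow_list={"cdn.example.org"},
    block_list={"tracker.example.net"}))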

The advertising industry, including the industry group the Interactive Advertising Bureau, has hotly disputed Mozilla’s decision. Many sites use third-party cookies to identify site visitors in an anonymous manner. The cookies are set by ad networks and analytics platforms, and track certain behavior on a particular website, or across a network.

“Firefox’s developers made the decision despite intense resistance from advertising groups, which have argued that tracking is essential to delivering well-targeted, lucrative ads that pay for many popular internet services,” reports The Washington Post.

“Mozilla’s decision to block third-party cookies is a line in the sand for the advertising industry. We’re trying to change the dynamic so that trackers behave better,” The Washington Post cites Eich as saying.

Firefox is the second most popular browser. According to w3schools.com, Firefox accounts for 27.7 percent of browser usage, behind Google’s Chrome browser at 52.9 percent.

The tactics to be employed by Mozilla are a response to the failure of the Do Not Track standards.

One criticism of the Do Not Track approach is that it still relies on leaving a cookie in the user’s browser so websites know not to track that user. When the cookie is deleted, the site has to place a new one in the browser.

Google Rethinks Its Hiring Process – Less Emphasis On Test Scores, More On Personal Experience

Michael Harper for redOrbit.com – Your Universe Online

In the early days, the folks at Google made themselves an example of a new age: a few people with an idea and a few lines of code could build a billion-dollar company. They took this even further with their unorthodox thoughts on how a business should be run. Employees were given time during the week, while on the job, to work on their own pet projects. This, of course, later meant Google would have to begin shuttering many services to streamline its business. Before anyone could become an employee of Google and call the Google campus their workplace, they had to go through an intense hiring process, complete with brain teasers, off-the-wall questions and meetings with four or five interviewers.

These days Google has changed their approach, and in a recent interview with the New York Times, senior vice president of people operations Laszlo Bock explains that they studied their own approach and found it lacking.

Typical of anything Google, the company compiled vast amounts of data about its hiring process, then compared it against the actual job performance of the people hired through that process. According to Bock and the data his company has collected, there is essentially zero correlation between how a candidate scores during the interview process and how talented and productive that person turns out to be on the job. Bock specifically mentions one Google employee who conducted interviews for his team and was the only interviewer whose candidate ratings tracked how those people actually performed in their jobs years later. Bock also says this employee was able to earn such a track record because he hired only for a very specialized area of the company and happened to be the world’s leading expert in his field.

In addition to learning that the hiring process, as it stood, was essentially random as a predictor of performance, Bock says Google has done away with the brain teasers it famously asked during interviews.

“On the hiring side, we found that brainteasers are a complete waste of time. How many golf balls can you fit into an airplane? How many gas stations in Manhattan? A complete waste of time. They don’t predict anything. They serve primarily to make the interviewer feel smart,” said Bock.

Instead, Google’s hiring data suggest it is more effective to ask candidates about their own experiences on the job and how they handled themselves in real-life situations, rather than making assumptions based on hypotheticals.

“The interesting thing about the behavioral interview is that when you ask somebody to speak to their own experience, and you drill into that, you get two kinds of information. One is you get to see how they actually interacted in a real-world situation, and the valuable ‘meta’ information you get about the candidate is a sense of what they consider to be difficult,” said Bock.

In its earlier days, Google was also notorious for screening candidates based on their GPA and SAT scores. Even potential employees who had been out of school for 20 or 30 years were asked for this information, and were screened out if the numbers didn’t match what the company was looking for. Google no longer uses these scores as a hiring metric and has even begun hiring employees without any college experience.

“One of the things we’ve seen from all our data crunching is that GPAs are worthless as a criteria for hiring, and test scores are worthless – no correlation at all except for brand-new college grads, where there’s a slight correlation,” said Bock.

“After two or three years, your ability to perform at Google is completely unrelated to how you performed when you were in school, because the skills you required in college are very different. You’re also fundamentally a different person. You learn and grow, you think about things differently.”

Cities Are Like Stars, Says Researcher, And We Need To Better Understand Them

Brett Smith for redOrbit.com – Your Universe Online

According to an unusual new paper in the journal Science, cities can act very much like our sun. However, instead of smashing hydrogen atoms together to make energy, cities smash people together – creating social connections that can result in different kinds of energy: innovation and productivity.

“It’s an entirely new kind of complex system that we humans have created,” said study author Luis Bettencourt, a professor at the Santa Fe Institute. “A city is first and foremost a social reactor. It works like a star, attracting people and accelerating social interaction and social outputs in a way that is analogous to how stars compress matter and burn brighter and faster the bigger they are.”

Despite the convenient analogy, Bettencourt also asserted that the social “math” that occurs within cities is far different from the calculations explaining what goes on inside a star. The Santa Fe researcher said the social network component of cities is what makes them social reactors rather than simply physical ones.

Bettencourt’s paper and theory are based on four main principles: the mixing of the population, incremental network growth, the limits of human effort, and socioeconomic outputs as they relate to social interactions.

According to the study, a city develops as a network of social interaction that optimizes the conditions of its residents. As cities grow, they add to their infrastructure – allowing for further growth. Because of the limits of human effort, cities never experience rapid, uncontrollable growth, Bettencourt concluded.

Based on this framework, certain city dynamics are predictable – from land use to rent levels.
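
Bettencourt’s framework is usually expressed through scaling laws of the form Y = Y0 x N^beta, where N is a city’s population. In the urban-scaling literature, socioeconomic outputs tend to scale superlinearly (beta near 1.15) while infrastructure scales sublinearly (beta near 0.85); those exponents come from that wider literature rather than from this article, so the sketch below is purely illustrative.

def scaled_output(population, y0=1.0, beta=1.15):
    # Illustrative urban scaling law Y = y0 * N**beta (exponents assumed).
    return y0 * population ** beta

small, large = 1_000_000, 2_000_000

# Doubling population more than doubles a superlinear output...
print(f"2x population -> {scaled_output(large) / scaled_output(small):.2f}x output")

# ...but requires less than twice the infrastructure (sublinear case).
print(f"2x population -> "
      f"{scaled_output(large, beta=0.85) / scaled_output(small, beta=0.85):.2f}x infrastructure")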

“As more people lead urban lives and the number and size of cities expand everywhere, understanding more quantitatively how cities function is increasingly important,” Bettencourt said. “Only with a much better understanding of what cities are will we be able to seize the opportunities that cities create and try to avoid some of the immense problems they present. This framework is a step toward a better grasp of the functioning of cities everywhere.”

Bettencourt and his colleagues at the Santa Fe Institute have worked over the past decade to study urban data in order to establish a quantitative theory of cities, according to a press release. Recent advances in urban data collection and processing have allowed for this relatively new perspective on cities, the study said. Bettencourt noted that the massive volume of new data has facilitated the study of general statistical patterns in urban infrastructure and socioeconomic activity.

“Rapid urbanization is the fastest, most intense social phenomenon that ever happened to humankind, perhaps to biology on Earth,” Bettencourt said. “I think we can now start to understand in new and better ways why this is happening everywhere and ultimately what it means for our species and for our planet.”

The Santa Fe researcher said his initial framework is a first theoretical stage that needs further refining and expanding. He added that more and better urban data from developing nations will soon become available, allowing for further study in places where knowing how urbanization works is extremely important.

Three Americans Receive Prestigious 2013 Kyoto Award

Michael Harper for redOrbit.com – Your Universe Online
Three Americans have won the 29th annual Kyoto Prize for their individual work in biology, music and technology. Dr. Masatoshi Nei, an evolutionary biologist and professor at Penn State, will receive the award for his work in the biological sciences. Cecil Taylor, a jazz pianist known for his percussive style, will be honored with the Arts and Philosophy Prize for his work in the musical arts. Finally, Dr. Robert Dennard, the IBM researcher who invented the DRAM memory chip, will receive the Advanced Technology Prize for his work in the field of electronics.
The Kyoto Prize has been handed out every year since 1985 by the Inamori Foundation, a philanthropic venture started by Dr. Kazuo Inamori, founder and chairman of the Kyocera Corporation and KDDI Corporation.
Dr. Masatoshi Nei, age 82, has made evolutionary divergence and genetic diversity his life’s work. In the 1970s, Dr. Nei devised a measure of distance that could determine how closely related, evolutionarily speaking, different species or populations are to one another. Using this measure – called Nei’s Distance – he showed that the genetic differences among Africans, Asians and Europeans account for only about 11 percent of the total genetic variation in the human population. He also concluded that these three groups split away from one another in Africa some 115,000 years ago, representing the first “out of Africa” theory of human origin.
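
The article does not give the formula, but Nei’s standard genetic distance between two populations is conventionally computed from allele frequencies as D = -ln(Jxy / sqrt(Jx x Jy)), where Jx, Jy and Jxy are the average sums of squared and cross-multiplied allele frequencies across loci. The sketch below uses made-up frequencies purely for illustration.

import math

def nei_distance(pop_x, pop_y):
    # Nei's standard genetic distance D = -ln(Jxy / sqrt(Jx * Jy)).
    # pop_x, pop_y: lists of loci; each locus is a list of allele frequencies.
    jx = jy = jxy = 0.0
    n_loci = len(pop_x)
    for locus_x, locus_y in zip(pop_x, pop_y):
        jx += sum(p * p for p in locus_x)
        jy += sum(q * q for q in locus_y)
        jxy += sum(p * q for p, q in zip(locus_x, locus_y))
    identity = (jxy / n_loci) / math.sqrt((jx / n_loci) * (jy / n_loci))
    return -math.log(identity)

# Two loci with two alleles each; the frequencies are invented for illustration
pop_a = [[0.7, 0.3], [0.5, 0.5]]
pop_b = [[0.6, 0.4], [0.2, 0.8]]
print(f"Nei's D = {nei_distance(pop_a, pop_b):.3f}")
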
Cecil Taylor, 84, has often been hailed as one of the most original jazz pianists of all time. His percussive style of free form jazz made him a standout improviser in the early 60s. Taylor’s style of playing is often compared to playing the piano as if it were a drum kit. His music is intently free form with plenty of atonal structures and little resolution. He began playing in the 1950s and continued all the way into the 2000s, often giving long, marathon-style performances.
Finally, US researcher Dr. Robert Dennard is credited with inventing the most essential building block of the DRAM memory found in many computing devices today. He was awarded a patent for the technology in 1968, and the first DRAM chip went on sale five years later in 1973. At the time, Dr. Dennard’s DRAM contained less memory than other chips on the market and was sold in 1-kilobit to 4-kilobit varieties.
Dr. Dennard’s invention was so influential because it not only allowed memory to be accessed at random (RAM) rather than sequentially (like a tape), but it also read and wrote data more quickly than previous chips.
As computing took off in the seventies and grew into the massive industry it is today, Dr. Dennard’s work on these most essential parts of computing became even more influential, and it is now largely considered responsible for much of that industry’s growth.
All three of these men, each in their 80s, will fly to Japan in November to receive their awards at a special ceremony in Kyoto. While there they will receive a 20-karat gold Kyoto Prize medal, a diploma announcing the honor, and a cash prize of 50 million yen, or about $500,000. The Kyoto Prize has been awarded to researchers, scientists, philosophers and artists from 15 nations. Though based in Japan, this internationally recognized award has gone to more Americans (39) than to citizens of any other country. In fact, 2013 marks the second time the award has been given exclusively to Americans (the first was in 1996) and the third time that all of the recipients were from North America.

Thirdhand Smoke Can Cause DNA Damage

Rebekah Eliason for redOrbit.com – Your Universe Online

For the first time ever, thirdhand smoke has been found to cause significant amounts of damage to human cells. Thirdhand smoke is the toxic residue that attaches to almost all surfaces leaving a strong odor long after cigarette smoke has dissipated.

The researchers from Lawrence Berkeley National Laboratory also discovered that chronic exposure caused more DNA damage than acute exposure. Samples exposed to chronic thirdhand smoke contained higher concentrations of chemical compounds than samples exposed to acute thirdhand smoke, suggesting the residue becomes more harmful the longer it sits.

“This is the very first study to find that thirdhand smoke is mutagenic,” explained Lara Gundel, a Berkeley Lab scientist and co-author of the study. “Tobacco-specific nitrosamines, some of the chemical compounds in thirdhand smoke, are among the most potent carcinogens there are. They stay on surfaces, and when those surfaces are clothing or carpets, the danger to children is especially serious.”

Researchers tested for genotoxicity using two common in vitro assays, the Comet assay and the long amplicon-qPCR assay. In vitro assays are laboratory tests performed in a test tube to measure the activity of a drug or compound. The samples tested positive for genotoxicity, showing both DNA strand breaks and oxidative DNA damage. Genotoxicity of this kind is associated with gene mutation, which can lead to the many types of cancer caused by smoking and secondhand smoke exposure.

“Until this study, the toxicity of thirdhand smoke has not been well understood,” explained lead researcher Bo Hang. “Thirdhand smoke has a smaller quantity of chemicals than secondhand smoke, so it’s good to have experimental evidence to confirm its genotoxicity.”

In 2010, Berkeley Lab studies found that residual nicotine can react with ozone and nitrous acid, two common indoor air pollutants, to form hazardous compounds. The nicotine left in thirdhand smoke reacts with nitrous acid to form tobacco-specific carcinogenic nitrosamines. When nicotine reacts with ozone, ultrafine particles are formed that can pass through human tissue and carry harmful chemicals with them. People can be exposed to dangerous thirdhand smoke residue through inhalation, ingestion or skin contact.

Thirdhand smoke is especially dangerous because it is extremely difficult to eliminate. Studies have found that smoke particles can still be detected in dust even two months after smokers move out of an apartment. Normal cleaning techniques such as ventilation and vacuuming have not been shown to lower nicotine contamination.

“You can do some things to reduce the odors, but it’s very difficult to really clean it completely,” said co-author Hugo Destaillats. “The best solution is to substitute materials, such as change the carpet, repaint.”

This study was performed by placing paper strips in smoking chambers. One set was placed in the chamber for twenty minutes to simulate acute exposure and the second set was placed in a chamber for 258 hours over the span of 196 days to simulate chronic exposure. During the 196 days, the chamber was ventilated for approximately 35 hours.

Researchers discovered higher concentrations of toxic chemicals in the chronically exposed samples for over half the compounds studied when compared with acutely exposed samples. DNA damage was also more extensive in chronic samples.

“The cumulative effect of thirdhand smoke is quite significant,” Gundel said. “The findings suggest the materials could be getting more toxic with time.”

Hang and his teammates first extracted compounds from the paper strips into a culture medium, which was then used to expose human cells for 24 hours. Concentrations of each of the compounds were carefully measured.

“They are close to real-life concentrations, and in fact are on the lower side of what someone might be exposed to,” Hang said.

Hang intends to continue his research by working out how the chemical reaction between the nitrosamine NNA and different DNA bases proceeds. NNA is important because it is a tobacco-specific nitrosamine not found in secondhand smoke. “It looks like it’s a very important component of thirdhand smoke, and it is much less studied than NNK and NNN in terms of its mutagenic potential,” Hang explained.

Researchers concluded that “Ultimately, knowledge of the mechanisms by which thirdhand smoke exposure increases the chance of disease development in exposed individuals should lead to new strategies for prevention.”

This study, “Thirdhand smoke causes DNA damage in human cells,” is published in the journal Mutagenesis. The lead investigator was Bo Hang, a biochemist in the Life Sciences Division of Berkeley Lab. He worked with an interdisciplinary group that included Lara Gundel, Hugo Destaillats and Mohamad Sleiman, chemists from Berkeley Lab’s Environmental Energy Technologies Division, as well as scientists from UC San Francisco, UCLA Medical Center and the University of Texas.

California Study Investigates Financial Burden Of Child-Rearing Grandparents

redOrbit Staff & Wire Reports – Your Universe Online

Raising a child can place quite a strain on any family’s finances, but a new study demonstrates that the burden is especially cumbersome for the increasing number of older men and women raising their grandchildren.

The Elder Index, which was compiled by experts at the UCLA Center for Health Policy Research and the Insight Center for Community Economic Development, reports that over 300,000 grandparents in the state of California alone are primarily responsible for the care and upbringing of their grandsons and granddaughters.

Of those individuals, nearly 65,000 are over the age of 65, and more than 20,000 take care of their grandchildren without extended assistance from their families. Those older men and women, the authors said, are among California’s most vulnerable residents because of the state’s high cost of living and low levels of public assistance.

“California’s high cost of living turns the loving act of caring for a grandchild into a desperate financial risk,” said lead author D. Imelda Padilla-Frausto, a graduate student researcher at the UCLA center. “And older grandparents, many on fixed incomes and with limited mobility, are often the least able to advocate for, and access, public assistance.”

According to the latest Elder Index calculations, which are based on the real cost of living in each California county, nearly half of all custodial grandparents at least 65 years of age lack sufficient income to meet the basic needs of the grandchildren they are caring for. However, the public assistance programs that could help them cover those costs are often difficult to access or completely off-limits for family caregivers, the authors said.

“There is a hypocrisy built into how assistance is allocated to children and their caregivers in California,” said Susan E. Smith, director of the California Elder Economic Security Initiative at the Insight Center. “We preach the importance of keeping families together yet deny grandparents essential assistance because they are ‘family.’ This is an injustice that policymakers could easily address by making more benefits available, and accessible, to grandparents.”

Most older California residents are not eligible for state medical assistance, housing subsidies or food-related benefits because their income levels are often slightly more than the federal poverty level ($18,530 for a family of three and $14,710 for a family of two in 2011), the authors said. They claim that many experts view that income standard as inadequate because it does not account for income variations in different states and different counties.

For example, grandparents who live in more costly areas such as Los Angeles or San Francisco could earn more than the federal poverty level allows for governmental assistance, but less than is required to cover essential needs in their city of residence. In fact, the researchers report that the cost of caring for even one grandchild “far exceeds” the federal poverty level in every single county in the state of California.

“In 2011, an older couple with one grandchild who lived in a two-bedroom rental needed an income as high as $49,942 if they lived in Santa Cruz County and as low as $32,965 if they lived in Kern, the ‘lowest-cost’ county,” the university said in information released as part of the Elder Index’s regularly-scheduled biennial update.
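
To make the gap concrete, the small worked example below uses only the dollar figures quoted above; the $25,000 household income is a hypothetical value chosen for illustration.

# 2011 figures quoted above (USD)
federal_poverty_level = 18_530      # family of three
elder_index_kern = 32_965           # lowest-cost county (Kern)
elder_index_santa_cruz = 49_942     # highest-cost county (Santa Cruz)

household_income = 25_000           # hypothetical grandparent household

print("Above the federal poverty level (often ineligible for aid):",
      household_income > federal_poverty_level)           # True
print("Covers basic needs even in the lowest-cost county:",
      household_income >= elder_index_kern)                # False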

To rectify this situation, the authors suggested raising eligibility criteria for some public programs to 200 percent of the poverty level and extending state foster-care benefits to kinship caregivers. They also suggest limiting the frequency of “cumbersome and bureaucratic” benefit renewals, as the majority of seniors live on fixed incomes and thus do not experience the income fluctuations that typically require regular documenting.

Almost Half Of iPhone Apps Peek At Your Private Stuff

Lee Rannals for redOrbit.com – Your Universe Online

According to a new study, more than 13 percent of apps access an iPhone’s physical location while six percent access the device’s address book.

Computer scientists at the University of California, San Diego discovered that nearly half of the mobile apps running on Apple’s iOS operating system have gained access to private data. These findings are based on a study of 130,000 users of jailbroken iOS devices, whose owners have removed the restrictions that keep apps from accessing the iPhone’s operating system.

One might assume the results are skewed because the study participants were using jailbroken iPhones. However, the majority of applications in the study were downloaded through Apple’s App Store and can access the same information on non-jailbroken phones as well.

In March, Apple stopped accepting new applications or app updates that access these “unique identifiers.” However, the findings suggest that despite this change to App Store policy, many apps can still get that information. Unique identifiers allow app creators and advertisers to track a user’s behavior across all the different apps on a device. Some apps even associate the unique identifier with the user’s email address and other personal information.

The researchers developed an app called ProtectMyPrivacy (PMP) that is able to detect what data the other apps running on an iOS device are trying to access. Their application enables users to selectively allow or deny access to information on an app-by-app basis, based on whether they feel the apps need the information to function properly.

The team has also added notifications and recommendations for when an app accesses other privacy-sensitive information, such as a device’s front and back cameras, microphone and photos.

“We wanted to empower users to take control of their privacy,” said Yuvraj Agarwal, a research scientist in the Department of Computer Science and Engineering at UC San Diego who co-authored the study. “The choice should be in users’ hands.”

Nearly all of PMP’s users voluntarily shared their privacy decisions, allowing the researchers to see which apps they believe should be allowed access to their privacy-sensitive data. PMP is able to make recommendations for 97 percent of the 10,000 most popular iPhone apps.
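
The article does not spell out how PMP turns those shared decisions into recommendations; one simple possibility, sketched below purely as an illustration, is a majority vote over the allow/deny choices users have shared for a given app and data type. The function name and vote threshold are assumptions, not the actual PMP algorithm.

from collections import Counter

def recommend(decisions, min_votes=10):
    # Crowd-sourced recommendation for one (app, data type) pair.
    # decisions: list of "allow" / "deny" strings shared by users.
    if len(decisions) < min_votes:
        return None                       # too few votes to recommend anything
    counts = Counter(decisions)
    return "deny" if counts["deny"] > counts["allow"] else "allow"

# Example: 12 users shared decisions about a hypothetical app's address-book access
shared = ["deny"] * 9 + ["allow"] * 3
print(recommend(shared))                  # -> "deny"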

“We have already shown millions of recommendations, and more than two-thirds of all our recommendations are accepted by our users, showing that they really like this unique feature of PMP,” said Agarwal.

Flixster, a popular app for finding movie times and reviews, was flagged for accessing private data. The researchers discovered that a third-party ad library used by the app was accessing users’ address books and sending back information.

“We provided feedback to the app’s developers in case they are unaware that a third party library may be accessing their users’ private data,” recalled fellow researcher Michael Hall, a visiting researcher in Agarwal’s Synergy Lab at UC San Diego who co-authored the study.

After the team pointed out the privacy breach to Flixster, the developers released an updated version that uses another ad library which does not access this kind of information.

NASA Celebrates 30th Anniversary Of First American Woman In Space: Sally Ride

[WATCH VIDEO: Sally Ride Anniversary]

Lawrence LeBlond for redOrbit.com – Your Universe Online

Sally Ride became the first American woman to fly into space after securing a spot aboard the Space Shuttle Challenger for STS-7 on June 18, 1983. Her achievement captured the attention of the nation, inspiring women everywhere to break barriers.

Ride was one of three specialists on STS-7 and was instrumental in helping the crew deploy communication satellites, conduct experiments and make use of the first Shuttle Pallet Satellite. As a pioneer for female space travel, Ride inspired “generations of young girls to reach for the stars [and] showed us that there are no limits to what we can achieve,” said President Barack Obama during a speech following her death last summer.

Ride spent the last several months of her life battling pancreatic cancer, succumbing to the disease on July 23, 2012 at the age of 61. She was born in Los Angeles, California on May 26, 1951.

Ride was fascinated by science from a young age and pursued the study of physics in school. She earned a PhD in physics from Stanford University, conducting research in astrophysics and free electron laser physics. It was during these studies that Ride spotted a newspaper ad recruiting NASA astronauts.

After turning in an application along with 8,000 other people, Ride became one of only 35 selected for astronaut training. She joined NASA in 1978 and served as a ground-based capsule communicator (capcom) for the STS-2 and STS-3 missions. She also helped in the development of the Space Shuttle’s robotic arm.

After being selected as a crew member for STS-7, Ride faced an onslaught of attention from the media. However, she had little patience for the sensitive, or perhaps insensitive, questions that media hounds pushed to center stage — questions such as “Do you weep when things go wrong on the job?”

Being an astronaut and scientist left little time for Ride to be a woman. She said during one interview that “one thing I probably share with everyone else in the astronaut office is composure.” In reference to her fellow astronauts in the class of 1978, she said, “We’re all people who are dedicated to the space program and who really want to fly in the space shuttle. That’s a common characteristic that we all have that transcends the different backgrounds.”

Bob Crippen, Ride’s commander during STS-7, noted that Sally was more than capable of flying in space. He said, “I wanted a competent engineer who was cool under stress. Sally had demonstrated that talent.”

After her historic STS-7 mission, Ride continued her NASA career, flying a second mission (STS-41G) in October 1984. She later served on the presidential commission that investigated the Challenger disaster and led NASA’s strategic planning effort in the mid-1980s.

Ride retired from NASA in 1987 and became a science fellow at the Center for International Security and Arms Control at Stanford University. In 1989, she also joined the University of California-San Diego as a professor of physics and director of the California Space Institute.

Ride founded her own company, Sally Ride Science, in 2001. This let her pursue her passion for motivating boys and girls to study in the fields of science, technology, engineering and math (STEM). The company creates innovative classroom materials, programs and professional development training for teachers as well.

Having served on the investigative commission for the 1986 Challenger disaster, Ride was later selected as an investigator for the 2003 Columbia accident, becoming the only person to serve on both panels.

Ride was also an accomplished author, inspiring boys and girls by writing a number of science books for children. Her book “The Third Planet” won the American Institute of Physics Children’s Science Writing Award in 1995.

Her untimely death in 2012 left behind a heroic legacy.

While NASA celebrates the 30th anniversary of one pioneering woman, Russia also celebrated the anniversary of the first woman in space on June 16, marking 50 years since female cosmonaut Valentina Tereshkova made a historic space flight aboard Vostok 6, which blasted off on a three-day, 48-orbit mission over Earth.

Just like Sally Ride did in her time, Tereshkova “inspired women around the world to reach for their dreams and shoot for the stars.”

Another Soviet cosmonaut, Svetlana Savitskaya, set a further milestone in 1984 when she became the first woman to ever walk in space.

Ride and Tereshkova, as the first women to fly in space, helped usher in new eras of equality in human spaceflight. On the anniversary of their missions, the legacies they left behind remind all of us of the passion and dedication of the women who have paved the way for more than 55 women to have since journeyed into space.

Sadly, a teacher who would have become the first private citizen to fly into space was among the seven killed when the Space Shuttle Challenger exploded 73 seconds after liftoff on January 28, 1986.

New Hampshire’s Christa McAuliffe was selected from more than 11,000 applicants for the NASA Teacher in Space Project. She was posthumously awarded the Congressional Space Medal of Honor in 2004.

Image Below: STS-7 Mission Specialist Sally Ride poses on the aft flight deck with her back to the on-orbit station. Credit: NASA

Exercise And Eating Habits Of Mothers Can Affect Their Children

redOrbit Staff & Wire Reports – Your Universe Online

Mothers who practice what they preach when it comes to being active and making healthy food choices are more likely to have children who exercise and eat well, according to research published online Tuesday by the International Journal of Obesity.

In the study, Truls Østbye, a professor of community and family medicine at Duke University, and his colleagues report that their findings should serve as a reminder to both mothers and fathers that they serve as role models for their sons and daughters.

Furthermore, they claim that the research emphasizes the importance of parental policies that promote physical activity and healthy eating — especially for younger children. In fact, over one-fourth of all US kids between the ages of two and five are already overweight or obese, the Duke investigators noted.

“Obesity is a complex phenomenon, which is influenced by individual biological factors and behaviors,” explained Østbye. “But there are variations in obesity from one society to another and from one environment to another, so there is clearly something in the environment that strongly influences the obesity epidemic.”

“The ‘obesiogenic’ environment is broad and multi-faceted, including the physical neighborhood environment, media and advertising, and food tax policies, but we feel that the home environment is critical, particularly among children. However, we didn’t have a lot of evidence as to how important this was,” he added.

To emphasize how important both home environment and parental behavior were in shaping the dietary and physical behaviors of children, Østbye and his colleagues studied data from 190 preschool children whose mothers were overweight or obese.

They collected information about each child’s food intake over the past week, separating it into two categories: junk food and healthy food. In addition, the children were outfitted with accelerometers for a week in order to measure the amount of time they were physically active (as well as the time they spent on sedentary tasks).

The mothers also disclosed information about the home environment, including family policies governing food and physical activity, the availability of both healthy food and junk food, the availability of physical activity or exercise equipment, and whether or not they served as role models for healthy eating and exercise habits.

After analyzing all of the data, Østbye’s team “found significant associations between these environmental measures and the preschoolers’ physical activity and healthy versus junk food intake,” the Durham, North Carolina university said. They concluded that a healthy home environment and effective parental role modeling both play an essential role in promoting healthy behaviors in youngsters between the ages of two and five.

The study authors report that limiting the amount of junk food available to preschoolers at home increases the amount of healthy food those boys and girls will eat, and that the home environment was more influential for eating habits than for physical activity levels. They also emphasize that the research demonstrates that children are watching their parents’ behaviors and picking up both good and bad habits from mom and dad.

“It’s hard for parents to change their behaviors, but not only is this important for you and your own health; it is also important for your children because you are a role model for them,” said Duke research analyst and study co-investigator Marissa Stroo. “This might be common sense, but now we have some evidence to support this.”

Østbye, Stroo and colleagues also analyzed the education levels, employment situations and other socioeconomic factors of the mothers. They found that those factors did not affect their children’s physical activity levels, but showed “mixed results” when it came to healthy eating habits. The researchers point out that additional research is needed to better understand the impact a mother’s socioeconomic situation has on her children’s health.

Doctors Confirm Fibromyalgia Is Not Imaginary

Lee Rannals for redOrbit.com — Your Universe Online

Doctors have been able to determine the source of pain in the skin of patients who suffer from fibromyalgia.

Fibromyalgia, a condition of widespread deep-tissue pain, affects about ten million people in the US. It causes tenderness in the hands and feet, fatigue, sleep disorders and cognitive difficulties. For years, the disorder was dismissed as imaginary, with patients sometimes accused of making up the illness. The latest research not only shows the condition is real, it also pinpoints a source of the pain.

“Instead of being in the brain, the pathology consists of excessive sensory nerve fibers around specialized blood vessel structures located in the palms of the hands,” said Dr. Rice, president of Intidyn and the senior researcher on the study, which was published in Pain Medicine, the journal of the American Academy of Pain Medicine. “This discovery provides concrete evidence of a fibromyalgia-specific pathology which can now be used for diagnosing the disease, and as a novel starting point for developing more effective therapeutics.”

The team first analyzed the skin of a patient who lacked all of the numerous varieties of sensory nerve endings in the skin that supposedly account for our highly sensitive and richly nuanced sense of touch. This patient functioned normally in day-to-day tasks, yet the only sensory endings the team detected were those around the blood vessels.

“We previously thought that these nerve endings were only involved in regulating blood flow at a subconscious level, yet here we had evidence that the blood vessel endings could also contribute to our conscious sense of touch… and also pain,” Rice said.

The team used a unique microscopic technology to study small skin biopsies collected from the palms of fibromyalgia patients who were being diagnosed and treated. They found an enormous increase in sensory nerve fibers at specific sites within the blood vessels of the skin. These critical sites are tiny muscular valves known as arteriole-venule (AV) shunts.

“The AV shunts in the hand are unique in that they create a bypass of the capillary bed for the major purpose of regulating body temperature,” Rice explained.

These shunts are unique to the palms of hands and soles of feet, working like a radiator in a car. Under warm conditions, the shunts close down to force blood into the capillaries at the surface of the skin in order to radiate heat from the body, while in cold conditions the shunts open wide to allow blood to bypass the capillaries in order to conserve heat.

Dr. Phillip J. Albrecht, another researcher on the project, said the excess sensory innervation might explain why fibromyalgia patients have especially tender and painful hands.

“But, in addition, since the sensory fibers are responsible for opening the shunts, they would become particularly active under cold conditions, which are generally very bothersome to fibromyalgia patients,” Albrecht said.

Rice added that the hands and feet act as a reservoir from which blood flow can be diverted to other tissues in the body, such as muscles when we begin to exercise.

“Therefore, the pathology discovered among these shunts in the hands could be interfering with blood flow to the muscles throughout the body,” the researcher said.

This discovery of a distinct tissue pathology demonstrates that fibromyalgia is not imaginary, which helps to change the clinical opinion of the disease and guide future approaches for better treatments.

“Wow crazy how spot on they were it has affected me cognitively as well as sleep issues! Mine is brought on by especially cold temperatures,” Amy P, who suffers from fibromyalgia, told redOrbit. “Also my handwriting has greatly suffered. But I believe that’s because I may have tendinitis of some sort.”

Dutch researchers reported a study earlier this month contradicting these findings, saying that weather conditions do not affect fibromyalgia pain or fatigue.

“Our analyses provide more evidence against, than in support of, the daily influence of weather on fibromyalgia pain and fatigue,” said Ercolie Bossema, Ph.D. from Utrecht University in the Netherlands. “This study is the first to investigate the impact of weather on fibromyalgia symptoms in a large cohort, and our findings show no association between specific fibromyalgia patient characteristics and weather sensitivity.”

However, researchers from the recent study point to the blood flow as proof the weather does actually have an effect on fibromyalgia patients.

“This mismanaged blood flow could be the source of muscular pain and achiness, and the sense of fatigue, which are thought to be due to a build-up of lactic acid and low levels of inflammation in fibromyalgia patients. This, in turn, could contribute to the hyperactivity in the brain,” Rice said.

Philippines To Destroy Five Tons Of Illegal Ivory, Symbolic Victory For Elephant Conservation

Brett Smith for redOrbit.com – Your Universe Online

The Philippines has announced it will destroy five tons of seized ivory, a major symbolic step for a nation known for playing a major role in the illegal ivory trade.

“The destruction of the items would hopefully bring the Philippines’ message across the globe that the country is serious and will not tolerate illegal wildlife trade, and denounces the continuous killing of elephants for illicit ivory trade,” Mundita Lim, director of the country’s Protected Areas and Wildlife Bureau (PAWB), told National Geographic.

PAWB, the country’s leading wildlife agency, said it plans to destroy all the ivory it is currently holding with the exception of 106 pieces that are to be repatriated to Kenya and a few pieces for training, law enforcement and educational purposes.

This large-scale ivory destruction is the latest episode in the ongoing Filipino ivory saga. While the Pacific island nation said it plans to destroy its “entire” five tons of ivory holdings, that amount is, somewhat mysteriously, less than half of the total ivory seized by the Philippine government in recent years.

Filipino customs agents reportedly seized 7.7 tons of illegal ivory in 2005 and an additional 5.4 tons in 2009. However, a later audit revealed the customs agency was missing almost six tons of the reportedly confiscated ivory. The discrepancy resulted in a lawsuit filed by PAWB against the government’s customs agency.

The customs agency eventually turned over its 2009 seizure to PAWB. The wildlife organization later had its storeroom broken into, resulting in the theft of over 1.7 tons of ivory. The thieves reportedly replaced the stolen tusks with plastic replicas.

The Philippine government is scheduled to destroy the smuggled ivory on Friday using industrial rollers. Other nations have symbolically burned confiscated ivory caches, but local environmental groups said burning would send the wrong message and generate too much smoke. According to Filipino officials, the ivory was smuggled from several different African countries, including Kenya, Tanzania and Uganda.

The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) passed a global ban on the international trade of ivory in 1989 after Kenya’s President Daniel Arap Moi burned 15 tons of ivory in a symbolic gesture against the slaughter of elephants for their tusks. The recent limited resurgence of African elephant populations has been linked to the ban’s implementation.

During the most recent CITES meeting, the Philippines was lumped in with a so-called “gang of eight” countries that facilitate the illegal ivory trade. These countries range from suppliers — Kenya, Tanzania and Uganda — to facilitators — Thailand, the Philippines, Malaysia, Vietnam and China. These eight nations were required to submit plans for ending their role in the illegal trade.

The Philippines’ action comes as Chinese demand for ivory is skyrocketing along with the country’s rise to world economic prominence. China currently permits the resale of ivory bought before the 1989 ban. The country also has a stockpile of ivory it purchased in 2008 with CITES approval.

As governments around the world debate the legal status of ivory, many of them still hold ivory caches similar to the one that the Philippine government has committed to destroying.

New Language Discovery Reveals Linguistic Insights

A new language has been discovered in a remote Indigenous community in northern Australia that is generated from a unique combination of elements from other languages. Light Warlpiri has been documented by University of Michigan linguist Carmel O’Shannessy, in a study on “The role of multiple sources in the formation of an innovative auxiliary category in Light Warlpiri, a new Australian mixed language,” to be published in the June, 2013 issue of the scholarly journal Language.

The people who live in this small community in the Tanami Desert speak traditional Warlpiri, a language used by about 4,000 people and considered highly endangered. In one community called Lajamanu, however, speakers readily switch between languages – from Warlpiri to English and Kriol (an English-based creole). In the 1970s and 1980s, children internalized this switching as a separate linguistic system and began to speak it as their primary code, one with verb structure from English and Kriol, noun structure from Warlpiri, and new structures that can be traced to Warlpiri, English and Kriol but are no longer the same as in those source languages. As these children grew up they taught the new language to their own children, and it is now the primary code of children and young adults in the community.

Light Warlpiri is one of a small number of ‘mixed languages,’ which typically combine elements from two languages, though the combinations can take different forms. Often, for example, most of the words come from one language and most of the grammar from the other. It is rare for the verb system and the noun system to come from different languages, as they do in Light Warlpiri, and it is rarer still for more than two languages to be involved in a mixed language’s creation.

One striking innovation involves taking word forms from English, for example I’m ‘I-present tense,’ and creating new forms such as yu-m ‘you-nonfuture’ (that is, the present and past but not the future). There were no structures in Warlpiri, English or Kriol, however, that meant ‘nonfuture time.’

This creation of new meanings from old sources also occurs in pidgin and creole languages, and in languages of the Balkan linguistic area. Perhaps the common factor between these and Light Warlpiri is that each arose from combining elements from several languages. The wide separation of these cases suggests that this innovative combining may be an unusual but widely available human language phenomenon.

Cause Of Yuri Gagarin Death Finally Revealed By Fellow Cosmonaut

Lawrence LeBlond for redOrbit.com – Your Universe Online
For more than 45 years, the death of Russian cosmonaut Yuri Gagarin, the first human to cross the Earth’s threshold and venture into space, has been shrouded in secrecy. But now, details of his 1968 death have been released by none other than the first man to walk in space, Aleksey Leonov.
Gagarin, who became the first man to travel into space on April 12, 1961, was killed when his MiG-15 aircraft crashed on March 27, 1968. Gagarin was just 34 years old. The details of that crash and his death have long been a confusing and controversial subject, with many theories coming forward on the actual cause of his death.
Now, Leonov, who conducted the first ever extra-vehicular spacewalk in 1965, has delved deeper into the touchy subject of the Gagarin death mystery. Leonov has been fighting for 20 years or more to gain permission to disclose the details of what happened that tragic day in 1968.
Leonov was part of a State Commission established shortly after Gagarin’s death to investigate the matter. According to Leonov, the official finding was that the crew, Gagarin and experienced instructor Vladimir Seryogin, swerved to avoid a foreign object, a maneuver that sent their MiG-15UTI into a tailspin and, ultimately, into the ground, killing both pilots.
“That conclusion is believable to a civilian — not to a professional,” Leonov told Russia Today (RT), adding that he has always wanted the truth to be told, at least to the families involved.
A declassified report states that human error played a part in the tragic incident that day. According to the report, an unauthorized SU-15 fighter jet was flying dangerously close to Gagarin’s aircraft.
Leonov said he was in charge of parachute jump training that day. He remembered the weather had been snowy, rainy and windy and was waiting for an official confirmation that the exercises would be canceled. However, it was only moments later that he heard a supersonic noise followed by an explosion. It was then he knew something was amiss, according to his testimony to RT.
“We knew that a Su-15 was scheduled to be tested that day, but it was supposed to be flying at the altitude of 10,000 meters or higher, not 450-500 meters. It was a violation of the flight procedure,” said Leonov.
Leonov noted witnesses pointed out that the SU-15 appeared out of the clouds with its tail smoking and burning. Leonov explained that, while on afterburner, the SU-15 came dangerously close to Gagarin's plane, forcing it into a sharp turn at speeds in excess of 450 mph and sending it into a deep spiral.
After a transmission from Gagarin noting the crew was descending and returning to the airbase, no other transmissions came through; the plane crashed 55 seconds later.
Leonov's worst fears were confirmed when someone called Chkalovsky airfield and reported a crash near the village of Novoselovo.
INVESTIGATIVE CLEARANCE
During an investigation of the crash site, the remains of Seryogin's body were found, but not Gagarin's. Investigators believed Gagarin had safely ejected from the plane and landed elsewhere; his body was not found until a full day later.
Leonov, however, said when he was given clearance to view the actual incident report he found many inconsistencies. The greatest was that the report bore Leonov's name although it was written in a different hand, with the facts jumbled.
He noted the report claimed the noises were 15 to 20 seconds apart, when he had reported hearing them only seconds apart; the longer interval would imply the two jets were no less than 30 miles apart. Using new computer models, investigators were able to piece together exactly what caused Gagarin to go into that fatal spiral at breakneck speed.
The computer model reconstructed a trajectory consistent with the 55-second interval between the sonic boom and the crash. Experts know that a deep spiral can occur if a larger, heavier aircraft passes too closely, its backwash flipping the smaller plane, and Leonov said this is exactly what happened to Gagarin. The trajectory produced by the computer model was the only one that made sense and corresponded to all the input parameters used by the investigators, he said.
Leonov, upon taking the new information to the public, said the reasoning behind the coverup was that officials were perhaps looking to “hide the fact that there was such a lapse so close to Moscow.” He said he allowed test pilots and other experts a chance to scrutinize and challenge his testimony.
While the evidence leaves no doubt that the pilot of an SU-15 was at fault for the crash of Gagarin's plane, his name is not being released. Leonov said the pilot's anonymity was a condition under which he was allowed to publicize the story. It is known only that the pilot is now 80 years old and in failing health.
Leonov said he was told that bringing this pilot into the spotlight will “fix nothing.”
Nikolay Stroev, Deputy Head of the Military-Industrial Commission of the USSR, said the test pilot acted without intent; he simply did not see Gagarin's aircraft in the clouds as he passed within dozens of feet at supersonic speed.
Several theorists have come forward with their own take on what really happened that day, a collision with a UFO being among the most popular theories.
CLOSURE
But for all intents and purposes, the case is now considered closed, according to Leonov.
Now that the truth has finally been revealed, it should bring closure to others in the field who were troubled by the long-held, controversial subject.
Russian cosmonaut Valentina Tereshkova, the first woman in space, was grounded after news of Gagarin's death broke in 1968. She said the state would not let her fly anymore, as the possibility of losing a second cosmonaut of such stature would have been catastrophic.
During a conference of the Committee for the Peaceful Use of Outer Space (COPUOS), held at the UN, Tereshkova said the “only regret here is that it took so long for the truth to be revealed. But we can finally rest easy.
“They forbade me from flying ever again, even piloting planes. The repercussions of the death of one cosmonaut were so great that they wanted to keep me safe,” she explained.
But her deepest sadness still lies with the passing of Gagarin. “I still miss him. It is a loss not only for us cosmonaut colleagues, but for the entire community,” she said, trying to hold back tears.

Skipping Breakfast Linked To Diabetes In Overweight Women

Rebekah Eliason for redOrbit.com — Your Universe Online

A new study finds skipping breakfast causes acute, or rapid-onset, insulin resistance in overweight women. Chronic insulin resistance is a risk factor for diabetes, suggesting that repeatedly skipping breakfast may contribute to the development of type 2 diabetes.

Elizabeth Thomas, MD, an endocrinology fellow at the University of Colorado School of Medicine in Aurora, explained, “Our study found that acute insulin resistance developed after only one day of skipping breakfast.”

When a person is insulin resistant, more insulin is needed to lower their glucose level, commonly referred to as blood sugar.

The study involved nine non-diabetic overweight or obese women with an average age of 29. On two visits roughly one month apart, each subject was randomly assigned to receive breakfast on one visit and no breakfast on the other. Four hours later, all participants received a standardized lunch, with blood samples taken every thirty minutes for three hours afterward.

It is normal for glucose levels to rise after eating, which triggers insulin production. The researchers discovered that when participants skipped breakfast, their insulin levels were much higher than on the day they ate breakfast. According to Thomas, the heightened insulin levels are an indication of acute insulin resistance.

Thomas explained it was not clear if this “heightened metabolic response” was temporary or lasting, but it may contribute to the development of chronic insulin resistance. If insulin resistance is chronic, it leads to a buildup of sugar in the blood, which may lead to prediabetes and diabetes.

“This information should help health care providers in counseling patients as to why it is better to eat a healthy, balanced breakfast than to skip breakfast,” Thomas said.

This study was funded by the Endocrine Fellows Foundation in Washington, DC, the National Institutes of Health and the Colorado Nutrition Obesity Research Center.

Results of the study were presented Sunday at The Endocrine Society’s 95th Annual Meeting in San Francisco.

Researchers Link Obesity To Hearing Loss In Kids

April Flowers for redOrbit.com – Your Universe Online

A new study led by researchers at Columbia University Medical Center shows obese adolescents are more likely to have hearing loss than their normal-weight peers. The study findings, published in The Laryngoscope, demonstrated that obese adolescents have increased hearing loss across all frequencies and were almost twice as likely to have unilateral (one-sided) low-frequency hearing loss.

“This is the first paper to show that obesity is associated with hearing loss in adolescents,” said Anil K. Lalwani, professor and vice chair for research, Department of Otolaryngology/Head & Neck Surgery, Columbia University Medical Center.

Obesity in adolescents is associated with sensorineural hearing loss — caused by damage to the inner-ear hair cells — across the entire frequency range of human hearing, according to the study. The highest rates of hearing loss were at low frequencies: over 15 percent of obese adolescents were affected, compared with almost 8 percent of non-obese adolescents. People with low-frequency hearing loss cannot hear sounds at frequencies of 2,000 Hz and below, although they may still hear sounds at higher frequencies. The normal human hearing range is roughly 20 Hz to 20,000 Hz. Such sufferers can often still understand human speech, but may have difficulty hearing in groups or in noisy places.

“These results have several important public health implications,” said Dr. Lalwani, who is also an otolaryngologist at New York Presbyterian Hospital/Columbia University Medical Center. “Because previous research found that 80 percent of adolescents with hearing loss were unaware of having hearing difficulty, adolescents with obesity should receive regular hearing screening so they can be treated appropriately to avoid cognitive and behavioral issues.”

In general, overall hearing loss among obese adolescents is relatively mild, the study found, although the nearly two-fold increase in the odds of unilateral low-frequency hearing loss is particularly worrisome. The findings suggest early, and possibly ongoing, injury to the inner ear that could progress as the adolescents become obese adults. The researchers say more research is needed on the adverse effects of this early hearing loss on social development, academic performance and behavioral and cognitive function.

“Furthermore, hearing loss should be added to the growing list of the negative health consequences of obesity that affect both children and adults — adding to the impetus to reduce obesity among people of all ages,” said Dr. Lalwani.

Nearly 17 percent of children in the US are obese, defined as having a body mass index (BMI) at or above the 95th percentile for their age and sex. Unlike adult BMI, which is expressed as a number based on weight and height, BMI in children is expressed as a percentile. Previous research has identified obesity and its associated morbidities as risk factors for hearing loss in adults.
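To make that distinction concrete, here is a minimal sketch (not part of the study) of how adult BMI is computed from weight and height, and how a child's value would instead be compared against a percentile cutoff. The 95th-percentile figure used below is a hypothetical placeholder, since real classification requires an age- and sex-specific lookup from CDC growth charts.

    # Adult BMI is a raw number (kg / m^2); childhood obesity is judged against
    # an age- and sex-specific percentile from growth charts.
    # The threshold used below is a made-up illustration, not real chart data.

    def bmi(weight_kg: float, height_m: float) -> float:
        """Body mass index: weight in kilograms divided by height in meters squared."""
        return weight_kg / height_m ** 2

    def is_obese_child(child_bmi: float, bmi_at_95th_percentile: float) -> bool:
        """A child is classed as obese when BMI is at or above the 95th percentile."""
        return child_bmi >= bmi_at_95th_percentile

    example = bmi(weight_kg=70.0, height_m=1.60)  # about 27.3 kg/m^2
    print(round(example, 1))
    print(is_obese_child(example, bmi_at_95th_percentile=24.2))  # hypothetical cutoff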

The research team analyzed data from almost 1,500 individuals who participated in the National Health and Nutrition Examination Survey. Conducted from 2005 to 2006 by the National Center for Health Statistics of the Centers for Disease Control and Prevention (CDC), the survey was a large, nationally representative sample of adolescents between the ages of 12 and 19. Participating adolescents were interviewed in their homes, taking into account family medical history, current medical conditions, medication use, household smokers, socioeconomic and demographic factors, and noise-exposure history.

The research team suggests obesity may directly or indirectly lead to hearing loss. More research is necessary to determine the mechanisms involved; however, the team theorizes that obesity-induced inflammation may contribute to hearing loss. Obese children have been found to have low levels of the anti-inflammatory protein adiponectin, which is secreted from adipose tissue. In obese adults, low levels have been associated with high-frequency hearing loss, which affects a person's ability to understand speech. Secondary diseases often associated with obesity — which include type 2 diabetes, cardiovascular disease and high cholesterol — have all been reported to be associated with loss of peripheral hearing (relating to the outer, middle, and inner ear) and could also contribute indirectly to hearing loss.

Ultra-Efficient Hemoglobin Key To Evolutionary Success Of Fish

Brett Smith for redOrbit.com – Your Universe Online

The primordial oceans of Earth's early days were a much more inhospitable place than they are today, and new research published in the journal Science suggests that fish responded by developing a highly efficient hemoglobin-based system to deliver large amounts of oxygen to their tissues.

“Four hundred million years ago the oceans were not what they are today. They were low in oxygen, high in CO2 and acidic,” said study co-author Jodie Rummer of the Australian Research Council's Centre of Excellence for Coral Reef Studies.

“Yet the fishes not only survived in these unpromising circumstances, they managed to thrive,” she continued. “Their secret weapon was a system for unloading huge amounts of oxygen from the hemoglobin in their blood, whenever the going got really tough.”

Hemoglobin, an iron-rich blood protein, carries oxygen to the brain, heart and other organs of all vertebrates. As the evolutionary precursors to land animals, these early fish most likely passed on their use of hemoglobin to humans.

“Hemoglobin in the blood takes up oxygen in the gills of fish and the lungs of humans,” Rummer said. “It then carries it round the body to the heart, muscles and organs until it encounters tissues that are highly active and producing a lot of CO2.”

“The acid is a signal to the hemoglobin to unload as much of its oxygen as possible into the tissues,” she explained.

“These early fish managed to develop a way to maximize the delivery of oxygen, even when the water they lived in was low in it,” Rummer added. “They had a phenomenal capacity for releasing oxygen just when needed: it was one of the big secrets of their evolutionary success, to the extent they now make up half the vertebrates on the planet.”

Rummer and a team of international colleagues discovered the hemoglobin system by analyzing the biochemistry of rainbow trout, which are capable of rapidly doubling oxygen release for certain tissues during stressful situations.

This oxygen release system has become very efficient over the past 150 to 270 million years, the researchers said. It is able to deliver large quantities of oxygen to organs such as the eye, which is essential for clearly spotting underwater predators or prey.

The fish hemoglobin system is probably more efficient than ours because our amphibian ancestors branched away from more evolved fish about 350 to 400 million years ago, when the oxygen-delivering method was still in its early stages of development, the researchers said in a statement.

The study authors said their findings could lead to new ways of studying oxygen conditions in the body.

“Also, we feel that if we can understand how fish coped with low-oxygen, high CO2, acidic waters in the past, it will give us some insight into how they might cope with man-made climate change which appears to be giving rise to such conditions again,” Rummer said.

Hemoglobin is also capable of carrying other gases in the blood stream, including carbon dioxide. The protein is also found outside of red blood cells where it can act as an antioxidant.

Across America 2013: Solar Impulse Lands In Washington DC After Pit Stop In Cincinnati

Lawrence LeBlond for redOrbit.com – Your Universe Online

The Solar Impulse team of Bertrand Piccard and Andre Borschberg completed the next-to-last leg of the Across America 2013 project with a safe landing in Washington DC. The zero-fuel aircraft touched down in the US capital at 12:15 am EDT on June 16 after a nearly two-day flight with a single pit stop in Cincinnati.

Borschberg piloted the HB-SIA single-seat plane for the first half of the flight. He took off from Lambert-St. Louis International Airport in the predawn hours on June 14 and landed at about 8:15 pm EDT in Cincinnati for a short pit stop before handing over the reins to Piccard for the second half of the flight.

Normally the pilots make landings later at night to avoid heavy air traffic, but since Cincinnati's Lunken Municipal Airport doesn't get as much traffic as larger airports, Borschberg decided to glide in earlier than usual.

The Cincinnati pit stop was initially scheduled for 11 hours but lasted closer to 14 hours instead. At 10:11 am on June 15, Piccard climbed into the solar-powered aircraft and took off for Washington DC.

The Solar Impulse team ultimately decided to split the fourth leg of the Across America 2013 flight into two segments because of strong headwinds that were forecast for the region. They knew such strong winds would make it difficult to complete the St. Louis-Washington DC leg in less than 24 hours, which is the maximum time allotted for pilots to fly in the single-seater.

After an additional 14 hours in the air for the fourth leg, Piccard landed safely in the District of Columbia at 12:15 am EDT on June 16, 2013.

“To land in the Capital of the United States has a dual significance for me: On the one hand, it proves the reliability and potential of clean technologies and this is crucial in pushing our message forward,” explained Piccard after landing in DC. “On the other hand, to be hosted by the Smithsonian Institution is an honor for Solar Impulse. The capsule of my around-the-world balloon flight is already displayed in the Air and Space Museum and I hope one day a second Swiss aircraft will join the collection…” he said, likely in reference to the Solar Impulse HB-SIA prototype.

Borschberg added, “with the successful completion of these last four US flights, we have shown that we are capable of coping with challenging meteorological conditions for our weather-sensitive plane and for our ground operations, and that we could find each time the right solutions to move forward. It has been a succession of fruitful learnings preparing us for the 2015 world tour.”

The team is scheduled to meet with Secretary of Energy Ernest Moniz during a press conference on Monday. But first, the Solar Impulse will go on public display Sunday afternoon from 1:00 to 5:00 p.m. outside the Steven F. Udvar-Hazy Center at the Smithsonian National Air and Space Museum (NASM). Several other events will also be held throughout the week of June 17-23.

The last leg of the Across America 2013 project will see Borschberg climb aboard the HB-SIA solar-powered aircraft for a final time. He will pilot the aircraft from Washington DC to the final American destination, New York City, in early July. The departure date for that last flight, which is scheduled to land at JFK International Airport, will depend on weather conditions.

Sugar Can Stress Out Your Heart

Brett Smith for redOrbit.com – Your Universe Online

A new study in the Journal of the American Heart Association has found regularly exposing the heart to too much sugar can lead to heart failure.

“When the heart muscle is already stressed from high blood pressure or other diseases, and then takes in too much glucose, it adds insult to injury,” said study co-author Dr. Heinrich Taegtmeyer, a medical school professor of cardiology at the University of Texas Health Science Center at Houston.

According to the study, the glucose metabolite molecule, glucose 6-phosphate (G6P), can cause stress to the heart — changing muscle proteins and disrupting heart pump function in a way that leads to heart failure.

To reach their conclusion, Taegtmeyer and colleagues performed several experiments on rat hearts, ex vivo, or outside the rat's body. Experiments were performed this way to “elucidate the mechanisms” suggested by previous research and to allow for more control over conditions, the study said.

The research team also performed tests on deficient heart tissue taken from eleven patients at the Texas Heart Institute who had heart muscle removed during corrective surgery. Biochemical and genetic analyses of both sets of experiments revealed the damage caused by G6P.

Taegtmeyer noted the study's findings could have significant ramifications for the prevention of heart failure, a disease with a post-diagnosis survival rate of 50 percent that affects some 5 million Americans, according to the Centers for Disease Control and Prevention.

“Treatment is difficult,” Taegtmeyer said. “Physicians can give diuretics to control the fluid, and beta-blockers and ACE inhibitors to lower the stress on the heart and allow it to pump more economically.”

“But we still have these terrible statistics and no new treatment for the past 20 years,” he added.

According to a statement from the American Heart Association, the study has already led to potential new treatments. Rapamycin, an immunosuppressant, and metformin, a diabetes medication, have been found to disrupt the signaling of G6P and boost cardiac power in small animal studies.

“These drugs have a potential for treatment and this has now cleared a path to future studies with patients,” Taegtmeyer said.

Heart failure, marked by the ineffective pumping of blood, can be caused by heart attack, diabetes, high blood pressure and viral infections. The AHA study comes just after a team of California-based researchers published a study showing the progress made by newer treatments for the disease.

Medical advancements over the past 20 years have led to a panoply of new treatments, and researchers from UCLA decided to look into their effectiveness.

The study included 2,500 patients who had been treated at a UCLA medical facility from 1993 to 2010. Three drug-based treatments (ACE inhibitors, beta blockers, and aldosterone antagonists) came into widespread use for heart failure over that period. Meanwhile, the use of implanted automatic heart defibrillators in heart failure patients rose from 11 percent to 68 percent. The devices correct abnormal heart rhythms, a frequent cause of sudden death.

The California researchers found death rates were 42 percent lower for patients in the most recent treatment group, between 2005 and 2010, than for the patients in the 1990s, mostly due to the drop in sudden deaths.