Mindfulness Linked To Lower Stress Hormones

April Flowers for redOrbit.com – Your Universe Online

New research from the Shamatha Project at the University of California, Davis suggests that focusing on the present rather than letting the mind drift may help lower levels of stress hormones.

Focusing your mind and mental resources on immediate experience is an aspect of mindfulness that can be improved by meditation training.

“This is the first study to show a direct relation between resting cortisol and scores on any type of mindfulness scale,” said Tonya Jacobs, a postdoctoral researcher at the UC Davis Center for Mind and Brain.

Cortisol is a hormone produced by the adrenal gland. High levels of cortisol have been linked to physical or emotional stress, with prolonged release contributing to wide-ranging, adverse effects on a number of physiological systems.

The Shamatha Project is a comprehensive, long-term, control-group study of the effects of meditation training on the mind and body that has drawn attention from scientists and Buddhist scholars alike, including the Dalai Lama. The new findings are the latest results to come from the Shamatha Project, led by Clifford Saron. Saron is an associate research scientist at the UC Davis Center for Mind and Brain.

The study, published in the journal Health Psychology, employed a questionnaire to measure aspects of mindfulness among participants before and after an intensive, three-month meditation retreat. Participants’ salivary cortisol levels were measured as well.

Buddhist scholar and teacher B. Alan Wallace of the Santa Barbara Institute for Consciousness Studies trained participants in such attentional skills as mindfulness of breathing, observing mental events, and observing the nature of consciousness during the long retreat. Individuals also practiced cultivating benevolent mental states. Such states include loving kindness, compassion, empathic joy and equanimity.

At the individual level, the study found that higher mindfulness scores were associated with lower cortisol levels both before and after the retreat. Participants whose mindfulness scores increased over the retreat also showed a decrease in cortisol.

“The more a person reported directing their cognitive resources to immediate sensory experience and the task at hand, the lower their resting cortisol,” Jacobs said.
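The association described here is, statistically, a negative correlation between two measurements. As a rough sketch of the arithmetic (the paired values below are invented for illustration, not the study's data), a Pearson correlation coefficient can be computed directly:

```python
# Illustration of the kind of association reported: a negative Pearson
# correlation between mindfulness scores and resting cortisol. All numbers
# below are invented for demonstration; they are not the study's data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired observations: (mindfulness score, salivary cortisol)
mindfulness = [2.1, 2.8, 3.0, 3.5, 3.9, 4.2, 4.8]
cortisol = [14.2, 12.9, 13.1, 11.0, 10.4, 9.8, 8.5]

print(round(pearson_r(mindfulness, cortisol), 2))  # strongly negative
```

A coefficient near -1 indicates the inverse relationship the researchers report; as the article stresses, correlation alone does not establish which way the effect runs.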

Jacobs stresses that the research did not show a direct cause and effect. She notes that the effect could run either way, in fact. Reduced levels of cortisol could lead to improved mindfulness, rather than the other way around. Mindfulness scores on the questionnaire increased from pre- to post-retreat. Cortisol levels, however, did not change overall.

Training the mind to focus on the present, according to Jacobs, may reduce the propensity to ruminate about the past or worry about the future. Thought processes such as these have been linked to cortisol release.

“The idea that we can train our minds in a way that fosters healthy mental habits and that these habits may be reflected in mind-body relations is not new; it’s been around for thousands of years across various cultures and ideologies,” Jacobs said. “However, this idea is just beginning to be integrated into Western medicine as objective evidence accumulates. Hopefully, studies like this one will contribute to that effort.”

For the purposes of this study, Saron notes that the researchers used the term “mindfulness” to refer to behaviors that are reflected in a particular mindfulness scale, which was the measure used in the study.

“The scale measured the participants’ propensity to let go of distressing thoughts and attend to different sensory domains, daily tasks, and the current contents of their minds. However, this scale may only reflect a subset of qualities that comprise the greater quality of mindfulness, as it is conceived across various contemplative traditions,” he said.

Prior Shamatha Project research has shown that the meditation retreat had positive effects on visual perception, sustained attention, socio-emotional well-being, resting brain activity and on the activity of telomerase, an enzyme important for the long-term health of body cells.

Better-Educated Parents Lead To Better-Nourished Children

Lee Rannals for redOrbit.com — Your Universe Online

Researchers wrote in the journal Public Health Nutrition that the more educated the parents, the less likely their child is to eat fatty and sugary foods.

The study was performed on nearly 15,000 children between the ages of two and nine from eight European countries: Italy, Estonia, Cyprus, Belgium, Sweden, Hungary, Germany and Spain.

The researchers found that parents with a lower level of education feed their children foods rich in sugars and fats more often than parents with a higher level of education, who feed their children more products of a higher nutritional quality, such as vegetables, fruit, pasta, rice and wholemeal bread.

“The greatest differences among families with different levels of education are observed in the consumption of fruit, vegetables and sweet drinks,” Juan Miguel Fernández Alvira, the study’s author and a researcher at the University of Zaragoza, explained to SINC.

The researchers say their study implies a greater risk of developing obesity in children from less advantaged socio-cultural groups.

“The [programs] for the prevention of childhood obesity through the promotion of healthy eating habits should specifically tackle less advantaged social and economic groups, in order to [minimize] inequalities in health,” concludes Fernández Alvira.

The World Health Organization (WHO) says nearly 40 million children under the age of five were overweight in 2010. The recommendations for children over two are not much different from those for adults. Experts say their diet should include cereals, fruit, vegetables, dairy products, lean meats, fish, poultry, eggs and nuts.

Dieticians say parents should offer children a wide variety of foods and avoid using food as a method to reward or punish behavior.

University of Illinois researchers wrote in January about how adding minutes to family mealtime might help keep away childhood obesity. The team said that children in families who engage each other over a 20-minute meal four times a week weighed less than kids who left the table after 15 to 17 minutes.

Scientists are finding more and more reasons to try and put an end to childhood obesity. A UCLA study found that consequences of childhood obesity could actually be immediate. Children who are overweight face nearly twice the risk of having three or more reported medical, mental or developmental conditions than normal weight children.

“The findings should serve as a wake-up call to physicians, parents and teachers, who should be better informed of the risk for other health conditions associated with childhood obesity so that they can target interventions that can result in better health outcomes,” said lead author Dr. Neal Halfon, a professor of pediatrics, public health and public policy at UCLA.

Theory And Practice Are Key To Optimized Broadband, Low-Loss Optical Metamaterials

Penn State

The union of theory and practice makes broadband, low-loss optical devices practical, which is why two groups of Penn State engineers collaborated to design custom optical metamaterials that are easy to manufacture.

Metamaterials are manufactured materials that derive their unusual properties from structure rather than composition alone, and they possess exotic properties not usually found in nature. Nanostructured metamaterials appear different to signals of different frequencies. They are dispersive, so if researchers manipulate this material dispersion, they gain comprehensive control of device performance across a band of frequencies.

In the past, to control the optics of metamaterials, researchers used complicated structures including 3-dimensional rings and spirals that are difficult if not impossible to manufacture in large numbers and small sizes at optical wavelengths. From a practical perspective, simple and manufacturable nanostructures are necessary for creating high-performance devices.

“We must design nanostructures that can be fabricated,” said Theresa S. Mayer, Distinguished Professor of Electrical Engineering and co-director of Penn State’s nanofabrication laboratory.

Designing materials that can allow a range of wavelengths to pass through while blocking other wavelengths is far more difficult than simply creating something that will transmit a single frequency. Minimizing the time domain distortion of the signal over a range of wavelengths is necessary, and the material also must be low loss.

“We don’t want the signal to change as it passes through the device,” said Jeremy A. Bossard, postdoctoral fellow in electrical engineering.

The majority of what goes in must come out, with little absorption or distortion of the signal waveform due to the metamaterial dispersion.

“What we do is use global optimization approaches to target, over wide bandwidths, the optical performance and nanofabrication constraints required by different design problems,” said Douglas H. Werner, John L. and Genevieve H. McCain Chair Professor of Electrical Engineering. “The design methodology coupled with the fabrication approach is critically important.”

The design team looked at existing fishnet structured metamaterials and applied nature-inspired optimization techniques based on genetic algorithms. They optimized the dimensions of features such as the size of the fishnet and the thicknesses of the materials. One of the transformative innovations made by the researchers was the inclusion of nanonotches in the corners of the fishnet holes, creating a pattern that could be tuned to shape the dispersion over large bandwidths. They reported their approach in today’s (Mar. 28) online issue of Scientific Reports.
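A genetic algorithm evolves a population of candidate designs through selection, crossover and mutation. The sketch below is a generic, minimal version of the technique: the objective function is made up and merely stands in for the real broadband electromagnetic figure of merit, so this illustrates the optimization style, not the researchers' actual code.

```python
import random

# Generic sketch of a genetic algorithm of the kind described above.
# The objective is invented: it rewards candidate "designs" whose three
# normalized parameters approach made-up target values. In the real
# problem the parameters would be fishnet dimensions, layer thicknesses
# and notch sizes, scored by electromagnetic simulation.

random.seed(0)

TARGET = [0.30, 0.55, 0.20]  # hypothetical normalized design targets

def fitness(genes):
    """Lower is better: squared error against the stand-in targets."""
    return sum((g - t) ** 2 for g, t in zip(genes, TARGET))

def mutate(genes, rate=0.2, scale=0.05):
    """Randomly perturb some genes, clamped to the [0, 1] design range."""
    return [min(1.0, max(0.0, g + random.gauss(0, scale)))
            if random.random() < rate else g
            for g in genes]

def crossover(a, b):
    """Single-point crossover of two parent designs."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# Evolve a population of candidate designs.
pop = [[random.random() for _ in TARGET] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness)
    survivors = pop[:10]  # selection: keep the fittest designs
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(20)]
    pop = survivors + children

best = min(pop, key=fitness)
print(best, fitness(best))
```

In the published work, evaluating a candidate would require a full electromagnetic simulation rather than a squared-error formula, but the select-cross-mutate loop is the same basic idea.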

“We introduced nanonotches in the corners of the air holes to give a lot more flexibility to independently control the properties of permittivity and permeability across a broad band,” said Werner. “The conventional fishnet doesn’t have much flexibility, but is easy to fabricate.”

Permittivity measures the ease or difficulty of inducing an electric field in a material, while permeability measures the ease or difficulty of inducing a magnetic field. Theoretically, manipulating permittivity and permeability allows tuning of the metamaterial across a range of wavelengths and creates the desired index of refraction and impedance.
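In the standard textbook relations (not specific to this work), the refractive index is n = √(εᵣμᵣ) and the wave impedance is Z = Z₀√(μᵣ/εᵣ), where εᵣ and μᵣ are the relative permittivity and permeability and Z₀ is the impedance of free space. A small numerical sketch, with invented material values:

```python
import cmath
import math

# Textbook relations assumed here:
#   refractive index  n = sqrt(eps_r * mu_r)
#   wave impedance    Z = Z0 * sqrt(mu_r / eps_r)
# where eps_r and mu_r are the relative permittivity and permeability.

Z0 = math.sqrt(4e-7 * math.pi / 8.8541878128e-12)  # free-space impedance, ~376.7 ohms

def refractive_index(eps_r, mu_r):
    return cmath.sqrt(eps_r * mu_r)

def impedance(eps_r, mu_r):
    return Z0 * cmath.sqrt(mu_r / eps_r)

# Invented material values for illustration:
print(refractive_index(2.25, 1.0))  # a glass-like material: n = 1.5
print(impedance(1.0, 1.0))          # matched to free space: ~376.7 ohms
```

Matching a device's impedance to free space (Z ≈ Z₀) requires εᵣ ≈ μᵣ, which is one reason independent control over both quantities matters.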

Theory may provide a solution, but can that solution become reality? The fabrication team placed constraints on the design to ensure that the material could be manufactured using electron-beam lithography and reactive ion etching. The initial material was a three-layer sandwich of gold, polyimide and gold on oxidized silicon. Once the silicon dioxide mask and the electron beam resist were removed, the researchers were left with an optical metamaterial with the desired properties.

In this case they created a band pass filter, but the same principles can be applied to many optical devices used in optical communications systems, medicine, testing and characterization or even optical beam scanning if the metamaterial is shaped to form a prism.

Another use of this metamaterial could be in conjunction with natural materials that do not have the desired properties for a specific optical application.

“All materials have a natural dispersion,” said Mayer. “We might want to coat a natural material in some regions to compensate for the dispersion.”

According to Werner, currently the only way to compensate is to find another natural material that would do the job. Only rarely does such a material exist.


Is it Time to Shake Our Salt Addiction?

Peter Suciu for redOrbit.com — Your Universe Online

While it is easy enough today to head down to the grocery store and pick up a container of salt, in earlier times, salt wasn’t so readily available or even affordable. One of the oldest natural preservatives and seasonings, salt has long been essential to life. And in parts of the world where salt was hard to obtain, it was sometimes traded ounce for ounce with gold.

This resulted in a salt trade from North Africa to West Africa, where caravans would carry the white mineral across the Sahara to trading centers like Djenne and Timbuktu. In the Roman Empire, soldiers were paid in salt, which is where the term “worth one’s salt” came from, as well as the word “salary.”

Today salt isn’t so hard to come by. In fact, it can be quite difficult not to ingest more than the amount recommended by health professionals. Not only can salt be readily purchased at the grocery store, it is already in much of the prepared and packaged food on our tables. According to the FDA, more than 75 percent of dietary sodium comes from eating packaged and restaurant foods. A recent study even suggested that children are getting too much salt from pre-packaged food.

A new review published in the March 27 edition of The New England Journal of Medicine cited correlations between blood pressure and salt intake across a number of different studies.

Dr. Theodore A. Kotchen, professor of medicine and associate dean for clinical research at the Medical College of Wisconsin, was the lead author of the article. He is also the author or co-author of more than 200 peer-reviewed publications dealing with mechanisms of blood pressure control, hypertension treatment strategies and genes associated with hypertension.

Kotchen has noted a possible causal relationship between lowering salt intake and decreased blood pressure, which occurs in individuals who have been diagnosed with hypertension. And while the effect is less pronounced, Kotchen has also noted a link between salt intake and blood pressure in non-hypertensive individuals.

Recent studies have also indicated a possible link between reduced salt intake and decreased cardiovascular disease and mortality.

“Salt is essential for life, but it has been difficult to distinguish salt need from salt preference,” said Dr. Kotchen in a statement. “Given the medical evidence, it seems that recommendations for reducing levels of salt consumption in the general population would be justifiable at this time.”

There have been efforts to get people to shake the salt habit. Recent studies in Great Britain and Finland have resulted in national salt-reduction programs. In the United States, New York City has toyed with the idea of a salt ban for some years, going so far as to ban food donations to homeless shelters because the city can’t assess their salt, fat or fiber content.

However, while salt could be a cause for health concerns, Kotchen admits there’s no one-size-fits-all recommendation. He notes that a lower limit for salt consumption has not been clearly identified and maintains that in certain patient groups, less rigorous targets for salt reduction could be appropriate.

There is also the fact that salt could have benefits that outweigh possible health risks.

Salt could help those who don’t have access to clean water, as NPR reported recently that “Sun, salt and lime sounds like the beginnings of a cocktail recipe, but for some, it could mean cleaner, life-sustaining water.” The salty water may not be especially healthy or even taste good, but Joshua Pearce, associate professor of materials science and engineering at Michigan Tech, noted that it is an effective way to reduce pathogens in contaminated water.

Just as salt is crucial to life, so is water, and the salt added to purify the water is actually less than the amount found in Gatorade — proving that in our modern world salt is indeed everywhere.

Diarrheal Disease In Botswana Could Worsen With Climate Change

Virginia Tech researcher says climate drives diarrheal disease

In a National Science Foundation funded study, Kathleen Alexander, an associate professor of wildlife at Virginia Tech, found that climate drives a large part of diarrheal disease in Botswana, heightening the threat that climate change poses to vulnerable communities.

The only study of its kind in Sub-Saharan Africa is based on three decades of historical data and has important implications for arid countries around the world struggling with poverty and increasing health challenges.

Alexander, a veterinarian, teaches in Virginia Tech’s College of Natural Resources and Environment and conducts research at the Blacksburg, Va., campus and at her nonprofit research center, the Center for African Resources: Animals, Communities, and Land Use (CARACAL), in Chobe, Botswana.

Alexander’s research study, “Climate Change Is Likely to Worsen the Public Health Threat of Diarrheal Disease in Botswana,” was published today (March 26, 2013) in the International Journal of Environmental Research and Public Health.

Alexander and colleagues analyzed data on diarrheal disease from 1974, eight years after Botswana gained independence from British rule, through 2003.

“Botswana proactively set up a health surveillance program shortly after it became independent. Over such a long time period, however, it was not easy locating all the historical documents pertaining to diarrhea case incidence,” Alexander said.

“Finding such data in Africa is difficult, and this explains why long-term studies of climate and health interactions are uncommon,” she continued. “Our work indicates that there is a critical need to identify climate-health interactions across the continent and develop appropriate adaptive strategies in response.”

The Botswana Ministry of Health and the Ministry of Environment Wildlife and Tourism provided research assistance. The Wildize Foundation, an African conservation organization, supplemented funding. Researchers Marcos Carzolio and Eric Vance from Virginia Tech and Douglas Goodin from Kansas State University co-authored the study.

“Diarrheal disease is an important health challenge, accounting for the majority of childhood deaths globally and the second highest in Botswana,” Alexander said. “Our findings suggest that climate change will increase the occurrence of diarrhea and the burden of disease among vulnerable populations in Botswana and similarly affected regions.”

Botswana is an arid, landlocked country in southern Africa with a subtropical climate of distinct wet and dry seasons. Alexander and her co-authors evaluated monthly reports of diarrheal disease among patients treated at Botswana health facilities since 1974 and compared that data with climatic variables over that same period.

“Our analysis suggests that forecasted climate change increases in temperature and decreases in precipitation for the region are likely to increase dry season diarrheal disease incidence, while diarrheal disease incidence in the wet season is likely to decline,” Alexander explained.

Diarrheal case incidence peaks in both the wet and dry seasons in Botswana, with case incidence 20 percent higher, on average, in the dry season than in the wet season.

“We were not expecting diarrheal disease to be worse in the dry season,” Alexander pointed out. “These dry season diarrheal disease peaks occur during the hottest and driest times of the year, conditions that can increase fly activity and density. This is significant, as flies can be important in the transmission of diarrheal-disease-causing microorganisms.”

Alexander believes flies may provide an important dry season amplifying influence on factors already contributing to diarrheal disease. “It is an important area of research that we will be pursuing,” she added.

“Our results identify significant climate-health interactions and highlight the need for an escalated public health focus on controlling diarrheal disease in Botswana,” she continued. “Understanding the potential health impacts of climate change in low-income countries will be essential to developing mitigation and adaptive strategies designed to protect these vulnerable populations expected to be impacted the hardest but least able to adapt.”

“While our work identifies important climate-health interactions and increased vulnerability of Botswana to forecasted changes in regional climate,” Alexander cautioned, “it is important to remember that this does not account for effects of nonclimatic factors such as future improvements in sanitation infrastructure and hygiene. The impact of forecasted climate change on this disease syndrome is likely to be significantly reduced if present day public health deficiencies are fully identified and addressed.”

“It is essential, however, that we include affected communities in identifying climate change preparedness,” Alexander emphasized. “Lack of sociocultural considerations in public health planning can result in locally applied interventions being nonsustainable.”

Understanding climate variability as a determinant of infectious disease is increasingly seen as a cornerstone of climate change preparedness and an urgent area of need in Africa and elsewhere around the world.

“Much of the threat of climate change on health results from our vulnerabilities to environmental change. These vulnerabilities are primarily associated with the poor, who are most dependent on the environment and least able to adapt to changes in these systems,” she explained. “If we address current community health deficiencies now, climate change impacts are not likely to have such important and potentially devastating consequences in the future.”

Because of the magnitude of Alexander’s work in Botswana, she is one of three scholars selected as an African regional expert by the World Health Organization and the Convention on Biological Diversity secretariat to participate in a regional workshop in Mozambique April 2-5. As a specialist in disease ecology with associated ecological and human dimensions, she will make a presentation to leaders from various African countries on integrating health and biodiversity in policy and planning efforts.

“Kathy is a brilliant scholar who successfully connects her many skills to people in Botswana,” said Paul Winistorfer, dean of Virginia Tech’s College of Natural Resources and Environment. “She recognizes that her most important goal is to improve the lives and livelihood of these people, while respecting the human-wildlife interaction that is coupled to environmental sustainability.”


Star Analyst Bites Into Apple’s 2013 Product Offerings

Michael Harper for redOrbit.com — Your Universe Online

Gene Munster, an analyst for the investment bank Piper Jaffray and the loneliest iTV holdout, now claims the next version of the iPhone will be available at the end of June, only to be followed by that “cheap” iPhone in September.

In a note to investors, Munster also said that this June’s iPhone release would be the first major Apple event of the year, potentially dashing the hopes of those who expect a new and improved iPad or Retina iPad mini in April. Munster hasn’t given up hope on that iTV either. The analyst still believes that Apple will release an “actual TV” by the end of the year, parroting claims he made last year.

Many of the iPhone 5S rumors have been gravitating toward a few particular features, namely NFC, a fingerprint scanner, and the usual better camera and faster processor.

The Piper Jaffray analyst believes two of these features are very likely to show up in this year’s iPhone 5S, saying there’s only an “outside chance” that the S-version iPhone 5 will be NFC-ready.

Similarly, Munster hasn’t written off the chance of a fingerprint scanner but believes it’s more likely to appear in the iPhone 6. Apple purchased mobile security company AuthenTec last July for $356 million. Since then, many analysts and insiders have been predicting that Apple will start using biometric security features like a fingerprint scanner in future versions of the iPhone.

Like most, Munster believes that the rumored cheap iPhone will be aimed at emerging markets. Speaking to Bloomberg, Munster claimed that Apple will release a low-cost iPhone this September, complete with a deal with the world’s largest carrier, China Mobile. So far, Munster and his colleagues have been predicting that Apple will release this cheap, plastic iPhone to stake its claim in developing parts of the world — a market in which Munster believes the phone will be worth about $135. Munster also said he predicts this cheap iPhone will be available unlocked for around $250, significantly cheaper than the current unlocked iPhone 5, which is priced at $649.

“We continue to believe Apple will have a cheaper phone product to address the emerging markets,” writes Munster in his letter to investors, which was obtained by MacRumors.

“In recent public comments, Tim Cook noted that the original iPod cost was $399 and eventually the company released a $49 iPod Shuffle which addressed a broader market. We believe Apple will likely introduce a cheaper device in the September quarter.”

Earlier this year, Munster placed his bets on a March release of a new iPad and iPad mini. Something has changed in the last two months, however, and he now believes that Apple’s first big release will be a June launch of the iPhone 5S.

According to Munster, if Apple does release a new iPad, they’ll do it quietly and without the usual fanfare. Presumably, this also means there won’t be any new Macs for the next few months, a move which Munster believes will cause Apple to miss its sales numbers for smartphones and computers.

Finally, Munster is once again saying Apple will release its HDTV by the end of this year, a claim he often made in 2012. As for the “iWatch,” which has been making headlines lately, the analyst suspects Apple won’t release this gadget until 2014. He also said wearable computing technology could cannibalize the smartphone market in five to ten years.

Traditional LDL Cholesterol Readings May Not Be Accurate Enough, Researchers Claim

redOrbit Staff & Wire Reports – Your Universe Online

The formula currently used to calculate an individual’s “bad” cholesterol levels is often inaccurate, sometimes underestimating low-density lipoprotein (LDL) levels in those most at risk, researchers from the Johns Hopkins University School of Medicine have discovered.

Their study, published online in the Journal of the American College of Cardiology, explains that the Friedewald equation — which has been used to gauge a patient’s LDL levels since 1972 — is an estimate and not an exact measurement. It is currently used by doctors to assess a patient’s risk of having a heart attack due to the accumulation of plaque in arteries, but the new research calls its accuracy into question.

“In our study, we compared samples assessed using the Friedewald equation with a direct calculation of the LDL cholesterol. We found that in nearly one out of four samples in the ‘desirable’ range for people with a higher heart disease risk, the Friedewald equation had it wrong,” Seth Martin, lead author and clinical fellow at the hospital’s Ciccarone Center for the Prevention of Heart Disease, said in a statement. “As a result, many patients may think they achieved their LDL cholesterol target when, in fact, they may need more aggressive treatment to reduce their heart disease risk.”

“In patients with heart disease, we want to get their LDL level below 70 — that is the typical goal,” added Steven Jones, senior author and director of inpatient cardiology at The Johns Hopkins Hospital. Based on their findings, however, many people may falsely believe that their cholesterol targets have been met — especially those with high levels of triglycerides (a type of lipid or fat in the blood that can increase a person’s risk of cardiovascular disease).

Martin, Jones and their colleagues analyzed detailed lipid profiles for more than 1.3 million US adults dating from 2009 through 2011. The LDL cholesterol and blood lipid components in those samples had been measured using ultracentrifugation, a technique which uses a centrifuge to separate the particles so that they can be examined. They were then also evaluated using the Friedewald equation, and the results of each method were compared.

The Friedewald equation calculates LDL cholesterol using a specific formula: total cholesterol minus HDL cholesterol minus one-fifth of the triglyceride level, the researchers said. The result is then expressed in milligrams per deciliter. The Johns Hopkins team suggests an alternative method which they claim will be more accurate: the use of non-high-density lipoprotein (non-HDL) levels, in which the so-called “good” cholesterol is subtracted from total cholesterol.
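Both calculations are simple enough to sketch directly; the patient values below are hypothetical:

```python
# The two calculations described above, in code (values in mg/dL; the
# patient numbers are hypothetical).

def friedewald_ldl(total_chol, hdl, triglycerides):
    """Friedewald estimate of LDL: total minus HDL minus one-fifth of TG."""
    return total_chol - hdl - triglycerides / 5

def non_hdl(total_chol, hdl):
    """Non-HDL cholesterol: total cholesterol minus the 'good' cholesterol."""
    return total_chol - hdl

# Hypothetical patient: total 200, HDL 50, triglycerides 150 mg/dL
print(friedewald_ldl(200, 50, 150))  # 120.0
print(non_hdl(200, 50))              # 150
```

For this hypothetical patient the two readings differ by 30 mg/dL, in line with the average gap the researchers describe.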

The revised calculation would include not only LDL, but also very low density lipoprotein (VLDL) particles, which are also known to cause plaque in a person’s heart arteries. On average, the non-HDL reading would be approximately 30 points higher than LDL cholesterol readings calculated using the Friedewald method, the researchers said. That number could vary, though, and Martin believes it could provide a better measurement of whether or not a patient needs to make specific lifestyle changes or alter his or her medications.

“Non-HDL cholesterol is a much better target for quantifying risk of plaques in coronary arteries,” Jones said. “Looking at non-HDL cholesterol would make it simpler and more consistent, and would enable us to provide our patients with a better assessment.”

In addition to Martin and Jones, authors of the study included Michael J. Blaha, Mohamed B. Elshazly, John W. McEvoy, Parag H. Joshi, Peter O. Kwiterovich, Andrew P. DeFilippis and Roger S. Blumenthal from Johns Hopkins; Eliot A. Brinton from the Utah Foundation for Biomedical Research and Utah Lipid Center; Peter P. Toth from the University of Illinois College of Medicine at Peoria; and Krishnaji R. Kulkarni and Patrick D. Mize from Atherotech Diagnostics Lab in Birmingham, Alabama.

Supreme Court May Decide Whether We Own Our Genes

Brett Smith for redOrbit.com — Your Universe Online

They may be responsible for everything in your life from conception to death, and they are inside every living cell in your body — but you do not own your own genes, legally speaking.

According to a report in Genome Medicine, patents essentially cover the entire human genome, hampering research and raising the question of “genomic liberty.”

The legal standing of genomic patents could change next month when the Supreme Court reviews patent rights for two key breast and ovarian cancer genes, BRCA1 and BRCA2, which include segments of genetic code as small as 15 nucleotides, known as 15mers.

“This is, so to speak, patently ridiculous,” said report co-author Dr. Christopher E. Mason of Weill Cornell Medical College. “If patent claims that use these small DNA sequences are upheld, it could potentially create a situation where a piece of every gene in the human genome is patented by a phalanx of competing patents.”

In their report, Mason and Dr. Jeffrey Rosenfeld, an assistant professor of medicine at the University of Medicine & Dentistry of New Jersey, looked at patents for two different categories of DNA fragments: long and short. They found that 41 percent of the human genome is covered by “long” DNA patents that can include whole genes. And because many genes share similar patented sequences within their code, the combination of all the “short” DNA patents covers 100 percent of the genome.

“This demonstrates that short patent sequences are extremely non-specific and that a 15mer claim from one gene will always cross-match and patent a portion of another gene as well,” Mason said. “This means it is actually impossible to have a 15mer patent for just one gene.”

To reach their conclusions, the researchers first looked at small sequences within BRCA1 and noticed that one existing BRCA1 patent also covered almost 690 other human genes. Some of these genes are unrelated to breast cancer, instead being associated with brain development and heart functioning.

Next, researchers determined how many known genes are covered by 15mers in current patent claims. They found 58 patents covered at least ten percent of all bases of all human genes. The broadest patent claim matched 91.5 percent of human genes. When the team took patented 15mers and matched them to known genes, they found 100 percent of known genes are patented.

Finally, the team also looked at “long” DNA sequences from existing gene patents, ranging from a few dozen to thousands of base pairs. They found these long sequences added up to 41 percent of known human genes.

“There is a real controversy regarding gene ownership due to the overlap of many competing patent claims. It is unclear who really owns the rights to any gene,” Rosenfeld said. “While the Supreme Court is hearing one case concerning just the BRCA1 patent, there are also many other patents whose claims would cover those same genes.

“Do we need to go through every gene to look at who made the first claim to that gene, even if only one small part? If we resort to this rule, then the first patents to be granted for any DNA will have a vast claim over portions of the human genome,” he added.

Another legal question surrounds patented DNA sequences that cross species boundaries. The researchers found one company has the rights to 84 percent of all human genes for a patent they received for cow breeding.

What is Topography?

Hi, my name is Emerald Robinson, and in this “What is” video, we’re going to answer the question, “What is Topography?”
Topography is the study of a land’s surface shape – its hills and mountains, valleys, rivers, and craters. Topographers analyze these features, whether they’re on the earth, the moon, an asteroid, or on a distant planet.
The primary goal of topography is to find out the latitude (the distance north or south of the equator), the longitude (the distance east or west of the Prime Meridian), and the elevation (the distance above sea level) of various landforms.
Topographers study both the geology and the geography of land’s features. These qualities also make up what we call an area’s terrain. Many times, topographers use information about the earth’s terrain to create a topographic map.
Topographic maps are useful because they are able to show elevation on a flat piece of paper. Elevation is indicated by a line, usually curved, called a “contour line.” For example, a mountain peak with an elevation of 10,000 feet would be represented by a contour line drawn through all the continuous points on the peak that are 10,000 feet above sea level.
Contour lines are usually labeled with the elevation they represent, and they can be used to tell the slope of a landform. Closely spaced contour lines mean a steep slope, while contour lines with more space between them mean a gentler slope.
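That rule of thumb is just rise over run: the fixed contour interval divided by the horizontal distance between adjacent lines. A short sketch, using hypothetical figures:

```python
import math

def slope_from_contours(contour_interval_ft, horizontal_spacing_ft):
    """Slope between adjacent contour lines: rise (contour interval) over run."""
    grade = contour_interval_ft / horizontal_spacing_ft
    angle_deg = math.degrees(math.atan(grade))
    return grade, angle_deg

# Closely spaced lines mean a steep slope...
print(slope_from_contours(40, 100))   # 40 ft rise over 100 ft run: 40% grade
# ...while widely spaced lines mean a gentle one.
print(slope_from_contours(40, 1000))  # same rise over 1,000 ft: 4% grade
```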
Topography is used to determine where to safely construct new buildings, to figure out where rivers and streams flow, to help dig mines and build dams, and to plan and repair roads.
You may have even used topography at a local, state, or national park to plan a hike, or to find out where to canoe, fish, and do other outdoor activities.

What is a Mineral?

Hi, I’m Emerald Robinson, and in this “What Is?” video, we’re going to answer the question, “What is a Mineral?”

Mineralogists define a mineral as a substance that:

● Is natural, which means that humans don’t manufacture it;

● Is a solid, that is, it keeps its form at room temperature; and

● Has a distinct chemical make-up, which means it has a specific chemical formula that’s consistent throughout the mineral. This is true whether the mineral is made of a single element, or of a combination of elements. A specific chemical make-up distinguishes minerals from rocks, which are mixtures of many different materials.

In addition to these qualities, most mineralogists agree that minerals are “abiogenic,” that is, not produced by living organisms, and that their atoms are arranged in a specific order. This property results in many minerals taking the form of crystals.

Mineralogists have identified almost 5,000 unique kinds of minerals. They’re classified by physical properties such as color, hardness, luster (how they reflect light), and radioactivity.

Despite earth’s wide variety of minerals, about 90% of our planet’s crust is made of minerals composed largely of silicon and oxygen, called “silicate minerals.” The most common silicate minerals are called feldspar and quartz.

Minerals don’t just make up the earth – they also make up about 4% of your body mass. Because the body can’t make minerals, it’s important that we get them in our diet. Some of the more important ones include:

● Calcium and phosphorous, important for bones and teeth,

● Iron, essential in red blood cells, and

● Potassium and sodium, required for healthy nerve cells.

● Trace minerals like copper, zinc, and iodine, which we need in small amounts.

Humans use minerals in everything from cereals to batteries, shampoos, and pet foods. In fact, at least 30 minerals are present in the computer you’re using to view this video.

Obesity Breathalyzer? Study Finds Link Between BMI And Gases In The Breath

Brett Smith for redOrbit.com – Your Universe Online

Law enforcement has been using breathalyzers for years to protect the public from drunk drivers, but a new study from Cedars-Sinai Medical Center in Los Angeles suggests that breathalyzers might serve to protect certain people from doughnuts and other fatty foods.

According to the study, which appeared in the Journal of Clinical Endocrinology & Metabolism, a higher body mass index (BMI) is associated with higher concentrations of hydrogen and methane gas in the breath. These gases are associated with a methane-producing intestinal microorganism called Methanobrevibacter smithii (M. smithii).
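For reference, BMI is simply weight divided by height squared (kilograms per square meter); a quick sketch with a made-up example:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# Hypothetical example: a 70 kg person who is 1.75 m tall.
print(round(bmi(70, 1.75), 1))  # 70 / 1.75**2 = 22.9
```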

“Normally, the collection of microorganisms living in the digestive tract is balanced and benefits humans by helping them convert food into energy,” said lead author Dr. Ruchi Mathur, director of the hospital’s Outpatient Diabetes Treatment and Education Center in the Division of Endocrinology. “When M. smithii becomes overabundant, however, it may alter the balance in a way that makes the human host more likely to gain weight and accumulate fat.”

In the study, researchers analyzed the breath-chemical content of over 790 people. An analysis of the breath showed four different patterns: normal breath content, higher levels of methane, higher concentrations of hydrogen, or higher levels of both gases. Participants whose breath test contained higher concentrations of both gases tended to have a higher BMI and higher percentages of body fat.

“This is the first large-scale human study to show an association between gas production and body weight — and this could prove to be another important factor in understanding one of the many causes of obesity,” Mathur said.

As M. smithii metabolizes hydrogen in the digestive tract, it produces methane, which is eventually exhaled by the individual. The scientists say this microbial action allows for a more efficient extraction of nutrients from food — resulting in a greater risk of weight gain and obesity for the human host.

“Essentially, it could allow a person to harvest more calories from their food,” Mathur said.

It should be noted that Mathur’s proposed connection between microbial action and obesity is only a hypothesis; the study did not establish a causal relationship between the two. Previous research has shown conflicting evidence on the relationship between M. smithii and body fat.

A June 2012 article published in the International Journal of Obesity by a group of French researchers found that obesity was associated with the intestinal microbe Lactobacillus reuteri and with lower levels of M. smithii. And a 2009 study in the journal PLOS ONE by the same research group found that high levels of M. smithii were associated with anorexia.

Despite some conflicting evidence, Mathur continues to explore the potential connection between M. smithii and body fat. She is currently working on a study that looks at the effects of targeting the microorganism with a specific antibiotic.

“We’re only beginning to understand the incredibly complex communities that live inside of us,” she said. “If we can understand how they affect our metabolism, we may be able to work with these microscopic communities to positively impact our health.”

Urban Vegetation Deters Crime In Philadelphia

Contrary to conventional wisdom, well-maintained vegetation can lower the rates of certain types of crime in cities, such as aggravated assault, robbery and burglary, according to a Temple University study, “Does vegetation encourage or suppress urban crime? Evidence from Philadelphia, PA,” published in the journal Landscape and Urban Planning.

“There is a longstanding principle, particularly in urban planning, that you don’t want a high level of vegetation, because it abets crime by either shielding the criminal activity or allowing the criminal to escape,” said Jeremy Mennis, associate professor of geography and urban studies at Temple. “Well-maintained greenery, however, can have a suppressive effect on crime.”

Mennis, along with environmental studies major Mary Wolfe, examined socioeconomic, crime and vegetation data — the latter from satellite imagery — after establishing controls for other key socioeconomic factors related to crime, such as poverty, educational attainment and population density.

They found that the presence of grass, trees and shrubs is associated with lower crime rates in Philadelphia, particularly for robberies and assaults.

The authors surmise this deterrent effect is rooted in the fact that maintained greenery encourages social interaction and community supervision of public spaces, as well as in the calming effect that vegetated landscapes may impart, reducing psychological precursors to violent acts. They offer their findings and related work as evidence for urban planners to use when designing crime prevention strategies, especially important in an age when sustainability is valued.

Mennis said that rather than decreasing vegetation as a crime deterrent, their study provides evidence that cities should explore increasing well-maintained green spaces.

“Increasing vegetation, supporting sustainability – they are a nice complement to so many city initiatives beyond increasing aesthetics and improving the environment,” he said.

“Reducing stormwater runoff, improving quality of life, reducing crime – all of these objectives are furthered by increasing well-managed vegetation within the city.”


World’s First Two-Headed Bull Shark Found In Gulf Of Mexico

Brett Smith for redOrbit.com – Your Universe Online

According to a report in the Journal of Fish Biology, American marine biologists discovered the first-ever two-headed bull shark last year in the Gulf of Mexico. The phenomenon, known as dicephalia, is distinct from conjoined twins, whose bodies are connected in utero, and has previously been observed in other marine species such as blue and tope sharks.

“This is certainly one of those interesting and rarely detected phenomena,” said report co-author Michael Wagner, assistant professor of fisheries and wildlife at Michigan State University. “It’s good that we have this documented as part of the world’s natural history, but we’d certainly have to find many more before we could draw any conclusions about what caused this.”

Creatures with dicephalia often die quickly after birth, making their discovery extremely rare. The researchers were fortunate that a fisherman found the two-headed shark after opening the uterus of an adult shark he had caught.

“You’ll see many more cases of two-headed lizards and snakes,” Wagner said. “That’s because those organisms are often bred in captivity, and the breeders are more likely to observe the anomalies.”

According to Wagner, the two-headed shark would never have survived in the wild, and it died very quickly after being discovered. He added that the shark’s body development suffered because of the energy demanded by growing and maintaining two heads.

“It had very developed heads, but a very stunted body,” he told OurAmazingPlanet.

After transporting the shark to Wagner’s lab back at Michigan State, the marine biologist and his team were able to use magnetic resonance imaging (MRI) scanners to examine the anomalous shark in greater detail. The images revealed that the shark had two distinct heads, hearts and stomachs. The dual physiologies joined together in the back half of the animal and terminated in a single tail.

Wagner cautioned against hastily attributing the shark’s abnormal physiology to pollution in order to further a conservation agenda.

“Given the timing of the shark’s discovery with the Deepwater Horizon oil spill, I could see how some people may want to jump to conclusions,” Wagner said. “Making that leap is unwarranted. We simply have no evidence to support that cause or any other.”

Despite Wagner’s assertion, there have been many reports linking deformed creatures in the Gulf of Mexico to the Deepwater Horizon accident in 2010. Last year, Al Jazeera reported that the oil and the chemical dispersants used to clean up the spill were causing mutations such as eyeless shrimp and clawless crabs.

“Disturbingly, not only do the shrimp lack eyes, they even lack eye sockets,” Louisiana commercial fisher Tracy Kuhn told the Arab news network.

“Some shrimpers are catching these out in the open Gulf [of Mexico],” she added. “They are also catching them in Alabama and Mississippi. We are also finding eyeless crabs, crabs with their shells soft instead of hard, full grown crabs that are one-fifth their normal size, clawless crabs, and crabs with shells that don’t have their usual spikes … they look like they’ve been burned off by chemicals.”

Beyond these reported mutations, the fishing industry has also seen a significant drop in productivity since the spill.

Study Says Angry Tones Influence How Baby Brains Process Emotion

April Flowers for redOrbit.com – Your Universe Online
Parents have long known that the tone of their voice affects a baby’s mood, but a new study from the University of Oregon shows that a baby’s exposure to parental arguments is associated with the way the infant’s brain processes stress and emotions.
The study revealed that infants respond to an angry tone of voice, even when they are asleep. The findings of the study were published in a recent issue of Psychological Science.
Babies’ brains develop in response to the environments and experiences that they encounter. This high level of neural plasticity, however, comes with a certain degree of vulnerability. Severe stress, such as maltreatment or institutionalization, can have a significant negative impact on child development, according to previous research. Graduate student Alice Graham and her advisors, Phil Fisher and Jennifer Pfeifer, wondered what impact more moderate stressors might have on that vulnerability.
“We were interested in whether a common source of early stress in children’s lives — conflict between parents — is associated with how infants’ brains function,” explained Graham.
The research team took advantage of recent developments in functional magnetic resonance imaging (fMRI) with infants to answer their questions. The study participants were twenty infants, ranging in age from 6 to 12 months. They were brought into the lab at their regular bedtimes. While the children slept in the scanner, they were presented with nonsense sentences spoken in neutral, happy, mildly angry and angry tones by an adult male.
“Even during sleep, infants showed distinct patterns of brain activity depending on the emotional tone of voice we presented,” says Graham.
Babies from high-conflict homes showed a greater reactivity to very angry voice tones in brain areas linked to stress and emotional regulation. These areas include the anterior cingulate cortex, the caudate, the thalamus and the hypothalamus.
According to previous animal research, these brain areas play an important role in the impact of early life stress on development. Graham’s study suggests that the same might be true in human babies. The findings indicate that infants are not oblivious to their parents’ conflicts, and that exposure to such conflicts may influence the way babies’ brains develop to process emotions and stress.

Hormones In Saliva Predict Aggression And Violence In Boys

April Flowers for redOrbit.com – Your Universe Online

A simple saliva test could be an effective tool in predicting violent behavior, a new pilot study led by Cincinnati Children’s Hospital Medical Center indicates.

The study findings, published this week in Psychiatric Quarterly, suggest a link between aggression and the concentrations of certain hormones in saliva.

The research team collected saliva samples from 17 boys ages 7 to 9 who were admitted to the hospital for psychiatric care, in order to identify which children were most likely to show aggression and violence. Three samples were collected from each boy in one day shortly after admission. The samples were then tested for levels of three hormones: testosterone, dehydroepiandrosterone (DHEA) and cortisol. The team found a correlation between the levels of these hormones and the severity and frequency of aggression.

Drew Barzman, MD, a child and adolescent forensic psychiatrist at Cincinnati Children’s Hospital, and his team focused on a common problem in psychiatric units: the rapid, real-time assessment of violence among child and adolescent patients. Barzman also sees possible applications for a fast and accurate saliva test beyond predicting violence.

“We believe salivary hormone testing has the potential to help doctors monitor which treatments are working best for their patients,” said Barzman. “And because mental health professionals are far more likely to be assaulted on the job than the average worker, it could offer a quick way to anticipate violent behavior in child psychiatric units. Eventually, we hope this testing might also provide a tool to help improve safety in schools.”

The saliva test was used in conjunction with other aggressive behavior tools, such as the Brief Rating of Aggression by Children and Adolescents (BRACHA) questionnaire. BRACHA is an assessment tool developed by Barzman’s team to predict violence and aggression in the hospital.

“This study sample, while small, gives us the data we need to move forward,” added Barzman. “We have more studies planned before we can reach a definitive conclusion, but developing a new tool to help us anticipate violent behavior is our ultimate goal.”

Too Much TV Has Little Impact On A Child’s Social Development

Lawrence LeBlond for redOrbit.com – Your Universe Online
A recent study from the University of Otago, New Zealand, published in the journal Pediatrics, suggested that too much TV was making kids mean and antisocial. That study also showed that what mattered was not always how much TV children watched, but what types of shows they were watching.
However, a new study by the Medical Research Council (MRC) has found evidence that spending hours in front of the TV or playing video/computer games each day does not have as much of an influence on a child’s social development as previously suggested.
In a study published in the journal Archives of Disease in Childhood, researchers from the MRC examined primary school students and found there is no significant link between watching TV and bad behavior.
The study authors said that for children who watched TV for three or more hours per day, there was a slight increase in the risk of developing antisocial behaviors — such as stealing, bullying or fighting — by the age of seven. However, other influences, such as parenting style, may better explain the link.
The MRC team also found no link between the amount of time playing video games and increases in bad behavior.
In the past, prolonged TV viewing has been linked to various behavioral and emotional problems in children, said the authors, but most research has focused exclusively on TV. The MRC study examined both TV viewing and video game playing with respect to their psychological and social impact on children between five and seven years old.
For the study, Dr. Alison Parkes and her colleagues followed 11,000 children who were part of the UK Millennium Cohort Study, which has been tracking the long-term health and development of UK children born between 2000 and 2002. When these children reached age five and then again at seven, their parents were asked to describe how well adjusted their kids were, using a Strengths and Difficulties Questionnaire (SDQ).
The SDQ contained five scales, measuring conduct, emotional symptoms, poor attention span, difficulties in making friends, and empathy and concern for others. The mothers were also asked to report how much time their children spent watching TV and playing computer and electronic games at the age of five.
At age five, nearly two-thirds of the kids watched TV for between one and three hours per day, with 15 percent watching more than three hours. Less than 2 percent watched none at all. As for playing video games, the authors found that only 3 percent of five-year-olds spent three or more hours per day on this activity.
After taking influential factors into account, including parenting and family dynamics, watching TV for three or more hours per day was associated with a very small increased risk of antisocial behavior between the ages of five and seven.
But the authors found no links between spending excess time in front of the tube and emotional or attention issues. And spending time playing video games had no impact. They noted that any link between heavy screen time and mental health may be indirect rather than direct, operating through factors such as increased sedentary behavior, difficulty sleeping, and impaired language development.
Dr. Parkes said it is wrong to blame social problems on TV. “We found no effect with screen time for most of the behavioral and social problems that we looked at and only a very small effect indeed for conduct problems, such as fighting or bullying.”
“Our work suggests that limiting the amount of time children spend in front of the TV is, in itself, unlikely to improve psychosocial adjustment,” she told BBC’s Michelle Roberts.
Professor Annette Karmiloff-Smith, of Birkbeck, University of London (BBK), said that researchers should focus more on the positive impact watching TV and playing video games may have on children, rather than the possible adverse effects.
“We are living in a world that is increasingly dominated by electronic entertainment, and parents are understandably concerned about the impact this might be having on their children’s well-being and mental health,” said Professor Hugh Perry, chair of the MRC’s neurosciences and mental health board. “This important study suggests the relationship between TV and video games and health is complex and influenced by many other social and environmental factors.”
“[The study] suggests that a cautionary approach to the heavy use of screen entertainment in young children is justifiable in terms of potential effects on well-being, particularly conduct problems, in addition to effects on physical health and academic progress shown elsewhere,” Parkes and her research team concluded.

Pacific Island Birds Vanished Without A Trace After The Arrival Of Man

April Flowers for redOrbit.com – Your Universe Online

The last region on Earth to be colonized by humans was home to more than 1,000 species of birds that went extinct shortly after people reached their island homes, new research from the Zoological Society of London (ZSL) and collaborators reveals.

Tropical Pacific islands like Hawaii and Fiji were an untouched paradise until the first people arrived almost 4,000 years ago, causing irreversible damage through overhunting and deforestation. Many birds disappeared as a result, but uncertainties in the fossil record have constrained our understanding of the sheer scale and extent of these extinctions.

Professor Tim Blackburn, Director of ZSL’s Institute of Zoology, says, “We studied fossils from 41 tropical Pacific islands, and using new techniques we were able to gauge how many extra species of bird disappeared without leaving any trace.”

On these islands alone, 160 species of non-passerine land birds — non-perching birds whose feet are adapted for specific functions, such as webbed feet for swimming — became extinct without a trace after the first humans arrived.

The largest order of birds comprises the passerines, or songbirds, which represent over half of the world’s total bird species. Passerines include such birds as flycatchers, birds of paradise, crows and many well-known garden birds. Non-passerines comprise all other birds. Though some non-passerines, such as shearwaters and albatrosses, spend most of their lives at sea, all of them nest on land.

“If we take into account all the other islands in the tropical Pacific, as well as seabirds and songbirds, the total extinction toll is likely to have been around 1,300 bird species,” Professor Blackburn added.

The list of lost species includes several species of moa-nalos (large flightless birds from Hawaii), and the New Caledonian Sylviornis (a relative of game birds like pheasants and grouse, which weighed in at around 65 pounds).

The study, published in Proceedings of the National Academy of Sciences, shows that certain islands and species were especially vulnerable to overhunting and habitat destruction. More species were lost from small, dry islands because they were deforested more easily and had fewer hiding places from hunters. The type of bird made a difference as well, with flightless birds being over 30 times more likely to become extinct than those that could fly.

The loss of bird species in the Pacific has not stopped. After the arrival of Europeans, at least 40 more species have disappeared and many more are facing extinction today.

Toenail Clippings To Be Tested For Hexavalent Chromium

redOrbit Staff & Wire Reports – Your Universe Online

Three decades after thousands of pounds of a cancer-causing agent accidentally leaked from a tank in a Garfield, New Jersey factory, researchers plan to measure the impact of the incident on the city’s residents in a most unusual way — by analyzing their toenail clippings.

According to Sheila M. Eldred of Discovery News, scientists from New York University (NYU) are planning to gather toenail clippings from the Bergen County city’s populace in order to test for accumulation of the toxic carcinogen.

Because toenails are a slow-growing, protein-based covering, the NYU researchers will be able to tell how much of the substance has built up over roughly the past 18 months.

“The risk of contamination comes from a 1983 leak where thousands of pounds of hexavalent chromium seeped out of a tank at a factory surrounded by houses and apartment buildings,” explained ABC News reporter Christina Ng. “Scientists say only 30 percent of the leak was cleaned up, and 10 years later chromium was found in area basements and a firehouse.”

The US Environmental Protection Agency (EPA) told Ng that the underground plume of the substance (a metal typically used for industrial production purposes) is approximately three-fourths of a mile wide and slightly less than one mile long.

NYU School of Medicine professor and research scientist Judith Zelikoff added that the region they are concerned about involves 600 homes and businesses, and that more than 3,700 people could be affected.

“Residents of the area are being given kits that include a stainless steel nail clipper (cheap nail clippers may contain traces of chromium), a plastic bag for the clippings, nail polish remover, alcohol swabs, instructions and an envelope for the clippings. The results will take about five weeks,” the ABC News reporter said.

“Samples are being collected from people ages 18 to 65 who are non-smokers and do not take chromium supplements. The people must have been residents of Garfield for at least two years,” she added.

The three-decade-old leak originated at a tank at the EC Electroplating Company, a factory which Katie Zezima of the Associated Press (AP) reports was “surrounded on all sides by houses and apartments.”

New Jersey officials started cleaning up the spill, but stopped after two years of work, the wire service reporter explained. Then, in 1993, chromium was detected at a now defunct firehouse, and then later in people´s homes.

The EPA dubbed the location a “Superfund” site — essentially declaring it one of the most toxic uncontrolled hazardous waste sites in the country — back in 2011. They also warned residents to remain out of their basements in order to prevent possible exposure to the carcinogen.

Chromium was removed from the building, which was then demolished last year, Zezima said, but EPA officials discovered tanks with holes in them. Those holes could have allowed the metal to leak into the groundwater, though officials report that the city’s drinking water supply has not yet been affected.

There are, however, concerns that “people could inhale chromium dust that has been found in basements where groundwater has leached in,” Zezima said. “High quantities of the metal have been found in 14 homes that have since been cleaned up. Trace amounts were found in 30 to 40 homes. Testing continues, and a nearby school did not show elevated chromium levels.”

Benefits To Sharing Personal Health Info Via Social Media

Alan McStravick for redOrbit.com — Your Universe Online
Though daily use of social media has been common for over a decade, it is worth recognizing that we, as a society, are still in the early days of this technology and have yet to explore the full potential it might achieve in our daily lives.
As with many previous technologies, we have adopted and adapted it for the seemingly most primitive of uses: entertainment. However, computer scientists are exploring how to broaden our use of social media to benefit not only the individual user but also society at large.
Currently, according to a Brigham Young University (BYU) study, social media use is typified by individuals posting information related to the most mundane activities. We post information and pictures about meals we’ve eaten, activities we’ve attended and funny things we’ve seen.
However, according to the BYU researchers, one area we hesitate to share with the entire World Wide Web is our health. Individuals may post a status about an upset stomach, a lingering flu or sniffles brought on by allergies, but when it comes to our experiences with over-the-counter and prescription medications, or with physicians and clinics we have researched, our social media presence is practically silent.
“Less than 15 percent of us are posting the health information that most of us are consuming,” said Rosemary Thackeray, BYU professor of health science and lead author of the study appearing online in the Journal of Medical Internet Research.
In the study, the team determined that while more than 60 percent of Internet users go online to seek a self-diagnosis for their ills, very few will broadcast, via social media, the information they learn.
According to Thackeray, if Internet users were able to take advantage of the “social” in social media, we would see a marked improvement in the available online health information.
“If you only have a few people sharing their experience with using a painkiller, that’s different than 10,000 people doing that,” Thackeray said. “If we’re really going to use this social media aspect, there needs to be a true collective wisdom of the crowds.”
Thackeray and her BYU colleagues, Ben Crookston and Josh West, derived their study data from the Pew Internet and American Life Project. From the Pew project, they determined a full three-fourths of people actually turn to basic search engines like Google and Yahoo when beginning their search for medical or health information.
The original 75 percent is whittled down to just under a third of individuals who then transition to more traditional social networking sites, like Facebook and Twitter, for their health-related activities. Additionally, 41 percent of individuals will seek out online rankings or reviews of individual doctors and health care facilities.
Though the above data shows a great majority of us consult our computers in trying to determine our afflictions, the team reports only 10 percent of those surveyed went on to post reviews of their experiences with a physician or clinic. They also claim only 15 percent of respondents to the Pew study went on to post comments, questions or information when it came to health-related info.
“The inherent value of ℠social´ in social media is not being captured with online health information seeking,” Thackeray said. “Social media is still a good source of health information, but I don't think it's ever going to replace providers or traditional health care sources.”
Where individual contributions are lacking, computer scientists have been developing algorithms meant to take advantage of the limited information presented via social networking sites with regard to public health. In fact, redOrbit's own Enid Burns wrote previously about research conducted at the University of Rochester regarding the use of the social networking platform Twitter and how it could help to pinpoint targeted flu outbreaks.
In that research, the team used GPS tagging on the social network to help map out specific areas and neighborhoods in cities where the prevalence of flu symptoms was elevated.
“If you want to know, down to the individual level, how many people are sick in a population, you would have to survey the population, which is costly and time-consuming,” said Adam Sadilek, postdoctoral researcher at Rochester. “Twitter and the technology we have developed allow us to do this passively, quickly and inexpensively; we can listen to what people are saying and mine this data to make predictions.”
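The kind of passive mining Sadilek describes can be sketched in a few lines: bin geotagged posts that mention symptom keywords into coarse grid cells, then rank the cells by count. This is only an illustrative sketch, not the Rochester team's actual algorithm; the keyword list, grid resolution and the `flu_hotspots` helper are all invented for the example.

```python
from collections import Counter

# Symptom keywords and grid resolution are arbitrary choices for this sketch.
SYMPTOMS = ("flu", "fever", "cough", "sore throat")

def flu_hotspots(posts, cell_deg=0.01):
    """Count symptom-mentioning posts per lat/lon grid cell (~1 km squares)."""
    counts = Counter()
    for lat, lon, text in posts:
        if any(word in text.lower() for word in SYMPTOMS):
            cell = (round(lat / cell_deg), round(lon / cell_deg))
            counts[cell] += 1
    return counts.most_common()  # cells ranked by number of symptom mentions

posts = [
    (43.161, -77.610, "Stuck home with the flu again"),
    (43.161, -77.611, "This cough will not quit"),
    (43.130, -77.650, "Great coffee downtown"),
]
print(flu_hotspots(posts))  # the first two posts fall in the same cell
```

A real system would also need to separate reports of being sick from mere mentions of illness, which is where the machine-learning side of the Rochester work comes in.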
The team at Rochester sees wide-ranging future applications for this targeted tracking of unhealthy regions, such as helping individuals plan travel routes within a sickened city. Sadilek contends the MIT-developed algorithm used in the study could, for example, help someone decide to avoid a subway station that had been determined to be populated with potentially sick individuals.
Of course, the collection of both passive and specific online habits poses ethical questions that are still in the process of being addressed and answered by experts in the field. As initially noted above, despite the seeming ubiquity of the Internet in our day-to-day lives, it is important to bear in mind that this medium is still in its infancy. Rules and guidelines, and the ethics involved with each, are still being determined.
The BYU researchers believe social media will transition from basic use, such as entertainment and general information, to a more valuable presence once there is an increase in individual self-reporting in health discussions. They claim patients, themselves, will achieve a sense of empowerment, with regard to their health decision-making, as a result. Additionally, physicians would achieve a greater awareness of the public discourse around certain medical issues.
One aspect the BYU study did not delve into was just why we tend to avoid specifics with regard to our health. The idea that we will share the most intimate (and inane) moments of our lives, but stop short when the topic turns to our health, may be a result of the prevailing cultural mores of our society.
While this study maintained a US-centric view, the question of how social media usage is shaped by the surrounding culture has been explored previously, with a particular focus on Japan.
As noted in a 2008 article, Japanese citizens who engaged in social media activity very rarely associated online with people they had not actually met in person. Additionally, one would be hard pressed to find any information on a profile page that would identify the user.
Japanese culture may simply be a more exaggerated version of US culture as it pertains to sharing health information. The BYU team says the challenge now is getting individuals to contribute health information on the varied social media platforms.
“We´re just not there yet, but we´ll probably get there in the future,” Thackeray said.

Defibrillator Danger: FDA Recommends New Safety, Effectiveness Regulations

Brett Smith for redOrbit.com – Your Universe Online

Although the devices have been credited with saving thousands of lives, the US Food and Drug Administration (FDA) has recommended new regulations for automated external defibrillators (AEDs), including a requirement that the devices meet federal approval before hitting the market.

“[The] FDA is not questioning the clinical utility of AEDs,” said FDA spokesman Dr. William Maisel in a statement on Friday. “These devices are critically important and serve a very important public health need.”

“The importance of early defibrillation for patients who are suffering from cardiac arrest is well-established,” he added.

AEDs can be found in workplaces, schools and restaurants across the country. They are designed to electrically jump-start the heart after it has gone into arrest and are supposed to be used only by properly trained individuals.

FDA officials said their new proposal would require manufacturers to submit evidence showing their devices are safe and effective in order to be approved for, or to remain on, the market. The rules would also require manufacturers to provide inspection reports and submit details to the FDA of any changes made to a device.

“Today’s action does not require the removal or replacement of AEDs that are in distribution. Patients and the public should have confidence in these devices, and we encourage people to use them under the appropriate circumstances,” Maisel said.

He noted that it would be difficult to determine how many deaths defective devices have caused as the patients may have died due to cardiac arrest anyway. However, Maisel said there were enough problems with devices to warrant action by the federal government.

“Tens of thousands of adverse events is too many. We think 88 recalls are too many,” Maisel said. “So, by calling for pre-market approval we can focus our attention on the types of problems that have been observed and our expectation is that we will observe an improvement in the reliability over time with these devices.”

Stories of AED malfunctions range from software problems to power failures to component failures.

The FDA said the new proposals are based on recommendations by its Circulatory System Devices Panel, which said AEDs should be a Class III medical device, requiring pre-market approval.

Many of the devices' makers stand by their AEDs and say they have been expecting this announcement for some time. Device maker Physio-Control's chief executive, Brian Webster, said the increasing number of device failures could be due to their increasing prevalence.

“While it is the FDA’s position that defibrillator failures have risen over the past several years, the number of devices in the market has grown dramatically over that same time period,” he told Reuters. “An evaluation of the number of defibrillator malfunctions must include the number of devices deployed to accurately assess whether the failure rate is increasing or actually decreasing.”

Rachel Bloom-Baglin of Philips also told Reuters that while the company has not yet seen the official written proposal, they have been in discussions with the FDA regarding their AEDs. She said the company believes more federal oversight would not result in an interruption of device or parts supply.

Outdoor Education Helps Minority Students Close Gap In Environmental Literacy

Environmental education programs that took middle school students outdoors to learn helped minority students close a gap in environmental literacy, according to research from North Carolina State University.
The study, published March 22 in PLOS ONE, showed that time outdoors seemed to impact African-American and Hispanic students more than Caucasian students, improving minority students’ ecological knowledge and cognitive skills, two measures of environmental literacy. The statewide study also measured environmental attitudes and pro-environmental behavior such as recycling and conserving water.
“We are interested in whether outdoor experiences can be part of a catch-up strategy that can help in narrowing the environmental literacy gap for minority students,” said lead author Kathryn Stevenson, an NC State graduate student who has taught outdoor education in California and high school biology and science in North Carolina.
Researchers tested the environmental literacy of sixth- and eighth-grade students in 18 North Carolina schools in the fall and spring. Half of the schools studied had registered an environmental education program with the state.
Using a published environmental curriculum, such as Project Learning Tree, Project WET or Project WILD, helped build students’ cognitive skills, researchers found. Learning in an outdoor environment improved students’ ecological knowledge, environmental attitudes and behavior.
“This is one of the first studies on a broad scale to focus on environmental literacy, which is more than mastering facts,” said co-author Nils Peterson, associate professor of fisheries and wildlife in NC State’s College of Natural Resources. “Being environmentally literate means that students learn cognitive skills so that they can analyze and solve problems, and it involves environmental attitudes and behaviors as well.”
Girls and boys appeared to have complementary strengths that contributed to environmental literacy. Boys scored highest on knowledge, while girls led in environmental attitudes and cognitive skills.
Sixth graders showed greater gains in environmental literacy than eighth graders, suggesting that early middle school is the best window for environmental literacy efforts, Stevenson said.
Teachers’ level of education played an important role in building environmental literacy. Those with a master’s degree had students with higher levels of overall environmental literacy.
Teachers who had spent three to five years in the classroom were more effective at building students’ cognitive skills than new teachers. Efforts are needed to engage veteran teachers in environmental education, Stevenson said.
In a follow-up to the study, Stevenson is studying coastal North Carolina students’ perceptions of climate change.

Stem Cells Heal Damaged Intestinal Tissue In Premature Babies

Lawrence LeBlond for redOrbit.com – Your Universe Online

Researchers studying stem cells taken from amniotic fluid have found that the cells may play a role in healing damage caused by necrotizing enterocolitis (NEC), a severe inflammation that can destroy tissues in the gut and lead to major organ failure.

The findings, published in the journal Gut, are based on early animal tests that reveal healing and an increase in survival. The researchers say the evidence could lead to a new form of cell therapy for premature babies, but cautioned that more research is needed first.

The study was funded by Great Ormond Street Hospital (GOSH) Children's Charity and led by University College London's (UCL) Institute of Child Health (ICH). The researchers investigated how the stem cells work in relation to NEC, which is the most common gastrointestinal surgical emergency in newborns, with mortality rates of around 15 to 30 percent in the UK.

While breast milk and probiotics are known to offer some level of protection against NEC, there are currently no medical treatments available other than emergency surgery. Surgical removal, however, shortens the bowel and can lead to intestinal failure, with some babies needing ongoing intravenous nutrition or intestinal transplant.

Babies born prematurely often have guts that are ill-prepared to handle food, and about one in 10 preemies in neonatal intensive care go on to develop NEC. The inflammation can cause tissue death and lead to holes in the intestine which can lead to even more serious infections.

“It is quite a problem and we think it is on the increase,” said Dr. Simon Eaton, from UCL's Institute of Child Health.

Dr. Eaton, who was part of the research team investigating the role of amniotic stem cells in laboratory rats programmed to develop fatal NEC, said the injections of the stem cells appeared to increase the survival times of the rats.

“We’re able to prolong survival by quite a long way,” he told the BBC. “What appears to be happening is a direct effect on calming inflammation and also stimulating resident stem cells in the gut to be more efficient at repairing the intestines.”

The researchers harvested amniotic fluid stem (AFS) cells from rodent amniotic fluid and injected them into rats with NEC. Other rats with the same condition were given bone marrow stem cells taken from their femurs, or were kept on normal nutrition with no treatment, to compare clinical outcomes.

The rats injected with AFS cells showed significantly higher survival rates a week after receiving the treatment, compared to the other two groups. Upon further inspection of their intestines, the researchers found inflammation was significantly reduced, with fewer dead cells, greater self-renewal of gut tissue and better overall intestinal function.

The researchers noted that bone marrow stem cells have been shown in the past to reverse colonic damage in irritable bowel syndrome, but the beneficial effects from bone marrow did not spill over into NEC therapy, indicating different mechanisms were at play for NEC. Following the injections of AFS cells into the gut, the cells moved into the intestinal villi — small, finger-like projections that protrude from the lining of the intestinal wall which pass nutrients from the intestine to the blood.

The researchers found that instead of directly repairing the damaged tissue, the AFS cells appeared to release specific growth factors that acted on progenitor cells in the gut which in turn, reduced inflammation and triggered the formation of new villi.

“Stem cells are well known to have anti-inflammatory effects, but this is the first time we have shown that amniotic fluid stem cells can repair damage in the intestines,” said lead author Dr. Paolo De Coppi, of UCL's Institute of Child Health.

“In the future, we hope that stem cells found in amniotic fluid will be used more widely in therapies and in research, particularly for the treatment of congenital malformations. Although amniotic fluid stem cells have a more limited capacity to develop into different cell types than those from the embryo, they nevertheless show promise for many parts of the body including the liver, muscle and nervous system,” he added.

“Once we have a better understanding of the mechanisms by which AFS cells trigger repair and restore function in the gut, we can start to explore new cellular or pharmacological therapies for infants with necrotizing enterocolitis,” noted Dr. Eaton.

Before the research can be tested in human premature babies, the researchers said far more testing is needed to ensure it would be a safe treatment option. Stem cells would have to be taken from a donor as it would not be practical to store fluid from every birth. And with donor cells, there runs the risk of rejection.

There is also a risk that the stem cells could turn into other cell types, raising the possibility of cancer.

The researchers hope that in the future, if the treatment is proven viable, a drug can be harnessed from the AFS cells.

“It’s not the cells, they’re delivering something and if we knew what that was then we could deliver that directly,” Dr. Eaton told the BBC's James Gallagher.

Many Moms Introduce Solid Food To Their Babies Much Too Early

Lawrence LeBlond for redOrbit.com – Your Universe Online

A recent study based on animal models has shown that babies started too early on foods high in carbohydrates will likely face a lifelong struggle with excess weight gain and obesity. New research, to be published in the April issue of Pediatrics and released online today, has found that 40 percent of mothers start feeding their babies solid foods much too early, with many claiming they were given the go-ahead by their healthcare providers.

The researchers found that mothers who used baby formula (54 percent) were more than twice as likely as those who exclusively breastfed (28 percent) to start solid foods before their babies turned 4 months old. While many moms claimed their providers said it was fine to feed their babies earlier, other popular reasons for the switch to solids were that moms believed their babies were ready or that it would help them sleep better and longer at night.

Study coauthor Kelley Scanlon, an epidemiologist with the US Centers for Disease Control and Prevention (CDC), said an early switch to solids is associated with numerous health problems, so it is important to understand these mothers' motivations.

These findings “don’t offer a full understanding why, but they give us some insight,” she said.

The American Academy of Pediatrics (AAP) states that the head and neck control and other coordination skills infants need to safely eat solid foods do not develop until around 4 months of age. The group also notes that introducing solids too soon ends the exclusive benefits of breastfeeding, which it recommends be continued until 6 months of age, as it offers many health benefits, including reduced risk of respiratory and ear infections, diarrhea and sudden infant death syndrome (SIDS).

The study team noted that the introduction of solids too early may also increase the risk of chronic disease, such as diabetes, obesity and celiac disease.

Some healthcare providers recommend that a small amount of cereal be added to formula to help babies with reflux, said Lana Gagin, a pediatrician at the Helen DeVos Children's Hospital in Grand Rapids, Michigan, who was not involved in the study. She warned, however, that “there is no good, solid evidence that it helps a baby sleep.”

For the study, researchers analyzed a monthly record of information collected from 1,334 mothers on when and why they introduced solid food to their babies.

“We didn’t expect to see so many (give solids) before 4 months,” said Scanlon. While similar studies in the past asked mothers to recall when and why they offered solids two to three years down the road, the new study asked mothers to recall what was fed within the past week.

Among the findings in the study, the researchers discovered that mothers who introduced solids before 4 months were more likely to be younger, single, less educated or enrolled in the federal Women, Infants and Children (WIC) program.

They also found that 8 percent of mothers introduced solid food as early as 1 month or earlier.

Furthermore, a whopping 89 percent of mothers said they introduced solids early because they believed the baby was old enough to begin eating them; 71 percent said the baby seemed hungrier than usual; 67 percent said the baby showed interest in solid food; 56 percent said the healthcare provider recommended solids; and 8 percent said the baby had a medical condition that may have been helped by eating solids.

As for getting advice from a medical provider to start solids, the researchers said: “We don’t know actually what advice the health care provider gave. But at least this was the perception the parents got – that this was the time to begin solids.”

These findings highlight how important it is for pediatricians and other medical providers to give clear, accurate and supportive advice to parents, said Gagin.

“We sometimes wait until (parents) come in for the 4-month well visit to discuss complementary foods, when introducing the subject during the 2-month check might be better,” Gagin told USA Today's Michelle Healy. “We may not spend enough time explaining why they should wait and explaining that every time a baby cries doesn’t mean they’re hungry.”

Men And Women Have Gender-specific Medical Needs

Alan McStravick for redOrbit.com – Your Universe Online

Whether perusing the shelves of your local bookstore, watching commercials during your favorite television program, or just opening your eyes, the evidence that men and women are distinctly different and unique animals is seemingly everywhere. Researchers in the field of epidemiology have understood this truth but couldn’t necessarily provide proof of it, until now.

In fact, for the past 40 years the male of the species has been the benchmark by which decisions and understandings about illness, disease symptoms and general medical research were formulated. However, a new article entitled “Gender medicine: a task for the third millennium” presents research on gender-related differences across a whole host of previously uncharted areas of medicine. The research was conducted by Giovannella Baggio of Padua University Hospital.

Baggio's article, published in the journal Clinical Chemistry and Laboratory Medicine, presents evidence for considerable differences between the sexes in five domains. These are cardiovascular disease, cancer, liver diseases, osteoporosis and pharmacology.

Because cardiovascular disease has long been perceived as primarily a male disease, symptoms in women have often been overlooked. For instance, a heart attack in men typically presents as tightness in the chest accompanied by pain radiating down the left arm, whereas symptoms in women include nausea and lower abdominal pain. When women experience a heart attack, the consequences are often more dire. Yet mentioning these non-specific symptoms frequently leads to being practically dismissed by a healthcare provider rather than to a battery of examinations that could identify the potentially fatal condition.

The team also detailed the differences between men and women with regard to cancer and liver disease. Not only are onsets typically different, but locations within the body and options for treatment should vary between the sexes.

Previous research on osteoporosis has treated the condition as one that typically affects only women, so treatments for the disease are usually female-centric. However, men can also develop osteoporosis; they are often overlooked and therefore experience increased mortality from bone fractures.

Additionally, Baggio and her team provided evidence showing the variation between men and women in the pharmacology of aspirin and other substances. According to the team, differences in action and side effects are attributable to different body types, varying reaction times in the absorption and elimination of substances, and a fundamentally different hormonal status. The team contends gender must be taken into account when considering the effective and safe administration, dosage, and duration of treatment with certain medications.

The study concludes that additional and more far-reaching clinical investigations of gender differences are needed in order to eliminate fundamental inequalities between men and women in the treatment of disease.

Clarithromycin Use Linked To Increased Risk Of Sudden Cardiac Death In Lung Patients

Jason Pierce, MSN, MBA, RN for redOrbit.com — Your Universe Online

A study published in the British Medical Journal is the first of its kind to link the use of the antibiotic clarithromycin for treatment of chronic obstructive pulmonary disease (COPD) and community acquired pneumonia with an increased risk of death related to heart problems long after the course of treatment. Previous studies have suggested the risk of heart related mortality may increase during the use of the drug, but until now the long term consequences were less clear.

Clarithromycin, also known by the brand name Biaxin, is an antibiotic in the macrolide class that is commonly used to treat flare-ups of COPD and community acquired pneumonia. Macrolide drugs work by preventing bacterial cells from growing or multiplying.

A temporary change in heart rhythm known as prolonged QT interval is known to be a possible adverse effect of macrolide use. This prolonged QT interval has been associated with an increased risk of sudden cardiac death in patients taking another commonly used macrolide antibiotic called azithromycin, or Zithromax. In fact, the FDA recently revised the required label for azithromycin to include a stronger warning related to the risk of developing deadly heart rhythms.

The current study, by a team of researchers at the University of Dundee, found the increased risk of heart-related illness and death extends beyond the period of time the patient is on the drug. The authors report the increased risk persists over the year following the medication's use. Since the risk of prolonged QT interval is elevated only while the patient is taking the medication, they conclude there must be another reason for these heart-related problems.

The researchers suggest clarithromycin use may cause a weakening of plaques already formed on the walls of arteries. These weakened plaques can break away from the vessel wall, leading to blockage of the vessel. When such blockages occur in the arteries supplying blood to the heart muscle, a heart attack can result. This scenario could explain why there seems to be an increased risk of heart-related problems and death after the medication has been stopped.

The research involved data collected from 1,343 patients admitted to a hospital for acute attacks of COPD and 1,631 patients admitted with community acquired pneumonia. The researchers categorized the patients as either macrolide users or non-macrolide users. Patients who received at least one dose of clarithromycin during their hospital stay were considered macrolide users.

In the year following the hospital stay, 268 of the COPD patients and 171 of the pneumonia patients were admitted to a hospital or died as the result of a heart-related illness. Analysis of the data suggests there will be an additional heart-related event for every eight patients given clarithromycin compared to patients not given the drug. For patients with pneumonia, there will be an additional event for every 11 patients given the drug. The authors add, “Our findings require validation in independent datasets, especially from primary care settings and through randomized controlled trials of macrolides with long term follow-up.”
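The "one additional event for every eight patients" figure is a number needed to harm (NNH): the reciprocal of the absolute risk increase between exposed and unexposed groups. A minimal sketch follows, using hypothetical event rates chosen only so the arithmetic lands on an NNH of 8; the paper's actual adjusted rates are not given in this article.

```python
def number_needed_to_harm(risk_exposed, risk_unexposed):
    """NNH = 1 / absolute risk increase (ARI)."""
    ari = risk_exposed - risk_unexposed
    if ari <= 0:
        raise ValueError("exposed group shows no excess risk")
    return 1.0 / ari

# Hypothetical rates: 25.0% of macrolide users vs 12.5% of non-users
# have a cardiac event within a year -> ARI = 0.125 -> NNH = 8.
print(number_needed_to_harm(0.250, 0.125))  # 8.0
```

An NNH computed this way summarizes excess risk per patient treated, which is why the authors can express the finding without quoting raw event counts for each group.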

New Technique Helps Robots Walk Through Sand

Lee Rannals for redOrbit.com — Your Universe Online

“Terradynamics” could be the next big thing in robotics, helping robots move on granular and other complex surfaces like beaches.

Studying the field of terradynamics involves researching and understanding how small legged robots move on and interact with complex materials like sandy beaches. The team that coined the name terradynamics wrote about its achievements in the field in the journal Science.

“We now have the tools to understand the movement of legged vehicles over loose sand in the same way that scientists and engineers have had tools to understand aerodynamics and hydrodynamics,” said Daniel Goldman, a professor in the School of Physics at the Georgia Institute of Technology. “We are at the beginning of tools that will allow us to do the design and simulation of legged robots to not only predict their performance, but also to optimize designs and allow us to create new concepts.”


Designers could use terradynamics to better develop robots like those designed to go on search-and-rescue missions. Currently, robots in this field rely on wheels for locomotion, but Goldman says new techniques need to be developed using terradynamics.

The researchers examined the motion of a small-legged robot as it moved on granular surfaces. They created a variety of shapes for legs using a 3D printer and studied how different configurations affected the robot’s speed along a track bed. They also measured granular force in experiments to predict forces on the legs.

Goldman said the key finding in their examinations was that the forces applied to independent elements of the robot legs could be summed together to provide a reasonably accurate measurement of the net force on a robot moving through granular media. This technique worked well for legs moving in diverse kinds of granular surfaces.
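Goldman's superposition finding, that the net force can be obtained by adding up the forces on independent elements of the leg, can be sketched as follows. The linear-in-depth stress model and its constants are placeholder assumptions invented for this example; the actual terradynamics approach uses empirically measured stresses as functions of an element's depth and orientation.

```python
def net_force(elements, stress_fn):
    """Sum independent per-element forces: the superposition assumption.

    elements: list of (depth_m, area_m2, angle_rad) leg segments.
    stress_fn: maps (depth, angle) -> (horizontal, vertical) stress in Pa.
    """
    fx = fz = 0.0
    for depth, area, angle in elements:
        sx, sz = stress_fn(depth, angle)
        fx += sx * area   # drag contribution of this element
        fz += sz * area   # lift contribution of this element
    return fx, fz

# Placeholder stress model: resistance grows linearly with depth
# (angle is ignored in this toy model; real stress data depends on it).
def linear_stress(depth, angle, kx=1.0e3, kz=5.0e3):
    return kx * depth, kz * depth

leg = [(0.01, 0.001, 0.0), (0.02, 0.001, 0.0)]  # two segments of one leg
print(net_force(leg, linear_stress))
```

Because each element is treated independently, swapping in a measured `stress_fn` for a different granular medium changes the prediction without touching the summation itself, which is what makes the approach practical for leg-shape optimization.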

“We discovered that the force laws affecting this motion are generic in a diversity of granular media, including poppy seeds, glass beads and natural sand,” said study first author Chen Li, who is now a Miller postdoctoral fellow at the University of California at Berkeley. “Based on this generalization, we developed a practical procedure for non-specialists to easily apply terradynamics in their own studies using just a single force measurement made with simple equipment they can buy off the shelf, such as a penetrometer.”

The researchers also learned convex legs made in the shape of the letter “C” worked better than other variations.

“As long as the legs are convex, the robot generates large lift and small body drag, and thus can run fast,” Goldman said. “When the limb shape was changed to flat or concave, the performance dropped. This information is important for optimizing the energy efficiency of legged robots.”

Terradynamics worked well for more complicated granular materials, but future studies might need to look into the degree to which particles resemble a sphere. After more research, their technique could eventually provide designers with a better understanding of motion through media that flows around legs of terrestrial animals and robots.

“Using terradynamics, our simulation is not only as accurate as the established discrete element method (DEM) simulation, but also much more computationally efficient,” said Tingnan Zhang, who is a graduate student in Goldman’s laboratory. “For example, to simulate one second of robot locomotion on a granular bed of five million poppy seeds takes the DEM simulation a month using computers in our lab. Using terradynamics, the simulation takes only 10 seconds.”

Goldman said terradynamics opens up a new era, providing tools that help understand why lizards have feet and bodies of certain shapes.

“We think that the kind of approach we are taking allows us to ask questions about the physics of granular materials that no one has asked before,” Goldman added. “This may reveal new features of granular materials to help us create more comprehensive models and theories of motion. We are now beginning to get the rules of how vehicles move through these materials.”

In 2009, another group of researchers looked into finding out why robots get stuck in the sand. The team wrote in the journal Proceedings of the National Academy of Sciences about how they discovered that when a robot rotates its legs too fast or the sand is packed loosely enough, it transitions from rapid walking motion to a slower swimming motion.

Not only do these research projects pave the path for future “Baywatch” robots, but they also could help in further advancing our exploration of other planets.

Robotic Therapist Helps Train Kids With Autism Disorder

Lee Rannals for redOrbit.com – Your Universe Online

A two-foot-tall humanoid robot is acting as a therapist to help train children diagnosed with autism spectrum disorder (ASD).

The humanoid robot NAO (pronounced “now”) features an elaborate system of cameras, sensors and computers designed to help children learn how to coordinate their attention with other people and objects. Known to researchers as “joint attention,” this is a social skill children with autism typically have a difficult time mastering, which is where NAO comes in.

Researchers wrote in the March issue of the publication IEEE Transactions on Neural Systems and Rehabilitation Engineering that children with ASD paid more attention to the robot and followed its instructions almost as well as they did those of a human therapist in standard exercises.

The authors suggest that robots could play a crucial role in responding to the growing number of children being diagnosed with ASD. According to recent statistics, there has been a 78 percent increase in ASD diagnoses in the past four years.

“This is the first real world test of whether intelligent adaptive systems can make an impact on autism,” said team member Zachary Warren, who directs the Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) at Vanderbilt’s Kennedy Center.

NAO is a commercial humanoid robot made in France, so the team essentially had to build the robot’s brains themselves. To get NAO to work with children, the researchers developed a sophisticated adaptive structure around the robot that they dubbed the Adaptive Robot-Mediated Intervention Architecture (ARIA). They say their system has the greatest potential for working with young children.

“Research has shown that early intervention, individualized to the learner’s needs, is currently the most effective approach for helping children with autism develop the foundational social communication skills they need to become productive adults,” said Julie Crittendon, assistant professor of pediatrics at the Vanderbilt University Medical Center.

The team created an “intelligent environment” around NAO, where the robot stands on a table at the front of the room and the child sits in a chair at eye level with it. The room is equipped with a number of web cameras aimed at the chair to track the child’s head movements, which allows the system to determine whether he or she is paying attention.

They programmed NAO to have a series of verbal prompts like “look over here” and “let’s do some more.” The robot also has gestures like looking and pointing at one of the displays in the room.

The training sessions begin with a verbal prompt that asks the child to look at an image or video displayed on one of the screens. If the child doesn’t respond, the therapists provide increasing support by combining a verbal prompt with physical gestures. Once the child looks at the target, the therapist responds with “good job.”
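The escalating-prompt loop described above can be sketched as a short routine. This is an illustrative sketch only, not the actual Vanderbilt software: the prompt wordings, levels, and the gaze-tracker stand-in below are all hypothetical.

```python
# Hypothetical sketch of an escalating-prompt loop in the spirit of ARIA.
# The prompt list and the gaze-tracker stand-in are invented for illustration.

PROMPTS = [
    "verbal: 'look over here'",              # least support
    "verbal prompt plus a head turn",        # more support
    "verbal prompt, head turn and pointing", # most support
]

def run_trial(child_attends):
    """Step through prompts until the child attends to the target.

    `child_attends(level)` stands in for the camera-based gaze tracker;
    it returns True once the child looks at the target display.
    """
    for level, prompt in enumerate(PROMPTS):
        # The robot issues the prompt, then the cameras check the child's gaze.
        if child_attends(level):
            return f"success at level {level}: 'good job!'"
    return "no response; hand off to the human therapist"
```

For example, `run_trial(lambda level: level >= 1)` models a child who responds once a gesture is added to the verbal prompt.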

The researchers found that children who trained with the robot actually looked at NAO more than children in the comparison group looked at the human therapist.

“The children’s engagement with the robot was excellent,” Crittendon said, “and we saw improvements across the board in both groups.”

NAO is also able to adapt its behavior to each child automatically depending on how he or she is responding to the prompts — an effective tool since all children with ASD tend to behave a little bit differently.

“There is a saying in the field, ‘If you’ve seen one child with ASD, you’ve seen one child with ASD.’ So one size does not fit all. To be useful, the system must be adaptive,” Warren said.

He said he hopes the robotic system can act as an “accelerant technology” that increases the rate at which children with ASD learn the social skills they need.

NAO is not a one-trick pony when it comes to education: the robot was named best robot for education in Carnegie Mellon’s “Robot Hall of Fame” competition last year.

“NAO has provided us with an exciting tool to teach students robotics, to introduce robotics related concepts and to show how robots can be applied in the real world,” said Timothy Gifford, researcher at UConn, and CEO of Movia Robotics.

The robot also showed off its dance skills when redOrbit attended the 2012 ICRA.

Chicago Schools Find Double Dose Of Algebra Boosts Long-Term Success

Brett Smith for redOrbit.com – Your Universe Online

Math fans rejoice! A new study in Education Next has shown that doubling down on algebra classes leads to long-term positive effects on college entrance exam scores, high school graduation rates and college enrollment rates.

The study focused on a pilot program in Chicago public schools (CPS) that was initiated in 2003. In an attempt to boost sagging ninth-grade algebra scores, then-school math official Martin Gartzman began a program of “double-dose” algebra classes to give struggling students extra time.

“Double-dosing had an immediate impact on student performance in algebra, increasing the proportion of students earning at least a B by 9.4 percentage points, or more than 65 percent,” said the report. “The mean GPA across all math courses taken after freshman year increased by 0.14 grade points on a 4.0 scale.”
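The two figures in the quote describe the same gain in different units: a percentage-point increase is absolute, while the "more than 65 percent" figure is relative. A quick back-of-envelope check shows how they fit together; the baseline pass rate below is inferred from the report's two numbers, not stated in the report itself.

```python
# Reconciling "9.4 percentage points" with "more than 65 percent".
# The implied baseline is an inference from the two published figures.

gain_pp = 9.4        # absolute increase, in percentage points
relative_gain = 0.65 # "more than 65 percent" relative increase

implied_baseline = gain_pp / relative_gain  # share earning a B beforehand
new_rate = implied_baseline + gain_pp       # share earning a B afterward
```

Under these numbers, roughly 14.5 percent of students earned at least a B before the intervention, rising to about 23.9 percent afterward.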

To investigate the impact of the program, a trio of American researchers analyzed data that tracked students from 8th grade all the way to enrollment in college. They then divided the students into two subsets: students performing just above and just below the double-dose threshold.

The researchers found that in addition to raising student performance, the program had a connection to raised graduation rates for certain students.

“Although the intervention was not particularly effective for the average affected student, the fact that it improved high school graduation and college enrollment rates for even a subset of low-performing and at-risk students is extraordinarily promising when targeted at the appropriate students,” the authors wrote.

While Gartzman’s CPS team could only follow the students for a year after the program began, the researchers saw improvements appear five years after the double dose of algebra was implemented.

Gartzman said he first saw the Education Next article while sitting in his dentist’s waiting room, and that it “was mind-blowing for me. I had no idea that the researchers were continuing to study these kids.”

Gartzman later found out that Takako Nomi, an affiliated researcher at the University of Chicago’s Consortium on Chicago School Research (CCSR), and Elaine Allensworth, interim director of CCSR, had been tracking his students over the years.

“Nomi and Allensworth did some really sophisticated modeling that only researchers could do, that school districts really can’t do,” Gartzman said. “It validates school districts all over the country who had been investing in double-period strategies.”

“These are really hard problems. A great 21st-century university ought to try to solve the hardest problems facing our society,” he added. “In the world of K-12 education, this is one of the hardest problems.”

In their report, the researchers added that the double-dose strategy is becoming a popular and effective way to aid students struggling in mathematics. According to the report, almost half of “large urban districts” in the US use double math classes to remediate their struggling students.

With many urban schools subscribing to the theory that algebra is a “gateway for later academic success,” many districts are using “effective and proactive intervention” for those who lack foundational mathematical skills, the report said.

Researchers Develop Device To Detect Secondhand Smoke

redOrbit Staff & Wire Reports – Your Universe Online

Researchers at Dartmouth College have developed a device capable of detecting secondhand smoke — a gizmo that could be a powerful new tool for those attempting to avoid exposure to tobacco.

The device, which is currently in the patent-pending prototype stage, weighs less than a cellphone and is approximately the same size as a Matchbox car, the researchers said. It uses polymer films to collect and measure the amount of nicotine in the air, then utilizes a sensor chip to record the data on a standard SD memory card.

“We have developed the first ever tobacco smoke sensor that is sufficiently sensitive to measure secondhand smoke and record its presence in real time,” Dartmouth chemistry professor Joseph BelBruno, head of the lab where the device was developed, said in a statement.

“This is a leap forward in secondhand smoke exposure detection technology and can be considered the first step in reducing the risk of health effects,” he added. The unit, which is also said to be capable of detecting third-hand smoke residue on clothing and other materials, is described in a paper recently published in the journal Nicotine and Tobacco Research.

The unit could have multiple uses, according to the researchers. Parents who attempt to shield their children from cigarette smoke by going to a different room or outdoors to smoke now have a way to measure the effectiveness of those efforts. In addition, the detectors could be installed in rental cars, hotel rooms and restaurants to help enforce smoking bans — working much like conventional smoke detectors.

“The intent of the project isn’t to make [parents] stop smoking, but it is to make them stop exposing their children to smoke. On the other hand, if they are worried about their children, demonstrating these exposures may be an incentive for them to stop,” BelBruno said.

He added that he ultimately plans to release a lower-cost consumer version that will feature a computer processor, reusable polymer films, a rechargeable battery, and perhaps even an LED panel to provide immediate results. The research was supported by the American Academy of Pediatrics (AAP) Julius B. Richmond Center of Excellence, and clinical trials are scheduled to get underway this summer.

According to the American Cancer Society (ACS), “secondhand smoke is classified as a ‘known human carcinogen’ (cancer-causing agent) by the US Environmental Protection Agency (EPA), the US National Toxicology Program (NTP), and the International Agency for Research on Cancer (IARC)… Tobacco smoke contains more than 7,000 chemical compounds. More than 250 of these chemicals are known to be harmful, and at least 69 are known to cause cancer.”

Children’s ‘Stomach Bug’ Now Primarily Caused By Norovirus, Says CDC

redOrbit Staff & Wire Reports – Your Universe Online

A new study has shown that norovirus is now the primary cause of acute gastroenteritis, or the ‘stomach bug,’ among children under five years of age who seek medical treatment for the condition, the Centers for Disease Control and Prevention (CDC) reported on Thursday.

Gastroenteritis involves the inflammation of the stomach and small intestine, and is typically caused by one of several different types of viruses. Also commonly known as the ‘stomach virus’ or ‘stomach flu,’ gastroenteritis often results in a combination of vomiting, diarrhea and abdominal cramping.

The authors of the paper, which appears in the latest edition of The New England Journal of Medicine, tracked over 141,000 infants and young children suffering from the illness from October 2008 through September 2010. The subjects lived in three different US counties, and their specimens underwent laboratory testing to determine whether norovirus or another type of virus was present.

According to the CDC, norovirus was found in 21 percent (278) of the 1,295 confirmed cases of acute gastroenteritis. Rotavirus, normally the most common cause of the stomach bug in children, was found in only 12 percent (152) of the cases. Nearly half of the medical care visits linked to norovirus infections were in children between the ages of six and 18 months, and infants under one year old were more likely to be hospitalized than older children.
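The percentages reported above follow directly from the raw case counts, which can be checked in two lines:

```python
# Recovering the reported percentages from the raw counts in the CDC study.
total_cases = 1295   # confirmed acute gastroenteritis cases
norovirus = 278      # cases testing positive for norovirus
rotavirus = 152      # cases testing positive for rotavirus

norovirus_pct = round(norovirus / total_cases * 100)  # 21 percent
rotavirus_pct = round(rotavirus / total_cases * 100)  # 12 percent
```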

Furthermore, the health organization reported that norovirus was responsible for nearly one million pediatric medical visits in the US in 2009 and 2010, and that the overall rates of norovirus-related visits to emergency rooms and outpatient offices were at least 20 and perhaps up to 40 times higher than hospitalization rates.

“Infants and young children are very susceptible to norovirus infections, which often result in a high risk of getting dehydrated from the sudden onset of intense vomiting and severe diarrhea,” explained Dr. Daniel Payne, an epidemiologist in the CDC’s Division of Viral Diseases. “Our study estimates that 1 in 278 US children will be hospitalized for norovirus illness by the time they turn 5 years of age.”

“It is also estimated that about 1 in 14 children will visit an emergency room and 1 in 6 will receive outpatient care for norovirus infections,” he added. “Our study confirmed that medical visits for rotavirus illness have decreased. Also, our study reinforces the success of the US rotavirus vaccination program and emphasizes the value of specific interventions to protect against norovirus illness.”

The researchers estimate that in 2009 and 2010, there were approximately 14,000 hospitalizations, 281,000 emergency room visits, and 627,000 outpatient visits due to norovirus illness in children under the age of five — resulting in an estimated annual treatment cost of more than $270 million. The CDC reports that norovirus vaccines are currently in development for the patients most at risk, including young children.

“Each year, more than 21 million people in the United States get infected with norovirus and develop acute gastroenteritis, and approximately 800 people die,” the CDC said. “Young children and elderly people are more likely to suffer from severe norovirus infections.”

“The virus spreads primarily through close contact with infected people, such as caring for someone who is ill. It also spreads through contaminated food, water and hard surfaces,” they added. “The best ways to reduce the risk of norovirus infection are through proper hand washing, safe food handling, and good hygiene.”

If Your Child Is A Picky Eater, Blame The Genes

Lee Rannals for redOrbit.com — Your Universe Online

Do you have to force-feed your child carrots and broccoli to ensure their diet isn’t based solely on plain-and-dry McDonald’s cheeseburgers? Well, a new study suggests bad parenting may not be why your child is a picky eater; rather, it may be their genes.

Researchers at the University of North Carolina at Chapel Hill say fearing new foods, or food neophobia, is similar to a child’s temperament or personality.

“Some children are more genetically susceptible than others to avoid new foods,” said Myles Faith, an associate professor of nutrition at UNC’s Gillings School of Global Public Health. “However, that doesn’t mean that they can’t change their behaviors and become a little less picky.”

Researchers looked at 66 pairs of twins between the ages of four and seven, and found that genes explained 72 percent of the variation among children in the tendency to avoid new foods. They examined the relationship between food neophobia and body fat measurements in both parent and child. They found that if the parent was heavier, the child was heavier only if he or she avoided trying new foods.

“It’s unexpected, but the finding certainly invites interesting questions about how food neophobia and temperament potentially shape longer-term eating and influence body weight,” said Faith.

The authors suggest parents should consider each child’s idiosyncrasies when thinking about how to increase a child’s acceptance of new foods. They say parents can serve as role models and provide repeated exposure to new foods at home, or show their child how much they enjoy the food being avoided.

“Each child may respond differently to each approach, and research needs to examine new interventions that take into account children’s individuality,” said Faith. “But what we do know through this and other emerging science is that this individuality includes genetic uniqueness.”

Previous studies have shown a similar genetic influence on food neophobia, 78 percent in children and 69 percent in adults, which suggests the impact of genes on food neophobia is constant across the developmental spectrum.

Sneaking vegetables and fruits into a child’s food isn’t a great strategy for overcoming this phobia either. Scientists wrote in the Journal of Nutrition Education and Behavior that picky eaters are less apt to like food they are unfamiliar with, even if the new food has been snuck into past meals by parents.

Researchers do suggest parents may want to try adding a little more color to a child’s meal to entice them to eat more nutritionally. Cornell University scientists found children were more attracted to food with color, particularly food plates with seven different items and six different colors.

“What kids find visually appealing is very different than what appeals to their parents,” said Brian Wansink, professor of marketing in Cornell’s Dyson School of Applied Economics and Management. “Our study shows how to make the changes so the broccoli and fish look tastier than they otherwise would to little Casey or little Audrey.”

Emotional Trends In Literature Reflect Real Historical Events, Say Researchers

Brett Smith for redOrbit.com – Your Universe Online

Using five million books digitized by Google in recent years as a database, a research team examined how often words that carry emotional content, or ‘mood words,’ were used throughout the 20th century. Previous work by one of the researchers, Vasileios Lampos from the University of Bristol in the UK, looked at the word content of Twitter messages in six mood categories: anger, disgust, fear, joy, sadness and surprise.
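The counting methodology can be sketched in a few lines. The tiny word lists below are invented stand-ins for the full emotion lexicons the researchers used, which are not reproduced here; the idea is simply to count mood-category words and normalize by total word count.

```python
# Illustrative sketch of mood-word counting over a text corpus.
# MOOD_LEXICON is a toy stand-in for the study's actual emotion word lists.

MOOD_LEXICON = {
    "joy":     {"happy", "delight", "cheerful", "joy"},
    "sadness": {"grief", "sorrow", "mourn", "sad"},
    "fear":    {"afraid", "dread", "terror", "fear"},
}

def mood_scores(text):
    """Return each mood's word count normalized by the total word count."""
    words = text.lower().split()
    total = len(words)
    return {
        mood: sum(w.strip(".,;!?") in vocab for w in words) / total
        for mood, vocab in MOOD_LEXICON.items()
    }
```

Applied year by year across a digitized corpus, scores like these yield the time series of mood trends the researchers describe.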

“We thought that it would be interesting to apply the same methodology to different media and, especially, on a larger time scale,” said lead author Alberto Acerbi, an anthropologist at the University of Bristol.

Acerbi said the team quickly picked up on distinct historical trends within the books’ word content.

“We were initially surprised to see how well periods of positive and negative moods correlated with historical events,” he said. “The Second World War, for example, is marked by a distinct increase in words related to sadness, and a corresponding decrease in words related to joy.”

The team also found that there was a distinct split between American and British literature around the 1960s. They found that American literature tended to become loaded with more emotional words toward the end of the 20th century compared to British literature.

“We don’t know exactly what happened in the Sixties but our results show that this is the precise moment in which literary American and British English started to diverge,” said co-author Alex Bentley from the University of Bristol. “We can only speculate whether this was connected, for example, to the baby-boom or to the rising of counterculture.”

“In the USA, baby boomers grew up in the greatest period of economic prosperity of the century, whereas the British baby boomers grew up in a post-war recovery period so perhaps ’emotionalism’ was a luxury of economic growth,” he added.

The researchers also found that writers from both sides of the Atlantic tended to use more words associated with ‘fear’ starting in the 1970s.

In their conclusion, the study’s authors left open the question of whether there is a direct causal connection between historical events and the emotional content of literature. While the word usages could represent actual moods or behaviors in society, they could also represent a form of escapism not connected to reality.

“It has been suggested, for example, that it was the suppression of desire in ordinary Elizabethan English life that increased demand for writing obsessed with romance and sex,” the research team wrote.

Acerbi noted that the digitization of literature has enabled anthropologists to apply new metrics to historical trends and data.

“Today we have tools that are revolutionizing our understanding of human culture and of how it changes through time,” he said. “Interdisciplinary studies such as this can detect clear patterns by looking at an unprecedented amount of data, such as tweets, Google trends, blogs, or, in our case, digitized books, that are freely available to everyone interested in them.”

Researchers Tackle Leukemia With Engineered T-Cell Therapy

Brett Smith for redOrbit.com – Your Universe Online

Job training can allow workers to take on new responsibilities, and a team of oncologists from Memorial Sloan-Kettering Cancer Center (MSKCC) in New York found that the same principle applies to the body’s immune system.

By essentially taking T cells, a type of white blood cell, back to school, the oncologists were able to successfully eradicate a certain type of acute leukemia.

Similar previous research has shown promise in young children, but the newly developed treatment was able to beat back cancer in four of the five adult patients who participated in a clinical trial. The previous treatment for acute lymphoblastic leukemia (ALL) was only about 30 percent effective in adults.

According to the oncologists’ report in the journal Science Translational Medicine, T cells play a huge role in the new treatment. These cells eliminate viruses and other invaders by recognizing the molecular ‘homing beacons’ that stud their surfaces. Because T cells cannot recognize certain beacons, however, they are unable to remove invaders such as cancer cells and some viruses.

Therefore, the oncologists decided to train the T cells to spot the beacons on leukemia’s cancerous cells and eliminate them. First, the doctors removed T cells from patients with ALL. Then, the research team introduced them to a harmless virus that delivered genes for a three-part molecule into the cells. One part of the molecule allows the T cells to recognize leukemia cells, which are marked with an antigen called CD19. Another part of the specially designed molecule trains the T cells to kill any such marked cells they find. The third part allows the T cells to survive longer than usual.

After a 10- to 12-day ‘training process,’ the modified T cells were returned to the five patients, ages 23, 58, 56, 59 and 66. The leukemia in four of the five patients went into remission after the treatment, becoming undetectable in 18 to 59 days. Unfortunately, the fifth patient was too sick to undergo a bone marrow transplant and died.

After their treatment sessions, one of the four patients relapsed and subsequently died of a blood clot. The three surviving patients have been in remission for five to 24 months, depending on when they were treated.

Despite the relative success of the treatment, the process was not without complications. After receiving the modified T cells, one patient developed a cytokine storm, in which cytokines, the immune system’s signaling molecules, are produced en masse, leading to dropping blood pressure and an intense fever. Another patient also suffered a cytokine storm, but both patients were successfully treated with steroids.

The researchers said the results of their study reinforced the clinical approach of using the body´s own immune system to fight off the advance of cancer and keep it at bay.

“The T cells are living drugs,” said study co-author Dr. Michel Sadelain. “They see the CD19, they kill the cancer cells, and they persist in the body.”

The oncology team said they are currently raising funds in the hopes of performing more extensive clinical trials in the near future. They also plan to see the treatment trials expanded to other cancer centers such as the Dana-Farber Cancer Institute in Boston.

Regenerative Heart Medicine Could Get Boost With Nanotechnology

redOrbit Staff & Wire Reports – Your Universe Online

Researchers at the Stanford University School of Medicine have developed a new visualization technique which they believe could eventually help make the repair of damaged hearts through regenerative medicine a reality.

In a study published in Wednesday’s edition of the journal Science Translational Medicine, senior author and Stanford radiology professor Sam Gambhir and colleagues describe how they plan to mark the stem cells that would be used in the repair process.

By marking the cells, doctors would be able to track them with standard ultrasound as they leave the needle and enter a patient’s body. The process would allow the stem cells to be guided to their intended destination more precisely, and would also allow doctors to monitor them using magnetic resonance imaging (MRI) for several weeks afterwards, the researchers explained.

To date, both human and animal trials in which stem cells were injected into cardiac tissue to treat severe heart attacks or heart failure have been largely unsuccessful, said Gambhir.

“We’re arguing that the failure is at least partly due to faulty initial placement,” he explained in a statement. “You can use ultrasound to visualize the needle through which you deliver stem cells to the heart. But once those cells leave the needle, you’ve lost track of them.”

For this reason, scientists have been unable to precisely determine whether or not the stem cells actually reached the heart wall, and whether they remained there or diffused away from the cardiac tissue. In addition, there has been no way to determine how long the cells managed to stay alive, or if they successfully replicate and eventually develop into heart cells.

The method developed by Gambhir’s team could help answer some of those questions.

“All stem cell researchers want to get the cells to the target site, but up until now they’ve had to shoot blindly,” he said. “With this new technology, they wouldn’t have to. For the first time, they would be able to observe in real time exactly where the stem cells they’ve injected are going and monitor them afterward.”

“If you inject stem cells into a person and don’t see improvement, this technique could help you figure out why and tweak your approach to make the therapy better,” Gambhir added.

In addition to the issues surrounding the initial placement of the therapeutic stem cells, tracking them once they enter the body has proven troublesome because there is no way to distinguish them from any other cell in the patient’s body. As a result, if an attempt to repair the heart fails, doctors often cannot pinpoint exactly why the process proved unsuccessful.

The new technique, however, aims to solve those problems by using extremely small nanoparticles that act as imaging agents. The nanoparticles, which have a diameter slightly less than one-third of a micron (less than one-thirtieth the diameter of a red blood cell), are made of silica so that they can be visualized by ultrasound. An MRI contrast agent known as gadolinium was also added to the imaging agents.

Gambhir and his colleagues were able to successfully demonstrate that mesenchymal stem cells — a class of cells frequently used in heart-regeneration research — could store the nanoparticles without sacrificing any of their ability to survive, replicate and differentiate into living heart cells.

Lead author Jesse Jokerst, a postdoctoral scholar in Gambhir’s lab, said there were concerns that the signal would be fairly weak. However, he and his colleagues found that once the nanoparticles were taken up, they clumped together within the cells, reflecting the ultrasound waves far more dramatically and providing a much stronger signal than anticipated.

Despite the optimism, it will probably be at least three years before the technique can be tested in humans.

Urine, Hair Samples Overestimate Mercury Exposure From Dental Fillings

April Flowers for redOrbit.com – Your Universe Online

A new study from the University of Michigan reveals that a common test currently used to determine mercury exposure from dental amalgam fillings may significantly overestimate the amount of toxic metal released from fillings.

Although scientists agree that mercury vapor is released into the mouth by dental amalgam fillings, the amount released remains controversial, as does the question of the health risk posed by this release.

Previous public health studies have assumed that the mostly inorganic mercury in urine can be used to estimate exposure to mercury vapor from amalgam fillings, while the mostly organic mercury in hair is used to estimate exposure to organic mercury from a person’s diet.

The new study, published in a recent issue of Environmental Science and Technology, measured mercury isotopes in the hair and urine of 12 Michigan dentists. The researchers found that the urine contained a mixture of mercury from two sources: organic mercury from the dietary consumption of fish, and inorganic mercury vapor from amalgam fillings.

“These results challenge the common assumption that mercury in urine is entirely derived from inhaled mercury vapor,” said Laura Sherman, a postdoctoral research fellow in the Department of Earth and Environmental Sciences (EES).

“These data suggest that in populations that eat fish but lack occupational exposure to mercury vapor, mercury concentrations in urine may overestimate exposure to mercury vapor from dental amalgams. This is an important consideration for studies seeking to determine the health risks of mercury vapor inhalation from dental amalgams,” U-M biogeochemist Joel D. Blum, professor in the Department of Earth and Environmental Sciences, said in a statement.

The study findings demonstrate that mercury isotopes can be used to test for exposure to the metal, and the related health risks, more accurately than traditional measurements of mercury concentrations in hair and urine samples. Isotopes provide a unique chemical tracer that scientists can use to “fingerprint” both organic and inorganic mercury sources.

Mercury is an element that occurs naturally, but more than 2,000 tons are emitted into the atmosphere annually from man-made sources such as coal-fired power plants, small-scale gold-mining operations, metals and cement production, incineration and caustic soda production.

These mercury emissions are deposited onto land and into water. Micro-organisms then convert some of the mercury to methylmercury, a highly toxic organic form that builds up in fish and in the animals that eat them, including humans. This buildup has harmful effects, including damage to the central nervous system, heart and immune system. Fetuses and young children, whose brains are still developing, are especially vulnerable.

Inorganic mercury, of the sort found in dental amalgam fillings, can also cause central nervous system and kidney damage. Inhalation of elemental mercury vapor accounts for the majority of exposure to inorganic mercury, with industrial workers, gold miners and dentists being at highest risk. The risk to dentists has been decreasing as they have increasingly switched to resin-based composite fillings and restorations in recent years.

A vast majority of inhaled mercury, approximately 80 percent, is absorbed into the bloodstream in the lungs and transported to the kidneys. It is then excreted in urine. The mercury in urine is almost entirely inorganic, leading scientists to use it as an indicator, or biomarker, for exposure to mercury from dental fillings.

The U-M study suggests that the mercury found in urine is actually a mixture of inorganic mercury from dental amalgams and methylmercury from fish that has undergone a chemical breakdown in the body called demethylation. In other words, a significant share of the inorganic mercury in urine is demethylated mercury that originated from fish rather than from fillings.

The research team used a natural phenomenon called isotopic fractionation to distinguish between the two types of mercury. All the atoms of a particular element, in this case mercury, have the same number of protons in the nucleus. There can be various forms of any given element, called isotopes, each with a different number of neutrons in the nucleus.

There are seven stable, or nonradioactive, isotopes of mercury. Different mercury isotopes react to form new compounds at slightly different rates during isotopic fractionation. Specifically, the U-M research team used a type of isotopic fractionation called mass-independent fractionation to obtain the chemical fingerprints. These chemical fingerprints enabled them to distinguish between exposure to methylmercury from fish and mercury vapor from dental amalgam fillings.
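Conceptually, once each source carries a distinct isotopic signature, apportioning urinary mercury between the two sources reduces to a standard two-endmember mixing calculation. The sketch below illustrates that calculation only; the signature values are invented for illustration and are not the study's measurements.

```python
# Two-endmember isotope mixing sketch. Assumes (hypothetically) that mercury
# from fish and mercury vapor carry distinct isotopic signatures; the values
# below are invented for illustration, not taken from the U-M study.

def vapor_fraction(sig_urine, sig_fish, sig_vapor):
    """Fraction of urinary mercury attributable to inhaled vapor.

    Solves sig_urine = f * sig_vapor + (1 - f) * sig_fish for f.
    """
    return (sig_urine - sig_fish) / (sig_vapor - sig_fish)

# With made-up signatures (in per mil), half the urinary mercury is
# vapor-derived:
f = vapor_fraction(sig_urine=0.6, sig_fish=1.2, sig_vapor=0.0)
```

If a subject's urine signature matched the fish endmember exactly, the vapor fraction would come out as zero, which is the scenario the study warns is being misread as vapor exposure.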

WHO Report Says Nurses Play Key Role In Fighting Deadliest Global Health Risks

Jason Pierce, MSN, MBA, RN for redOrbit.com — Your Universe Online

A report recently published by the World Health Organization (WHO) and co-authored by Dr. Linda Sarna emphasizes the vital role of nurses and midwives in reducing the worldwide impact of non-communicable diseases (NCDs).

“The global burden of non-communicable diseases is already high and continues to grow in all regions of the world,” said Sarna. “Nurses and midwives have the expertise to help individuals and communities improve health outcomes.”

With more than 19 million members worldwide, nurses and midwives make up the largest group of healthcare providers and are among the most respected and most trusted members of the healthcare team. The holistic approach used by nurses to address the health of individuals and communities incorporates strategies to prevent illness, promote wellness and build on existing strengths and resources. In addition, nurses frequently interact with patients of all ages, from every socioeconomic status, and in a variety of settings.

Noncommunicable diseases are illnesses that typically last a long time and lead to slow deterioration of an individual’s health. Four broad types of NCDs were discussed in the report: cardiovascular diseases, cancers, chronic respiratory diseases and diabetes. These four types were chosen by the WHO because they collectively account for 60 percent of deaths worldwide. In addition, 80 percent of deaths due to these NCDs occur in low- and middle-income countries, largely before the age of 60.

Although deadly, these four disease types are preventable by addressing four primary risk factors: tobacco use, alcohol abuse, physical inactivity and unhealthy diets. The report points out that nurses and midwives are uniquely positioned to address these risk factors through evidence-based interventions such as implementing smoking cessation interventions, promoting physical activity, and providing dietary education and guidance.

The report includes examples of nursing interventions designed to address NCDs and associated risk factors, and calls for nurses to become more involved in policy development, advocacy, research and education.

“The examples contained in the report are proven activities that nurses can start doing today to make a meaningful impact with their patients and in their community,” Sarna said. “Many of the interventions have been proven to reduce costs and improve the quality of care.”

The authors hope the report will inspire further expansion of nursing and midwifery practice related to disease prevention. The report encourages increasing nurse education in the area of counseling patients about unhealthy behaviors as well as greater funding for nursing research.

The report also calls for nurses and midwives to become more active in policy development and patient advocacy at the highest levels. For example, it suggests strategies such as working with legislators to draft policies and providing leadership toward developing standards of practice within organizations.

According to Sarna, “This document is a template for focused activities that nurses can implement today to reduce risk factors and that can direct policy and funding for education programs and research.”

Plant-Based Mediterranean Diet Can Be Much Easier On The Wallet, Study Claims

For even the casual reader of redOrbit.com, news of the benefits of adopting the so-called Mediterranean diet is nothing new. Most notably, our own Michael Harper has written some compelling articles on how this diet can lower your stroke risk and, in one very clever insurance scheme in South Africa, assist you in thinning your waistline while maintaining a plumped pocketbook.

Additionally, the introduction of olive oil into your diet, it has been determined, aids you in feeling fuller longer. The German study that introduced this idea compared olive oil to lard, butterfat and rapeseed oil and found the aroma compounds within olive oil were effective at providing a sense of satiety unmatched by the other three.

Another fantastic benefit of adopting a Mediterranean-style diet has been detailed in a new study conducted by researchers at The Miriam Hospital and the Rhode Island Community Food Bank.

Their findings, published in the March issue of the Journal of Hunger and Environmental Nutrition, are believed to be the first to show how placing importance on an increase in plant-based meals can lead to a decrease in food insecurity. Food insecurity is defined as a lack of access to nutritional foods for at least some days or some meals for members of a household.

The 34-week study, conducted by Mary Flynn, PhD, RD, LDN, and Andrew Schiff, PhD, recruited individuals to participate in an introductory six-week cooking program followed by the home implementation of simple, plant-based recipes. Their findings showed that adherence to the study’s aims resulted in a decrease in total food spending, an increase in the purchase of healthier food items and an increase in overall food security.

Flynn, a research dietitian at The Miriam Hospital, which is associated with Brown University, designed the study with Schiff, the chief executive officer of the Rhode Island Community Food Bank. The 34-week study was derived from previous research conducted by Flynn on a plant-based diet she developed that emphasizes cooking with olive oil, following a Mediterranean diet pattern.

“I had a number of people — mainly women from my breast cancer weight loss study — say how inexpensive a Mediterranean-style diet was, so I approached the food bank about designing a study using food pantry items for the recipes,” says Flynn.

Most people trying to put together a well-rounded menu for their household spend the bulk of their budget on meats, poultry and seafood. These items, especially the recommended lower-fat versions, tend to be the most expensive entries on a grocery store receipt. Families of lower socioeconomic status typically purchase these items first, leaving little, if anything, in the budget for healthier fruits and vegetables.

Flynn notes, however, that if shoppers can be persuaded to eliminate foods that are not needed to improve health from their shopping lists, a healthy diet can be quite economical. Among the foods scratched off the list are meats, snacks, desserts and carbonated beverages.

For the study, Flynn and Schiff recruited 83 clients from emergency food pantries and low-income housing sites. At the end of the 34-week study, 63 of the recruited individuals had completed the diet protocol and the six-month follow-up requirement.

The first six weeks of the study, as noted above, consisted of cooking classes in which instructors prepared quick and easy plant-based recipes that incorporated ingredients like olive oil, whole grain pasta, brown rice and fruits and vegetables. The participants’ progress was tracked for six months after the conclusion of the cooking program.

While participants were not required to assist in the preparation during the classes, the staff discussed, in limited detail, the benefits of some of the individual ingredients and encouraged the class to seek out those items in their own food pantry. No additional nutrition or food information was provided to the study participants.

One particular benefit for those attending the six-week cooking class was that they were provided with groceries that contained most of the ingredients discussed by the class facilitators. The allotted ingredients provided to the participants would allow them to make three of the discussed recipes for their family members.

Once the classes were over, the researchers collected grocery receipts throughout the remainder of the study. Analysis of these receipts showed a significant decrease in overall purchases of meats, carbonated beverages, desserts and snacks. This was particularly interesting to the research team as they never offered instruction to the participants to avoid purchasing these items.

Further review of the receipts showed each household enjoyed an increase in the total number of different fruits and vegetables consumed each month.

“Not only did study participants cut their food spending by more than half, saving nearly $40 per week, we also found that the reliance on a food pantry decreased as well, from 68 percent at the start of the study to 54 percent, demonstrating a clear decline in food insecurity,” Flynn says.

The research team also noted that following a plant-based diet resulted in unexpected health benefits for the study participants. Nearly half of all participants lost weight, which, according to Flynn, was not a study objective. Additionally, there was an overall decrease in the participants’ body mass index (BMI).

“Our results also suggest that including a few plant-based meals per week is an attainable goal that will not only improve their health and diet, but also lower their food costs,” concludes Flynn in a statement.

Diet fads have come and gone since the very first one in the late 1800s. However, each successive research study into plant-based and Mediterranean-style diets shows that the benefits to your overall health probably shouldn’t be ignored. And if that wasn’t enough of a nudge to adopt this diet, it turns out it’s healthier for your wallet, as well.

Brain Maps Help To Understand Cognitive Effects Of Alcohol On College Students

Alan McStravick for redOrbit.com – Your Universe Online

Each fall, legions of freshmen descend upon the campuses of our nation’s colleges and universities. For too many, the first real taste of freedom is found at the bottom of a beer bottle. With stories and studies on underage binge drinking peppering the news sites, a new study conducted by several Penn State scientists aimed to zero in on the long-term effects that drinking at this important stage might have on the neurological development of these post-adolescent students.

In this first-of-its-kind longitudinal pilot study, researchers used brain maps to study the neural processes that occur when students are presented with alcohol-related cues, and how those processes change during a freshman’s first year of college.

In just the 2012-2013 academic year, numerous stories have grabbed the attention of the nation, highlighting not only the negative social but also the negative physical effects associated with the dramatic increase in alcohol use that many students experience after setting foot on their higher education campus. Perhaps the most dramatic example came from the University of Tennessee and the now-infamous ‘butt chugging’ incident that sent one student to the hospital with alcohol poisoning.

Behind the headlines, however, the behavioral changes that students exposed to alcohol undergo go largely unreported. These behavioral changes are indicative of significant alterations in how the brain functions. While the immediate effects of alcohol use are known, its effect on the brain’s continuing development from adolescence into early adulthood is not well understood. This important period includes the transition from high school to college.

The research team, headed up by psychology graduate student Adriene Beltz, sought to investigate the changes to the neural processes in the brains of a small group of first-year students as a result of exposure to alcohol.

To conduct their study, researchers recruited 11 incoming students. Each participant underwent functional magnetic resonance imaging (fMRI) along with a data analysis technique known as effective connectivity mapping. The students underwent three separate fMRI and mapping sessions, beginning just before the start of classes and culminating midway through their second semester.

“We wanted to know if and how brain responses to alcohol cues — pictures of alcoholic beverages in this case — changed across the first year of college, and how these potential changes related to alcohol use,” said Beltz. “Moreover, we wanted our analysis approach to take advantage of the richness of fMRI data.”

Upon reviewing the collected data, the team says the study participants showed signs, in their brains’ emotion processing networks, of habituation to alcohol-related stimuli. Additionally, the researchers found a significant alteration in the region of the brain responsible for cognitive control.

Previous studies have suggested that young adults’ cognitive development isn’t fully complete until their mid-20s, specifically in those regions of the brain responsible for decision-making and judgment. The timing of this development has been likened to a form of cognitive ‘fine tuning,’ the culmination of which helps define who we are and who we will become.

Still other studies have suggested that binge drinking during this integral period of neural development may negatively affect the brain in ways that could last into adulthood.

THE BRAIN AS A COMPLEX NETWORK

This current research by Beltz and colleagues suggests that dramatic changes may occur among the emotion processing and cognitive control regions of the brain as a result of exposure to both alcohol and alcohol-related cues. These changes, they say, might also exert influence over regions within the brain responsible for a young adult´s decision-making and judgment abilities.

As Beltz explained: “The brain is a complex network. We know that connections among different brain regions are important for behavior, and we know that many of these connections are still developing into early adulthood. Thus, alcohol could have far-reaching consequences on a maturing brain, directly influencing some brain regions and indirectly influencing others by disrupting neural connectivity.”

The student participants in the study were observed in an fMRI scanner at the Penn State Social, Life, and Engineering Sciences Imaging Center. While in the scanner, the participants were asked to complete a task for the researchers. Their task involved responding as quickly as possible by pressing a button on a grip device to an image of either an alcoholic or non-alcoholic beverage when presented with images of both consecutively on a screen. The resultant data allowed the researchers to create an effective connectivity map for each individual as well as for the group as a whole.

After they finalized connectivity maps, the researchers noted brain regions involved in the processing of emotion exhibited a vastly diminished connectivity when the study participants responded to an alcohol cue as opposed to a non-alcohol cue. The timing of the measurements also showed that regions of the brain associated with cognitive control displayed greater connectivity during the first semester for these new students. What this finding suggests is that the student participants needed to heavily recruit brain regions involved in cognitive control in order to overcome the alcohol-associated stimuli when they were instructed to respond to the non-alcohol cues presented to them.

“Connectivity among brain regions implicated in cognitive control spiked from the summer before college to the first semester of college,” said Beltz. “This was particularly interesting because the spike coincided with increases in the participants’ alcohol use and increases in their exposure to alcohol cues in the college environment. From the first semester to the second semester, levels of alcohol use and cue exposure remained steady, but connectivity among cognitive control brain regions decreased. From this, we concluded that changes in alcohol use and cue exposure — not absolute levels — were reflected by the underlying neural processes.”

While the team believes their pilot study paints a clear picture of the effects of alcohol use in these first-year students, they note that there are still a number of unanswered questions in relation to the longer-term effects of alcohol use on neural development, both after the first year of college and, perhaps more importantly, later in the individuals´ lives.

Because the long-term effects on neural development are still unknown, Beltz intends to conduct a follow-up study, tracking a larger number of participants over a greater length of time.

Parents, no doubt, will be concerned about cutting ties with their kids as they send them off to school, in light of this most recent research. However, another recent study from Penn State may help put their minds at ease. In it, researchers found that parents who have a meaningful conversation with their children about alcohol use can positively affect their children’s relationship with alcohol once they arrive on campus.

Tiny Implantable Blood Lab Can Transmit Patient Data To The Doctor

Brett Smith for redOrbit.com – Your Universe Online

Swiss scientists from the École polytechnique fédérale de Lausanne (EPFL) have developed a tiny blood-work laboratory that sits just under the skin and wirelessly transmits results to a patient´s physician.

According to a statement from the EPFL team, the device can detect as many as five proteins and organic acids simultaneously, allowing for a more personalized and immediate level of care than traditional blood test monitoring. The device could be particularly useful for patients with a chronic illness or those undergoing chemotherapy.

A report on the device will be published and presented today at the 2013 Design, Automation and Test in Europe conference (DATE 13), Europe’s largest electronics show.

Developed by a team led by EPFL scientists Giovanni de Micheli and Sandro Carrara, the bio-implant measures only a few millimeters across and operates on a battery that is contained within a patch, which sits on top of the skin and generates 0.1 watts of electricity.

To collect information from a patient´s bloodstream, sensors on the implant are individually coated with an enzyme.

“Potentially, we could detect just about anything,” explains De Micheli. “But the enzymes have a limited lifespan, and we have to design them to last as long as possible.”

The enzymes remain active for about a month and a half — which is long enough to perform the necessary amount of blood work in most cases.

Once the information has been collected, the implant sends information to the skin patch via radio waves. After the patch collects the data, it transmits the information via Bluetooth to a mobile phone, which then sends it on to the doctor over a wireless network.

The scientists emphasized that the device could be particularly useful during chemotherapy treatments since oncologists use periodic blood tests to evaluate their patients’ ability to tolerate a treatment dosage.

De Micheli said the new system will enable a better, more personalized form of chemotherapy.

“It will allow direct and continuous monitoring based on a patient’s individual tolerance, and not on age and weight charts or weekly blood tests,” he said.

For patients who suffer from a chronic condition, the device will be able to detect early warning signs of the condition progressing into a more serious stage.

“In a general sense, our system has enormous potential in cases where the evolution of a pathology needs to be monitored or the tolerance to a treatment tested,” De Micheli said.

The EPFL prototype was set up to test for five different substances and compared favorably to traditional analysis methods in a laboratory setting, according to the EPFL scientists.

To make the device, the researchers enlisted electronics experts, computer scientists, doctors and biologists from EPFL, and other Swiss institutions, including the Istituto di Ricerca di Bellinzona, the Swiss Federal Laboratories for Materials Science and Technology (EMPA) and ETHZ, a university in Zurich. The team predicted that the system will be commercially available within four years.

Heart Attack Prevention With Aspirin Still Popular, Despite Lack Of Beneficial Evidence

redOrbit Staff & Wire Reports – Your Universe Online

Using aspirin for cardiovascular health remains a popular preventative medicine regimen for many people, even though recent studies have suggested that the popular painkiller might not be as beneficial as previously believed, according to new research published in the journal Canadian Family Physician.

Study authors Olga Szafran and Mike Kolber of the University of Alberta´s Department of Family Medicine polled patients over the age of 50 from two Alberta-based clinics.

They discovered that more than 40 percent of those individuals took daily doses of aspirin in order to prevent a heart attack, stroke, or similar cardiovascular ailment.

“A lot of this comes from many years ago where they did this study on physicians and it showed that physicians who take Aspirin seem to do better than those who don’t,” Kolber said in a statement.

“The problem is, physicians aren’t a generalized group of people, physicians are healthier and they’re educated,” he added. “All the literature that’s been coming out over the last three to five years, said Aspirin for primary prevention really doesn’t change long-term mortality.”

In addition, the researchers also found that 62 percent of those individuals who have cardiovascular disease are also taking the anti-inflammatory medication each day.

They refer to this style of treatment as secondary prevention. Unlike the previous group, who were practicing what is known as primary prevention, this group actually benefits from a regular aspirin regimen, the researchers report.

“I think the hope is that this paper will sensitize physicians to their own practice and create a growing awareness of the issue,” Szafran said.

“If we can get it out there and ensure physicians and patients have the discussion, perhaps we could shift the use a little bit,” added Kolber. “Discuss the importance, if you have cardiovascular disease, of taking an anti-platelet like Aspirin, and probably less important if you don’t have an event, to be taking that.”

According to the US National Library of Medicine (NLM), nonprescription aspirin can be used to prevent heart attacks in those who have already had one and those who have angina, or chest pains resulting from a lack of oxygen to the heart.

The drug can also be used to reduce the fatality risk of those who are experiencing or have recently experienced a heart attack, and is capable of preventing some types of stroke, including those that occur when a blood clot blocks the flow of blood to the brain, they explained.

Stroke Victims Under The Age Of 50 More Likely To Die Within Two Decades

redOrbit Staff & Wire Reports – Your Universe Online

Adults who suffer a stroke before the age of 50 are far more likely to die within the next two decades than the general population, according to a new study published in Wednesday´s edition of JAMA.

According to background information presented with the study, six million people die each year as a result of stroke, which is a loss of brain function due to a disturbance in the brain’s blood supply. While stroke primarily affects the elderly, an estimated 10 percent of all strokes occur in patients 18 to 50 years of age. Even so, the researchers report that there has been limited research on the long-term prognosis for stroke survival in this younger group.

Loes C. A. Rutten-Jacobs of Radboud University Nijmegen Medical Centre (UMCN) and colleagues set out to correct that by conducting a long-term mortality and cause-of-death study focusing on first-time stroke victims under 50 years of age. They then compared their findings with national age- and gender-related mortality rates.

They focused on 959 patients who were admitted to medical facilities with one of three different types of strokes — transient ischemic attack (TIA), ischemic stroke, or hemorrhagic stroke — between January 1980 and November 2010.

Of those, 262 people suffered a TIA (also known as a mini-stroke), which occurs due to a disruption of blood flow to the brain without tissue death; 606 suffered an ischemic stroke, which occurs when part of the brain’s blood supply decreases, resulting in damage to the tissue in that area; and 91 suffered an intracerebral hemorrhage, a stroke that originates within the brain itself, often due to trauma.

The survival status of each of those subjects was reviewed as of November 1, 2012, with an average follow-up duration of just over 11 years, Rutten-Jacobs and co-authors explained. During follow-up, 20 percent of the patients (192) had died. Mortality risk was 24.9 percent for TIA patients, 26.8 percent for ischemic stroke patients, and 13.7 percent for intracerebral hemorrhage patients.

They also found that mortality remained elevated among those who survived the first 30 days after an ischemic stroke, and that the cumulative 20-year mortality rate for those stroke victims was higher in men (33.7 percent) than women (19.8 percent). Furthermore, the study revealed excess mortality for stroke victims in comparison to the general population, even decades after suffering a cerebrovascular event.

“This may suggest that the underlying (vascular) disease that caused the stroke at relatively young age continues to put these patients at an increased risk for vascular disease throughout their lives,” the authors explained in a statement Tuesday. “It may also be noted that risk factors indicated in the study group, such as smoking and alcohol consumption, seem likely to confer risk as well.”

“Although data are currently lacking, the observation of long-term increased risk for vascular disease could have important implications for the implementation of secondary prevention (both medical and lifestyle) treatment strategies,” they added. “Future studies should address the role of this stringent implementation in these patients with young stroke.”

Genetically Altered Tomatoes Help Promote Good Cholesterol

Lee Rannals for redOrbit.com — Your Universe Online

UCLA researchers writing in the Journal of Lipid Research say they have genetically engineered tomatoes to produce a peptide that mimics the actions of good cholesterol when consumed.

The authors said this is one of the first examples of a peptide that acts like the main protein in good cholesterol, and it can be delivered simply by eating the fruit. A peptide is a chemical compound containing two or more amino acids coupled by a peptide bond, which is a special link in which a nitrogen atom of one amino acid binds to the carboxyl carbon atom of another.

“There was no need to isolate or purify the peptide – it was fully active after the plant was eaten,” said senior author Dr. Alan M. Fogelman, executive chair of the department of medicine and director of the atherosclerosis research unit at the David Geffen School of Medicine at UCLA.

Once the tomatoes were eaten, researchers found the peptide was active in the small intestine, but not in the blood. This suggests targeting the small intestine may be a new strategy to prevent diet-induced atherosclerosis, which is a disease of the arteries that leads to heart attacks and strokes.

The team genetically engineered tomatoes to produce a small peptide that mimics the action of the chief protein in high-density lipoprotein, or apoA-1. Scientists then fed the genetically engineered tomatoes to mice that lacked the ability to remove low-density lipoprotein from their blood. They found mice that ate the peptide-enhanced tomatoes had significantly lower levels of inflammation, higher levels of good cholesterol, and decreased lysophosphatidic acid, which accelerates plaque build-up in the arteries.

After the mice finished eating, the team observed the intact peptide was found in the small intestine, which suggests the peptide acted in the small intestine and was then degraded to natural amino acids before being absorbed into the blood.

“It seems likely that the mechanism of action of the peptide-enhanced tomatoes involves altering lipid metabolism in the intestine, which positively impacts cholesterol,” said the study’s corresponding author, Srinivasa T. Reddy, a UCLA professor of medicine and of molecular and medical pharmacology.

Previous studies suggested a large number of conditions with an inflammatory component might benefit from treatment with an apoA-1 mimetic peptide, including Alzheimer’s disease. With many chronic diseases, inflammation becomes an abnormal, ongoing process with long-lasting harmful effects. Fogelman said if the work in the animal trial is applied to humans, consuming food genetically altered to contain apoA-1 related peptides could improve these conditions.

“This is one of the first examples in translational research using an edible plant as a delivery vehicle for a new approach to cholesterol,” said Judith Gasson, a professor of medicine and biological chemistry, director of UCLA’s Jonsson Comprehensive Cancer Center and senior associate dean for research at the Geffen School of Medicine. “We will be closely watching this novel research to see if these early studies lead to human trials.”

Genetically altering food has been a popular focus for scientists in recent years. In 2011, scientists in China wrote in the Proceedings of the National Academy of Sciences about how they genetically modified grains of rice to help burn victims and patients who suffered severe blood loss. The scientists said their genetically altered rice could help reduce the need for hospitals to obtain plasma through donations.

Depression In Alzheimer’s Patients Associated With Declining Ability To Handle Daily Activities

Worsened cognitive status also associated with faster decline in functional abilities

More symptoms of depression and lower cognitive status are independently associated with a more rapid decline in the ability to handle tasks of everyday living, according to a study by Columbia University Medical Center researchers in this month’s Journal of Alzheimer’s Disease.

Although these findings are observational, they could suggest that providing mental health treatment for people with Alzheimer’s disease might slow the loss of independence, said senior author Yaakov Stern, PhD, professor of neuropsychology (in neurology, psychiatry, psychology, the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain and the Gertrude H. Sergievsky Center) at CUMC.

“This is the first paper to show that declines in function and cognition are inter-related over time, and that the presence of depression is associated with more rapid functional decline,” said Dr. Stern, who also directs the Cognitive Neuroscience Division of the Department of Neurology at CUMC.

Because almost half of Alzheimer’s patients have depression, the researchers, who were studying the long-term association between cognitive and functional abilities in the disease, also looked at the role of depressive symptoms in disease progression. They reviewed data that tracked changes in cognition, depression, and daily functioning in 517 patients with probable Alzheimer’s at NewYork-Presbyterian Hospital/Columbia, Johns Hopkins School of Medicine, Massachusetts General Hospital, and the Hôpital de la Salpêtrière in Paris, France. Patients were assessed prospectively every six months for more than 5.5 years.

“Making a prognosis for Alzheimer’s disease is notoriously difficult because patients progress at such different rates,” said first author Laura B. Zahodne, PhD, a postdoctoral fellow in the cognitive neuroscience division in the Department of Neurology and the Taub Institute at CUMC. “These results show that not only should we measure patients’ memory and thinking abilities, we should also assess their depression, anxiety, and other psychological symptoms that may affect their prognosis.”


Military Personnel Return To Duty Following Severe Injury To Lower Extremity

Return to Run orthotic rehabilitation initiative successfully returns wounded service members to duty

High-energy lower-extremity trauma (HELET) is common in modern warfare, often resulting in severe tissue damage, chronic pain, neurovascular injury and significant muscle loss, according to new research presented today at the 2013 Annual Meeting of the American Academy of Orthopaedic Surgeons (AAOS).

The Return to Run (RTR) program is an integrated orthotic and rehabilitation initiative designed to return high-level function to wounded warriors. It includes use of the new Intrepid Dynamic Exoskeletal Orthosis (IDEO), a custom-fit device made from carbon and fiberglass that supports the foot and ankle allowing for greater mobility and vigorous rehabilitation.

In the study, researchers reviewed RTR records of 14 Special Operations Command (SOCOM) personnel — 10 Army Special Forces soldiers, three Navy SEALs, and one Air Force Pararescue Jumper (PJ) — who sustained HELET injuries and completed the RTR program. Records were reviewed for functional capabilities — the ability to walk, run and jump without assistive devices; occupational capabilities — standing continuously for more than one hour, moving with a load of 20 pounds or more, and the ability to return to duty and combat; and recreational capabilities — running and agility sport participation.

Following RTR, 13 of the 14 service members (including three who had initially considered amputation) were deemed fit to return to duty, and at least seven returned to combat.


When Healthy Food Is Cheap, People Eat More Of It

Michael Harper for redOrbit.com — Your Universe Online

Americans are pretty good at knowing what they should and shouldn’t eat. One would be hard pressed to find someone who looks at a cheeseburger and sees a healthy meal. Yet there exists a significant gap between what we know we should eat and what we choose when meal time comes around. One of the most common excuses for not eating a healthier diet is the high cost of organic foods, produce and other natural foods.

In South Africa, however, grocery store shoppers can earn a rebate for choosing better foods, bringing the overall cost of a healthier diet to a more comfortable level. Now, the RAND Corporation has analyzed the results of this program and claims that when the price is right, people will choose the healthier option. This provides an interesting data point in an ongoing debate about similar programs being implemented in the US.

South Africa’s largest private insurance company, Discovery, first rolled out the “HealthyFood” program in 2009. Under this program, all Discovery insurance subscribers can earn up to 25 percent cash back on all healthy food purchases. Family members from more than 260,000 South African households can shop at nearly 800 participating supermarkets to take advantage of this program.

A panel of nutritionists has determined which items can be considered “healthy” and can therefore earn shoppers a discount on their bill. There are currently over 6,000 items eligible for a rebate through the HealthyFood program. According to a press release, these items account for 20 percent of an average shopper’s food bill.

The RAND team gathered data from the HealthyFood participating supermarkets and surveyed 350,000 South Africans, including those who did not take advantage of HealthyFood. After analyzing the data using several different techniques, the RAND Corporation found lower prices for healthier food were associated with a better diet.

“These findings offer good evidence that lowering the cost of nutritionally preferable foods can motivate people to significantly improve their diet,” explained Roland Sturm, co-author of this study and senior economist at RAND.

“But behavior changes are proportional to price changes. When there is a large gap between people’s actual eating behaviors and what nutritionists recommend, even a 25 percent price change closes just a small fraction of that gap.”

Specifically, the supermarket data reveals that a 25 percent discount can increase the amount of healthy food a shopper purchases by 9.3 percent. When given the same 25 percent rebate, shoppers bought 8.5 percent more fruits and veggies than before and purchased 7.2 percent less of the bad stuff.

HealthyFood shoppers are given rebates on fruits, vegetables, nonfat dairy and foods which are rich in whole grains. This 25-percent rebate has encouraged these shoppers to choose less of the unhealthy foods, such as sugary foods, foods with high salt content, fast food items and processed meat. The RAND study also notes during the course of the study, the price subsidies remained stable and the health effects remained positive.

The RAND team also asked participants to complete self-surveys about their diets. According to these surveys, HealthyFood shoppers ate half a serving more of fruits and vegetables each day than those who were not a part of the program. HealthyFood shoppers were also less likely to eat fast food or any of the other items listed as unhealthy.

Even this small change in eating habits had a significant effect on the participants’ overall health. According to the study, HealthyFood members began to lose weight, thus reducing the number of obese subscribers to Discovery’s insurance plans.

President’s Bioethics Commission Releases Report On Pediatric Medical Countermeasure Research

Recommends that multiple steps must be taken before ethical pediatric anthrax vaccine trials can be considered by the US government

In a report released today, the Presidential Commission for the Study of Bioethical Issues concluded that the federal government would have to take multiple steps before anthrax vaccine trials with children could be ethically considered. The Bioethics Commission was responding to a request from Health and Human Services Secretary Kathleen Sebelius who last year asked the members to study the question of anthrax vaccine trials with children after receiving a recommendation from another federal committee that such research be initiated, pending ethical review.

“The safety of our children is paramount and we have to get this precisely right,” said Commission Chair Amy Gutmann, Ph.D. “The Bioethics Commission concludes that many significant steps would have to be taken, including additional minimal-risk research with adult volunteers, before pediatric anthrax vaccine trials prior to an attack should be considered.”

A major ethical consideration in “pre-event” trials, in which testing occurs before an actual or imminent attack, is that children do not stand to benefit directly from participating in the study, and so risk must be kept very limited and small. In addition to recommending that pre-event trials with children not go forward in the absence of further testing on adults, the Bioethics Commission clarifies other rigorous conditions that must be met before such pediatric research may ethically proceed.

“Out of respect for every individual, our nation must protect children enrolled in research studies while also doing its best to develop the knowledge needed to save children’s lives during a possible emergency,” Gutmann said.

In the report, Safeguarding Children: Pediatric Medical Countermeasure Research, the Bioethics Commission also more generally considered the ethics of research on pediatric medical countermeasures (MCM), the catchall term for the use of federally-regulated drugs and products in response to chemical, biological, radiological, and nuclear attacks.

Background:

In 2011, the U.S. government conducted a bioterrorism preparedness exercise to study the likely results of a large-scale release of weaponized anthrax spores in a city such as San Francisco. The casualty estimates were staggering: almost 8 million citizens would be affected, nearly a quarter of them children.

If such an event were to occur, current federal plans call for immediate distribution of antibiotics proven effective in treating anthrax infections, and a follow-up widespread vaccination program using Anthrax Vaccine Adsorbed (AVA). Vaccination is believed necessary because anthrax spores would likely continue to pose a threat of infection long after the initial release had taken place. AVA administration is challenging, however, because there is no definitive understanding of its effect in children despite having been in commercial production for more than four decades and having been safely administered to more than a million adults in the military.

After the 2011 bioterrorism exercise, the National Biodefense Science Board (NBSB) recommended that the U.S. government conduct a study to test the safety and effectiveness (“immunogenicity”) of AVA in children before an anthrax attack occurs, contingent upon a review of the ethical issues. Secretary Sebelius called on the Bioethics Commission for a thorough ethical review.

“The Bioethics Commission recognizes both the federal government’s fundamental duty to protect individual children from undue risk during research and its obligation to protect all children — as far as ethically and practically possible — during an emergency by being prepared,” said Daniel Sulmasy, M.D., Ph.D., Member of the Bioethics Commission.

Ethical Considerations and Recommendations:

Research with children is ethically distinct from other research, especially when the research in question promises no prospect of direct benefit for the participants. Competent adults can consent to accept risks for the benefit of others during research. Children are legally prohibited and ethically unable to consent to accept this burden.

Pediatric MCM research that would take place before a bioterrorism attack occurs is also ethically distinct from pediatric MCM research that would take place after an attack. Pre-event pediatric research involving MCMs involves research on a hypothetical condition with an undefined (and perhaps unknowable) likelihood of occurring.

“While the knowledge gained could be very useful in the event of an attack, we may never have — and hope never to have — occasion to use it,” Gutmann said.

By contrast, post-event testing offers a chance of directly benefiting participants and the opportunity to learn about their condition, which resulted from the attack.

Because the individual children who would be enrolled in pre-event MCM research do not stand to directly benefit from the research, the Bioethics Commission concludes that, absent extraordinary circumstances, pre-event MCM research with children is ethical only if it presents “no more than minimal risk” to study participants, where “minimal” means no greater risk than that routinely faced by a healthy child in daily life or at a medical check-up.

In keeping with its recommendation of a strict risk limit in pre-event pediatric MCM research, the Bioethics Commission called for completing all prior ethically sound testing — for example, modeling, testing in animals, and testing in the youngest adults — to assess the level of risk likely posed by pre-event pediatric MCM research. If the risk level for the oldest group of children is determined to be minimal, then progressive testing with younger and younger children should be employed, beginning with the oldest children in order to provide additional protection to younger children. This approach, called age de-escalation, would help to ensure that data from an older age group inform the research design and risk level for the next younger age group. For example, an intervention shown to be minimal risk in the youngest adults — adults 18 years of age — may make it possible to infer that a study with the oldest children, those 16 and 17 years of age, would present only minimal risk.

When it is impossible to design a pre-event pediatric MCM trial that poses no more than minimal risk, the proposed study must first pass muster under a “national-level review.” The Commission recommends a carefully specified set of strict preconditions before a national-level review of pre-event pediatric medical countermeasure research can proceed:

- Researchers must demonstrate and reviewers must concur that it is impossible to design a minimal risk study; and
- The proposed study still must pose no more than a minor increase over minimal risk to research participants, a level that still presents no substantial threat to a child’s health or well-being.

If a proposed research trial reaches this point, the Bioethics Commission recommends that reviewers be required to employ the rigorous ethical framework that is developed in its report, and cautions that it should be applied only in the rare circumstances where research risks present a minor increase over minimal.

“Current regulations are ambiguously worded, and review panels examining proposed pediatric MCM research would have difficulty applying them consistently. Whether the criteria outlined in the Commission’s ethical framework lead to approval or disapproval of proposed MCM research, they clarify what is at stake,” said Yolanda Ali, M.B.A., a member of the Bioethics Commission.

If there is an attack, post-event MCM research might offer the prospect of direct benefit to participants, or of gaining generalizable knowledge about the participants’ condition because the participants would already have been exposed to a pathogen during an attack. This is ethically distinct from pre-event research.

The Bioethics Commission recommends that post-event research be planned for and conducted when either untested or minimally tested MCMs are used to protect children in an emergency situation. In addition, the Bioethics Commission recommends that if children receive untested MCMs in an emergency in an effort to save lives, that rigorous research protections be in place.

As it examined the ethical issues surrounding research with children, the Bioethics Commission built on its previous work on the issue of protecting research participants. In December 2011, it released Moral Science: Protecting Participants in Human Subjects Research, a report that assesses the current rules and regulations that protect research participants.

“The rules that protect children are even more stringent, as they should be,” Gutmann said. “Medical countermeasure research warrants an ongoing national conversation to ensure an unwavering commitment by our society to safeguard all children both from unacceptable risks in research and through ethically sound research that promotes their health and well-being.”


Alternative Fuels Could Cut 80 Percent Of Greenhouse Gas Emissions By 2050

redOrbit Staff & Wire Reports – Your Universe Online

The United States could cut petroleum consumption and greenhouse gas emissions by 80 percent by 2050 for light-duty vehicles through the use of alternative fuels and strong government policies aimed at overcoming high costs and influencing consumer choices, according to a National Research Council report released on Monday.

“To reach the 2050 goals for reducing petroleum use and greenhouse gases, vehicles must become dramatically more efficient, regardless of how they are powered,” said Douglas Chapin, principal of MPR Associates and chair of the committee that wrote the report.

“In addition, alternative fuels to petroleum must be readily available, cost-effective and produced with low emissions of greenhouse gases. Such a transition will be costly and require several decades,” he said in a statement.

“The committee’s model calculations, while exploratory and highly uncertain, indicate the benefits of making the transition, i.e. energy cost savings, improved vehicle technologies, and reductions in petroleum use and greenhouse gas emissions, exceed the additional costs of the transition over and above what the market is willing to do voluntarily,” he added.

Improving the efficiency of conventional vehicles is, up to a point, the most economical and easiest-to-implement approach to saving fuel and lowering emissions, according to the report. This approach includes reducing work the engine must perform by reducing the vehicle weight, aerodynamic resistance, rolling resistance, and accessories, along with improving the efficiency of the internal combustion engine powertrain.

However, improved efficiency alone will not meet the 2050 goals, wrote the authors of the report.

Indeed, the average fuel economy of vehicles on the road would have to exceed 180 mpg — something highly unlikely with current technologies.

With this in mind, the study committee analyzed vehicle and fuel alternatives, including hybrid electric vehicles like the Toyota Prius, plug-in hybrid electric vehicles like the Chevy Volt, battery electric vehicles such as the Nissan Leaf, hydrogen fuel cell electric vehicles and compressed natural gas vehicles like the Honda Civic Natural Gas.

While the per-mile driving costs will be lower, particularly for vehicles powered by natural gas or electricity, the high initial purchase cost would likely be a significant barrier to broad consumer acceptance, according to the report. Indeed, all the vehicles considered in the report are several thousand dollars more expensive than today’s conventional vehicles, and will likely remain so in the future.

Furthermore, alternative vehicles will likely be limited to just a few body styles and sizes, particularly in the early years. The authors of the report also predict some vehicles will rely on fuels that are not readily available, or that have restricted travel range, while others may require bulky energy storage that will limit their cargo and passenger capacity.

GOVERNMENTS MUST PLAY A ROLE

This is where governments can help, since widespread consumer acceptance is one of the most critical factors, and large numbers of alternative vehicles must be purchased long before 2050 if the on-road fleet is to meet desired performance goals.

Strong policies and technology advances are essential in overcoming this challenge, the committee said.

The report identified several scenarios that could meet the 2050 greenhouse gas goal, each of which combines highly efficient vehicles with at least one of three alternative power sources: biofuel, electricity, or hydrogen.

Natural gas vehicles were considered, but their greenhouse gas emissions are too high for the 2050 goal. However, if the costs of these vehicles can be reduced and an appropriate refueling infrastructure created, they have great potential for reducing petroleum consumption.

While corn-grain ethanol and biodiesel are the only biofuels to have been produced in commercial quantities in the US, the study committee found much greater potential in biofuels made from lignocellulosic biomass, which includes crop residues such as wheat straw, along with switchgrass, whole trees and wood waste.

This “drop-in” fuel is designed to be a direct replacement for gasoline and could lead to large reductions in both petroleum use and greenhouse gas emissions. It can also be introduced without major changes in fuel delivery infrastructure or vehicles.

The report finds sufficient lignocellulosic biomass could be produced by 2050 to meet the goal of an 80 percent reduction in petroleum use when combined with highly efficient vehicles.

Vehicles powered by electricity will not emit any greenhouse gases, but the production of electricity and the additional load on the electric power grid are factors that must be considered.

To the extent fossil resources are used to generate electricity, successful implementation of carbon capture and storage will be essential, the authors concluded. These vehicles also rely on batteries, which are projected to drop steeply in price.

Nevertheless, limited range and long recharge times will likely restrain the use of all-electric vehicles mainly to local driving.

Meanwhile, all of the advanced battery technologies under development face serious technical challenges. Hydrogen presents a different trade-off: when hydrogen powers a fuel cell electric vehicle, the only vehicle emission is water. However, varying amounts of greenhouse gases are emitted during hydrogen production, and the low-greenhouse-gas methods of making hydrogen are more expensive and will need further development to become competitive.

Hydrogen fuel cell vehicles could become less costly than the advanced internal combustion engine vehicles of 2050. Fuel cell vehicles are not subject to the limitations of battery vehicles, but developing a hydrogen infrastructure in concert with a growing number of fuel cell vehicles will be difficult and expensive, according to the report.

FOCUS ON EFFICIENCY

The technology advances required to meet the 2050 goals are challenging and not assured. Yet, the committee considers dramatic cost reduction and overall performance enhancement possible without unpredictable technology breakthroughs.

Rather, achieving these goals requires that improved technology focus on reducing fuel use rather than on adding greater power or weight.

It is impossible to know which technologies will ultimately succeed, the report says, because all involve uncertainty. The best approach is to promote a portfolio of vehicle and fuel research and development, supported by both government and industry, designed to solve the critical challenges in each major candidate technology.

Such primary research efforts need continuing evaluation of progress against performance goals to determine which technologies, fuels, designs, and production methods are emerging as the most promising and cost-effective, the committee said in its report.

NEW POLICIES NEEDED

Overcoming the barriers to advanced vehicles and fuels will require a rigorous policy framework that is more stringent than the proposed fuel economy standards for 2025. This policy intervention could include high and increasing fuel economy standards, R&D support, subsidies, and public information programs aimed at improving consumers’ familiarity with the new fuels and powertrains. Because of the high level of uncertainty in the pace and scale of technology advances, this framework should be modified as technologies develop and as conditions change.

It is essential that policies promoting particular technologies to the public not be introduced before these new fuels and vehicle technologies are close to market readiness and consumer behavior toward them is well understood. The report warns that forcing a technology into the market should be undertaken only when the benefits of the proposed support justify its costs.

‘Brazilians’ And Other Types Of Pubic Hair Removal May Boost Viral Infection Risk

Microtrauma caused by shaving and scratching might aid the spread of Molluscum contagiosum

Historically, pubic hair used to be removed for religious or cultural reasons, but in recent decades it has become fashionable to shave it off, with men also increasingly following the trend, say the authors.

Molluscum contagiosum is a pox virus, which is relatively common in children and people whose immune systems are compromised by illness or drugs. But it can also be passed on through sex, and over the past decade the number of sexually transmitted cases has risen.

The authors wanted to know if the rise in the number of such infections was connected to the increasing popularity of pubic hair removal among patients who visited a private skin clinic in Nice, France, between January 2011 and March 2012.

Of the 30 cases infected with Molluscum contagiosum during this time, six were women, and the average age of the entire group was 29.5.

Signs of the infection (pearly papules) had spread up to the abdomen in four cases and to the thighs in one. In 10 cases, there were other associated skin conditions, including ingrown hairs, warts, folliculitis (bacterial skin infection), cysts and scars.

Among the 30 patients, most (93%) had had their pubic hair removed, with most opting for shaving (70%). Among the rest, it had either been clipped (13%) or waxed (10%).

As Molluscum contagiosum can spread relatively easily by self-infection, such as through scratching, hair removal might also facilitate transmission as a result of the microtrauma it causes to the skin, the authors suggest.

They go on to speculate about the reasons for the popularity of pubic hair removal.

“The reasons for choosing genital hair removal remain unclear, but may be linked with internet based pornography … increased sexual sensation … an unconscious desire to simulate an infantile look … or a desire to distance ourselves from our animal nature,” they write.


Chances Are, You Have More Devices At Home Than Human Bodies

Michael Harper for redOrbit.com — Your Universe Online

It’s finally happened. The machines have taken over.

The number crunchers at the NPD Group now say the typical American household has more Internet-connected devices residing inside than people. While PCs remain a mainstay in American homes, NPD says this surge is led by tablet sales.

All told, there are an estimated half billion of these connected devices in US houses.

The Connected Home Report, conducted by the NPD Group, found the average household has 5.7 connected devices. Just three months ago, the NPD Group put that figure at 5.3 devices per household, growth the report attributes to tablets: there are now 18 million more tablets in use than there were three months ago.

As the number of smartphone users grows, so too does the number of these devices in the home. Last October, Strategy Analytics issued a report claiming one in every seven people across the world now owns a smartphone, putting the grand total just over one billion.

Smartphones have a strong showing in the US as well, and NPD now says the number of smartphones in US homes has increased by 9 million over the past three months.

The Connected Home Report also says the majority of these smartphones bear one of two brands: Apple and Samsung.

As far as tablets are concerned, NPD says Apple “continues to dominate” the tablet market, with Samsung coming in second. This claim is quite a bit different from numbers released by IDC just last week which found the gap between Android and iOS devices to be much narrower. IDC believes this gap will soon close and Android tablet makers such as Samsung will claim 48.8 percent of the global market, leaving Apple with 41 percent.

Yet, no matter how popular these tablets have become, NPD finds PCs remain a popular fixture in American homes.

“Even with this extraordinary growth in the smartphone and tablet market, PCs are still the most prevalent connected device in US Internet households, and this is a fact that won’t be changing any time soon,” says John Buffone, the director of NPD’s Connected Intelligence group, which conducted the research.

“However, when you look at the combined number of smartphones and tablets consumers own, for the first time ever it exceeded the installed base of computers.”

The Connected Home Report doesn’t give specifics about which devices are most often used to connect to the Internet, listing only devices which are capable and ready.

It’s an important distinction. While many American households still own a PC, it’s not known whether these PCs are in active use or whether they’ve been relegated to the back room of the house, used mainly to back up the newer and sleeker smartphones and tablets.

NPD estimates 93 percent of all US homes have a PC, a figure which has not seen the same dramatic shift over the past three months as smartphones and tablets.

Buffone says even though smartphone and tablet numbers have grown significantly over the past three months, he believes these numbers will only get higher.

“It’s hard to believe that tablets and smartphones are still somewhat in their infancy,” said Buffone. “But as we have seen in just the past few months, there is significant potential for this market to develop further.”

Boiled Greek Coffee Could Be The Secret To Longevity

redOrbit Staff & Wire Reports – Your Universe Online

What you fill your coffee cup with every morning could be the secret to enjoying a long life, according to new research appearing in an upcoming edition of the journal Vascular Medicine.

Gerasimos Siasos, a professor at the University of Athens Medical School, and colleagues set out to discover the secrets of the inhabitants of the Greek island Ikaria — home to some of the highest rates of longevity in the world. While only 0.1 percent of all Europeans live to be 90 years of age, the proportion of Ikaria residents who reach their 90s is ten times that.

While many researchers have studied the island’s elderly residents seeking Ikaria’s proverbial fountain of youth, Siasos set out to investigate whether the secret was the boiled Greek coffee that they regularly imbibe. He and his associates sought to determine what impact the island’s special blend of the popular caffeinated beverage had on the health of its populace, and specifically whether there was a correlation between it and the subjects’ endothelial function.

The endothelium is a thin layer of cells that lines the interior surface of blood vessels throughout a person’s circulatory system. It is often affected by aging and by specific lifestyle habits (including smoking), the researchers explained. Recent research has suggested that moderate coffee consumption on a regular basis could positively affect multiple aspects of endothelial health, as well as reduce a person’s risk of coronary heart disease.

For their study, Siasos and his colleagues randomly selected 71 men and 71 women who were over the age of 65 and had lived on Ikaria for their entire lives. Each was put through a battery of health checks in order to make sure that they did not suffer from diabetes, high blood pressure, or similar ailments. Their endothelial function was also tested, and all subjects completed a questionnaire to provide details about their medical backgrounds, their lifestyle information and their coffee-drinking habits.

More than 87 percent of those who responded to those questionnaires said that they drank boiled Greek coffee every day, and those individuals tended to have better endothelial function than those who consumed other varieties of coffee, the researchers discovered. Furthermore, even in patients who had high blood pressure, there was a correlation between drinking the boiled version of the caffeinated beverage and improved endothelial function.

“Boiled Greek type of coffee, which is rich in polyphenols and antioxidants and contains only a moderate amount of caffeine, seems to gather benefits compared to other coffee beverages,” Siasos said in a statement. While the study suggests that boiled Greek coffee could have cardiovascular health benefits, the researchers caution that additional studies will be required to determine the exact beneficial mechanisms of the beverage on a person’s wellbeing.