Dietary Supplement Use By Americans On The Rise

April Flowers for redOrbit.com – Your Universe Online

A new study from Ipsos Public Affairs for the Council for Responsible Nutrition (CRN) reveals that dietary supplement use by US adults is more prevalent than previous studies from the National Health and Nutrition Examination Surveys (NHANES) have indicated. Ipsos conducted five years of online market research to gather the data described in the Journal of the American College of Nutrition.

According to Annette Dickinson, Ph.D., a consultant for CRN, “This new review adds to the literature about usage patterns of dietary supplement users. The NHANES data is of course invaluable, but it only asks respondents about their dietary supplement usage over a 30-day period. The CRN/Ipsos data included regular, occasional and seasonal use throughout the year, which more realistically captures the full scope of dietary supplement utilization.”

Overall, the results revealed that supplement use, as reported by participants responding to CRN surveys in 2007-2011, ranged from 64 to 69 percent. The number of participants who reported “regular” use ranged from 48 to 53 percent, which was equivalent to the overall prevalence reported in NHANES. In the CRN surveys, regular users were asked whether they used a variety of products or only a multivitamin. Over the study period, the number of regular users who reported using a variety of supplements increased, while the number using only a multivitamin decreased. By the final year of the study, 2011, twice as many participants who identified themselves as regular supplement users said they used a variety of products as said they used only a multivitamin. The most common reasons given for using a variety of products were “overall health and wellness” and “to fill nutrient gaps in the diet.”

According to the findings of the CRN surveys, users of dietary supplements are more likely to adopt a variety of healthy habits than nonusers. This corresponds to previous research recently published in Nutrition Journal.

“What the data tells us,” said Judy Blatman, senior vice president of communications at CRN, “is that dietary supplement usage is a mainstream practice, and, contrary to some assertions, supplement users do not use these products as a license to slack off on eating right or exercising, but instead are health conscious individuals trying to do all the right things to be healthy. They are more likely than nonusers to try to eat a balanced diet, visit their doctor regularly, get a good night’s sleep, exercise regularly, and maintain a healthy weight.”

Dr. Dickinson observed, “The CRN data and NHANES data both indicate that half to two-thirds of American adults use dietary supplements and that their motivation comes from a desire to stay healthy. The evidence suggests that supplement use is viewed as one component of an overall wellness strategy.”

Early Fatherhood Associated With High Risk Of Depression In Men

Lawrence LeBlond for redOrbit.com – Your Universe Online
New research from Northwestern University has found that young fathers are especially at risk of depression, with those who become dads at around 25 years of age and who live in the same home as the child at greatest risk.
The new study, published Sunday in the journal Pediatrics, found that depressive symptoms increased on average by 68 percent over the first five years of fatherhood. It is the first study of its kind to identify when young fathers are at increased risk of depression.
The results of this longitudinal study are significant and could lead to more effective interventions and treatments for young men during their early fatherhood years, according to lead author of the study, Craig Garfield, MD, an associate professor in pediatrics and medical sciences at the university’s Feinberg School of Medicine.
“It’s not just new moms who need to be screened for depression, dads are at risk, too,” Garfield said in a statement. “Parental depression has a detrimental effect on kids, especially during those first key years of parent-infant attachment. We need to do a better job of helping young dads transition through that time period.”
Previous research had shown that depressed fathers used stricter punishments, read less and interacted less with their kids, and were more likely to neglect their children in the process. Children of depressed dads are also more likely to be at risk of poor language skills and have reading difficulties, as well as a rise in behavioral problems.
“We knew paternal depression existed and the detrimental effects it has on children, but we did not know where to focus our energy and our attention until this study,” Garfield said. “This is a wakeup call for anyone who knows a young man who has recently become a new father. Be aware of how he is doing during his transition into fatherhood. If he is feeling extreme anxiety or blues, or not able to enjoy things in life as he previously did, encourage him to get help.”
For the study, Garfield and colleagues collected data from 10,623 young men who were enrolled in the National Longitudinal Study of Adolescent Health. It includes a nationally representative sample of adolescents in the US and follows them for more than 20 years into adulthood. The participants were followed through a series of waves over the course of the study, with depression scores taken during each wave using a survey. In the most recent wave, the men were between the ages of 24 and 32, and 33 percent of them had become fathers.
The majority of these fathers lived in the same home as the child. Among the fathers who did not live in the same home as the child, however, no dramatic increase in depressive symptom scores was picked up during the early fatherhood years; in this subset, depressive symptom scores were elevated before fatherhood and actually decreased during early fatherhood. Residential fathers, in contrast, had depressive symptom scores that were lower before fatherhood and then increased dramatically after the birth of a child and into early fatherhood.
This study was supported by the National Institutes of Health.

Experienced Nurses Are Most Cost-effective

Columbia University Medical Center

Four-year study of VA hospitals also finds experienced nurses are most cost-effective

When it comes to the cost and quality of hospital care, nurse tenure and teamwork matter. Patients get the best care when they are treated in units that are staffed by nurses who have extensive experience in their current job, according to a study from researchers at Columbia University School of Nursing and Columbia Business School. The study was published in the current issue of the American Economic Journal: Applied Economics.

The review of more than 900,000 patient admissions over four years at hospitals in the Veterans Administration Healthcare System is the largest study of its kind to link nurse staffing to patient outcomes. The researchers analyzed payroll records for each nurse and medical records for each patient to see how changes in nurse staffing impacted the length of stay for patients. Because length of stay is increased by delays in delivery of appropriate care and errors in care delivery, a shorter length of stay indicates that the hospital provided better treatment. At the same time, a shorter length of stay also makes care more cost-effective. The study found that a one-year increase in the average tenure of RNs on a hospital unit was associated with a 1.3 percent decrease in length of stay.
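As a back-of-envelope illustration of the reported association, the short sketch below applies a 1.3 percent reduction per additional year of average RN tenure to a hypothetical unit. The 5.0-day baseline stay and the compounding across years are assumptions made for the sketch, not figures from the study.

```python
# Hypothetical unit with a 5.0-day average length of stay; the study's reported
# association is a 1.3% decrease per one-year increase in average RN tenure.
avg_stay_days = 5.0
reduction_per_tenure_year = 0.013

# Apply the per-year reduction cumulatively (an illustrative assumption).
for extra_years in range(4):
    stay = avg_stay_days * (1 - reduction_per_tenure_year) ** extra_years
    print(extra_years, round(stay, 3))
```

Even a seemingly small percentage compounds into a meaningful reduction in bed-days across hundreds of thousands of admissions.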

“Reducing length of stay is the holy grail of hospital management because it means patients are getting higher quality, more cost-effective care,” says senior study author Patricia Stone, PhD, RN, FAAN, Centennial Professor of Health Policy at Columbia Nursing. “When the same team of nurses works together over the years, the nurses develop a rhythm and routines that lead to more efficient care. Hospitals need to keep this in mind when making staffing decisions – disrupting the balance of a team can make quality go down and costs go up.”

While many hospitals rely on temporary staffing agencies at least some of the time to fill RN vacancies, the study found that it’s more cost-effective for hospitals to pay staff RNs overtime to work more hours on their unit. RNs working overtime resulted in shorter lengths of stay than hours worked by nurses hired from staffing agencies, the study found.

Nursing skill also mattered, the study found. Length of stay decreased more in response to staffing by RNs than by unlicensed assistive personnel. Furthermore, the study showed that length of stay increased when a team of RNs was disrupted by the absence of an experienced member or the addition of a new member.

“This rigorous econometric analysis of nurse staffing shows that hospital chief executives should be considering policies to retain the most experienced nurses and create a work environment that encourages nurses to remain on their current units,” says the senior economist on the study team, Ann Bartel, PhD, Merrill Lynch Professor of Workforce Transformation at Columbia Business School.

The researchers used the VA’s Personnel and Accounting Integrated Data for information on each nurse’s age, education, prior experience, VA hire date, start date at the current VA facility, and start date for the current unit at that facility. To assess patient outcomes, the researchers used the VA’s Patient Treatment File for information on each patient including dates of admission and discharge for each unit and for the overall hospitalization, as well as age and diagnoses. The final sample accounts for 90 percent of all acute care stays in the VA system for the fiscal years 2003 to 2006.

New Advances In HCC Diagnosis, Staging And Treatment All Predicted To Improve Patient Outcomes

Epidemiological, genetic and clinical data presented today at the International Liver Congress™ 2014 are collectively focused on different approaches designed to improve the diagnosis, staging and treatment of hepatocellular carcinoma (HCC).
“Human hepatocellular carcinoma is one of the most prevalent cancers worldwide and the second most frequent cause of cancer-related death,” said EASL’s Scientific Committee Member Dr Helen Reeves, Senior Lecturer & Honorary Consultant Gastroenterologist at Newcastle Hospitals NHS Foundation Trust, UK.
“Because HCC is such an extremely diverse and heterogeneous disease, improving patient outcomes has proved a difficult undertaking. A number of existing therapeutic options have been subjected to rigorous study but have not shown any patient benefit. The findings from these HCC diagnosis, staging and treatment studies are important because they have the potential to significantly improve patient outcomes,” Dr Reeves explained.
Key findings from the studies include:
the need for centrally-coordinated screening programs across Europe
the potential of gadoxetic acid-enhanced MRI to more accurately stage HCC patients with early disease to ensure each patient receives the optimum treatment
the development of a 3-gene signature blood test, which can be used as an alternative to imaging techniques to reliably identify early stage HCC in high-risk individuals
impressive long-term data reinforcing the importance of percutaneous RFA in the HCC treatment armamentarium, including its use in the treatment of advanced HCC where a single HCC is associated with thrombosis of the main portal vein
Wide geographical variation in HCC survival explained by differing intensity of country screening programs
In Japan, approximately 80% of hepatocellular carcinoma (HCC) cases are detected by screening. In marked contrast, the figures for the UK, Spain and Hong Kong were significantly lower, at 15%, 35% and less than 10% respectively.
There was also a dramatic difference in the stage of disease at diagnosis. In Japanese patients, 59% were within the Milan Criteria (a generally accepted set of parameters designed to assess the suitability of HCC patients for liver transplantation) and 71% were suitable for potentially curative treatment. Comparative figures were much lower for Spain at 26 and 32%, the UK at 37 and 38%, and Hong Kong at 8 and 16%, respectively.
Median HCC survival times for Japan, Spain, the UK and Hong Kong were 47, 26, 20 and 7 months, respectively.
“The wide geographical variation in survival among HCC patients had been attributed to intrinsic ethnic differences, different aetiologies, or disease stages at presentation,” explained Dr Reeves. “However, age, gender and Child-Pugh class distribution were all similar between the HCC patient populations from each of these four countries. Statistical analysis showed that aetiology had little impact on survival,” she said.
“It would appear that the marked difference in the intensity of screening programs between different countries, and the consequent variation in curative therapeutic options goes a long way to explaining the wide geographical variation in HCC survival,” said Dr Reeves. “What we urgently need are centrally-coordinated screening programs across Europe to improve outcomes,” she added.
In this study, more than 5,000 patients were recruited from two high-incidence areas, Japan (n=2599; predominantly HCV) and Hong Kong (n=1112; predominantly HBV), from a medium-incidence area, Spain (n=834; predominantly HCV & alcohol), and from the UK (n=724; multiple aetiologies). Comprehensive demographic, aetiological and staging data along with treatment details were made available.
Staging HCC with gadoxetic acid-enhanced MRI improves treatment outcomes in patients with early disease
Additional staging of hepatocellular carcinoma (HCC) patients using gadoxetic acid-enhanced MRI has been shown to be associated with lower recurrence and better survival in patients presumed to have a single nodular HCC on the basis of a dynamic CT scan. These were the exciting findings of a study presented today at the International Liver Congress™ 2014.
Using multivariable analysis, the group of patients who were additionally evaluated with gadoxetic acid-MRI (CT+MR group) was shown to be associated with a significantly lower risk of HCC recurrence (hazard ratio [HR] 0.72, P=0.02) and overall mortality (HR 0.67, P=0.04) compared with the CT alone group.
According to Dr Reeves, “Early recurrence of HCC after curative treatment is frequent and thought to represent a metastasis from the primary tumour that was actually present before treatment was started. Using gadoxetic acid-enhanced MRI to more accurately stage HCC patients with early disease has the potential to significantly improve outcomes by ensuring each patient receives the optimum treatment,” she explained.
Gadoxetic acid is a contrast agent for MRI that has combined perfusion and hepatocyte-specific properties. This technique has shown higher detection sensitivity for HCC compared to dynamic CT or MRI.
In this historical cohort study, a total of 700 consecutive patients presumed to have a single nodular HCC by dynamic CT scan were analyzed. Out of this patient population with early disease, 323 were additionally evaluated with gadoxetic acid-MRI (CT+MR group); 377 were not (CT group).
Results of the initial CT scanning using the Barcelona Clinic Liver Cancer (BCLC) staging system had identified 243 (34.7%) patients at a very early stage (0) with a single lesion <2 cm; and 457 (65.3%) at an early stage A with a single lesion <5 cm, or three lesions <3 cm. There was no statistical difference in the numbers of patients at stages 0 and A between the CT+MR and CT groups.
After evaluation of those HCC patients in the CT+MR group with gadoxetic acid-MRI, a total of 74 HCC nodules were additionally detected in 53 (16.4%) patients, escalating the BCLC staging in 34 (10.5%) of the patients.
In the 298 propensity score-matched pairs (a statistical matching technique that attempts to estimate the effect of treatment by accounting for the covariates that predict receiving the treatment), the CT+MR group was again associated with a significantly lower risk of HCC recurrence (HR 0.74, P=0.047) and mortality (HR 0.61, P=0.02).
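Propensity-score matching, as used above, can be sketched on synthetic data. In the sketch below, a single covariate (age) influences who receives the extra MRI; a logistic model estimates each patient's probability of receiving it, and each treated patient is greedily paired with the untreated patient whose estimated propensity is closest. All names, sample sizes and the model are illustrative assumptions, not the study's actual data or method.

```python
# Illustrative 1:1 propensity-score matching on synthetic data (not the study's cohort).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
age = rng.normal(60, 10, n)  # hypothetical covariate driving who gets the extra MRI

# In observational data, treatment assignment depends on covariates:
p_treat = 1 / (1 + np.exp(-(age - 60) / 10))
treated = (rng.random(n) < p_treat).astype(int)

# Step 1: estimate each patient's propensity score (probability of treatment).
X = age.reshape(-1, 1)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: greedy nearest-neighbour matching on the propensity score.
treated_idx = np.where(treated == 1)[0]
control_pool = list(np.where(treated == 0)[0])
pairs = []
for t in treated_idx:
    if not control_pool:
        break
    best = min(control_pool, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, best))
    control_pool.remove(best)  # each control is used at most once

print(len(pairs), "matched pairs")
```

After matching, the two groups have similar covariate distributions, so outcome differences between them are less likely to be driven by who was selected for the extra scan.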
Novel gene signature able to predict development of HCC in high-risk individuals
A 3-gene signature that can be identified from analysis of a blood sample was able to reliably identify hepatocellular carcinoma (HCC) with a high degree of sensitivity and specificity, according to the results of a study presented today at the International Liver Congress™ 2014.
Comprehensive gene expression profiling of purified RNA from peripheral blood mononuclear cells taken from patients with chronic hepatitis B (CHB) and cirrhosis has identified 3 genes namely AREG, TNFAIP3 and GIMAP5 that were differentially expressed.
Subsequent studies on an independent cohort of 206 HCC patients and 194 patients with CHB and cirrhosis validated that these 3 genes were able to identify HCC with an accuracy of 82.5%, 81.5% and 71.8%, respectively.
Using multivariate logistic regression, these 3 genes in combination accurately predicted the development of HCC, with an AUC of 0.929 (95% CI, 0.90-0.95), and yielded a sensitivity of 82% and a specificity of 90.2%.
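The reported figures combine three standard diagnostic-accuracy measures. As a minimal sketch of how they are computed, the snippet below evaluates a made-up set of combined scores for cases and controls; the numbers are invented for illustration and are not the study's data.

```python
# AUC, sensitivity and specificity for a diagnostic score, from first principles.

def auc(scores_pos, scores_neg):
    """Probability a random case outscores a random control (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def sensitivity_specificity(scores_pos, scores_neg, threshold):
    tp = sum(s >= threshold for s in scores_pos)  # cases correctly flagged
    tn = sum(s < threshold for s in scores_neg)   # controls correctly cleared
    return tp / len(scores_pos), tn / len(scores_neg)

cases    = [0.9, 0.8, 0.85, 0.7, 0.4]   # hypothetical combined 3-gene scores, HCC patients
controls = [0.2, 0.3, 0.1, 0.55, 0.35]  # hypothetical scores, CHB/cirrhosis controls

print(auc(cases, controls))
print(sensitivity_specificity(cases, controls, 0.6))
```

An AUC of 0.929, as reported, means that in roughly 93 percent of case-control pairs the combined signature scores the HCC patient higher than the control.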
“Timely and effective diagnosis of HCC is critically important because surgical resection in early disease remains the only cure,” said Dr Reeves. “Hence, the identification of early stages of disease in high-risk individuals for the development of HCC would greatly improve clinical outcomes.”
“However, as it is already possible to detect early stage human HCC in high-risk patients with chronic hepatitis and cirrhosis using imaging techniques, this 3-gene signature would only be a true innovation if it was able to actually predict the development of HCC,” Dr Reeves commented.
In this context, this signature is being validated in an ongoing prospective study.
Radiofrequency ablation effective in treating both small HCC and advanced disease
Percutaneous radiofrequency ablation (RFA) is an effective and safe treatment for small hepatocellular carcinoma (HCC). The technique provides excellent overall survival and time-to-progression rates, according to the results of a 10-year follow-up study in a population of Chinese patients presented today at the International Liver Congress™ 2014.
“The development of local ablative therapy has been one of the major advances in the treatment of HCC,” said Dr Reeves. “Percutaneous RFA, performed under radiological guidance, is a minimally invasive, repeatable procedure with few complications.”
“In the treatment of hepatocellular carcinoma (HCC), where less than 40% of patients are candidates for surgery, and the rate of recurrence after curative surgery is high, percutaneous techniques like RFA are very important.”
“These impressive long-term data reinforce the importance of percutaneous RFA in the HCC treatment armamentarium,” Dr. Reeves said.
Between May 2000 and May 2012, a total of 1020 small tumour nodules in 837 patients were treated with percutaneous RFA. Complete ablation was achieved in 98.8% (1008/1020) with major complications occurring in only 0.59% (5/837) of patients. The estimated overall 1-, 3-, 5-, 10-year survival rates were 91%, 71%, 54%, and 33%, respectively. The 1-, 3-, 5-, and 10-year recurrence-free survival rates were 74%, 44%, 30% and 15%, respectively.
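K-year survival rates like those above are conventionally estimated with the Kaplan-Meier method, which accounts for patients whose follow-up ends without an event (censoring). Below is a minimal sketch of the estimator on ten hypothetical patients; the data are invented and do not reflect the study's cohort.

```python
# Minimal Kaplan-Meier estimator (synthetic follow-up data, for illustration only).

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = died, 0 = censored.
    Returns the survival curve as (time, S(t)) steps at each death time."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        if deaths:
            # Multiply in the conditional probability of surviving past time t.
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        # Deaths and censored patients both leave the risk set.
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Ten hypothetical patients: (follow-up years, event flag)
data = [(1, 1), (2, 0), (3, 1), (4, 1), (5, 0),
        (6, 1), (7, 0), (8, 1), (9, 0), (10, 0)]
times, events = zip(*data)
for t, s in kaplan_meier(list(times), list(events)):
    print(f"S({t}) = {s:.3f}")
```

Reading a k-year survival rate off the curve simply means taking S(t) at the last death time at or before year k.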
Dr Reeves then went on to report the results from a second 7-year follow-up study evaluating the efficacy of percutaneous RFA in HCC, but this time looking specifically at patients with advanced disease.
In this patient population, in whom there was accompanying main portal vein tumour thrombus (MPVTT) and compensated liver cirrhosis, percutaneous RFA significantly prolonged long-term survival compared with no treatment.
The one-, three-, five- and seven-year cumulative survival rates of treated patients were 62%, 29%, 18% and 5%, respectively, compared to a 12-month cumulative survival rate of 0% in untreated patients (p < 0.001). The disease-free survival rates in the treated group were 52%, 38%, 35% and 23% at one, three, five and seven years, respectively.
From January 2005 to January 2012, among 3144 consecutive cirrhosis patients, 772 had HCC with MPVTT; of these, 70 patients had a single HCC with MPVTT and were therefore eligible for percutaneous RFA. A total of 48 out of these 70 patients (38 men; mean age 69 years) with 48 HCC nodules 3.7-5.0 cm in diameter invading the main portal trunk (MPT) underwent percutaneous RFA. The remaining 22 matched patients (18 men; mean age 69 years) with 22 HCC nodules 3.6-4.8 cm in diameter extending into the MPT refused RFA and therefore made up the control group.
Efficacy of RFA was defined as complete when both complete necrosis of the HCC and complete re-canalisation of the MPT and its branches were achieved.
“Based on these long-term efficacy data, percutaneous RFA could be considered an effective tool in the treatment of advanced HCC where a single HCC is associated with thrombosis of the main portal vein but these data do need confirmation in a prospective randomized trial,” Dr Reeves concluded.

Silly Putty Ingredient Could Help Stem Cells Become Motor Neurons

redOrbit Staff & Wire Reports – Your Universe Online

An ingredient found in Silly Putty could help scientists more efficiently turn human embryonic stem cells into fully functional specialized cells, according to research published online Sunday in the journal Nature Materials.

In the study, researchers from the University of Michigan report how they were able to coax stem cells to turn into working spinal cord cells by growing them on a soft, extremely fine carpet in which the threads were created from polydimethylsiloxane, one component of the popular children’s toy.

According to the authors, the paper is the first to directly link physical signals to human embryonic stem cell differentiation, which is the process by which source cells morph into one of the body’s 200-plus other types of cells that go on to become muscles, bones, nerves or organs.

Furthermore, their research increases the possibility that scientists will be able to uncover a more efficient way to guide differentiation in stem cells, potentially resulting in new treatment options for Alzheimer’s disease, ALS, Huntington’s disease or similar conditions, assistant professor of mechanical engineering Jianping Fu and his colleagues explained in a statement.

“This is extremely exciting,” said Fu. “To realize promising clinical applications of human embryonic stem cells, we need a better culture system that can reliably produce more target cells that function well. Our approach is a big step in that direction, by using synthetic microengineered surfaces to control mechanical environmental signals.”

He and his University of Michigan colleagues designed a specially engineered growth system in which polydimethylsiloxane served as the threads, and they discovered that by varying the height of the posts, they were able to alter the stiffness of the surface upon which the cells were grown.

Shorter posts were more rigid, while the taller ones were softer. On the taller ones, the stem cells that were grown morphed into nerve cells more often and more quickly than they did on the shorter ones. After a period of three weeks and two days, colonies of spinal cord cells that grew on the softer micropost carpets were four times more pure and 10 times larger than those growing on rigid ones, the study authors noted.

Eva Feldman, a professor of neurology at the university, believes that both embryonic and adult-based stem cell therapies have the potential to help patients grow new nerve cells. She studies ALS, a condition also known as Lou Gehrig’s Disease that kills motor neurons in the brain and spinal cord, and is using the technique to create fresh neurons from a patient’s own cells.

“Professor Fu and colleagues have developed an innovative method of generating high-yield and high-purity motor neurons from stem cells,” Feldman explained. “For ALS, discoveries like this provide tools for modeling disease in the laboratory and for developing cell-replacement therapies.”

“Fu’s findings go deeper than cell counts,” the university added. “The researchers verified that the new motor neurons they obtained on soft micropost carpets showed electrical behaviors comparable to those of neurons in the human body. They also identified a signaling pathway involved in regulating the mechanically sensitive behaviors.”

The specific signaling pathway, or route by which proteins carry chemical signals from the borders of a cell to its interior, that the study authors are analyzing is known as Hippo/YAP. This pathway is also involved in controlling the size of organs, as well as alternately preventing and causing tumors to grow.

Newly Discovered Particle Could Be The First Ever Confirmed Tetraquark

redOrbit Staff & Wire Reports – Your Universe Online

The Large Hadron Collider (LHC) has already played an essential role in the discovery of the so-called God particle, and now the world’s largest particle collider may have helped scientists discover a new form of matter known as the tetraquark.

According to Maggie McKee of New Scientist, researchers conducting experiments at the LHC have confirmed the existence of a particle known as Z(4430). While physicists had previously theorized that this particle exists, it had never been observed before – and now that it has, it could be the strongest evidence yet that the tetraquark exists.

Quarks, which are subatomic particles that serve as the fundamental building blocks of matter, are known to exist in either pairs or groups of three. In pairs, they form short-lived mesons, while in groups of three, they form the protons and neutrons that comprise the nucleus of an atom, explained Mashable’s Jason Abbruzzese.

Scientists have suspected for decades that four quarks could also bind together, forming a tetraquark. However, they had previously been unable to complete the complex quantum calculations required to test and verify that belief.

The newly discovered particle, Z(4430), is believed to be one of these tetraquarks, and in the recent LHC experiments, up to 4,000 of the particles were detected. Now that its existence has been confirmed, physicists will attempt to determine whether it truly is a tetraquark, a configuration that itself remains hypothetical.

Physicists have at least one reason to be hopeful, McKee said. While other suspected tetraquarks could be nothing more than loosely-bound pairs of mesons, Tel Aviv University physics professor Marek Karliner explained that Z(4430) is different because of its mass. “There aren’t any mesons at the right masses to make such a thing,” he explained, suggesting that it is an actual particle quartet.

There is one mystery remaining, Karliner (who was not involved in the research) told the New Scientist reporter. The decay rate of Z(4430) is at least 10 times faster than previous tetraquark suspects, which does not mesh with models of the particle group’s behavior. Additional data on how this particle decays could reveal whether or not it is truly a tetraquark, and potentially lead to new insights into how matter behaves at the most basic scales.

The LHC team’s findings are currently available online at arXiv.org.

Fruit Flies Use Not Just Eyes, But Antennae To Control Air Speed

Brett Smith for redOrbit.com – Your Universe Online

Geneticists may know the fruit fly genus Drosophila as go-to organisms for their research, but a new study focused instead on how these insects go into a sort of “cruise control” while in flight.

Using bursts of air and sophisticated software, the new study revealed that fruit flies use a combination of vision and their wind-sensitive antennae to maintain a constant flight speed relative to the ground. The new study, published in Proceedings of the National Academy of Sciences, is based on previous research from the 1980s, according to study author Sawyer Fuller.

“In the old study, the researchers simulated natural wind for flies in a wind tunnel and found that flies maintain the same groundspeed—even in a steady wind,” said Fuller, a post-doctoral bioengineering researcher currently at MIT.

For the new study, researchers said they wanted to use potent blasts of air instead of the subtle breezes used in the earlier experiment. The quick gusts, which moved down the tunnel at the speed of sound, were designed to see how the fly deals with rapidly changing winds.

The researchers saw their flies act in a counter-intuitive manner – speeding up when the wind was coming from behind and slowing down into a headwind. In each case the flies gradually recovered to reestablish their original groundspeed, but the first response was a bit mysterious, according to the researchers.

“This response was basically the opposite of what the fly would need to do to maintain a consistent groundspeed in the wind,” Fuller said.

The scientists said they presumed that flies, like people and most other animals, used their eyesight to gauge their speed in wind, speeding up and slowing their flight based on what they saw. However, the researchers were also wondering about the in-flight purpose of the fly’s wind-sensing antennae.

To find their answer, they delivered powerful gusts of wind to both normal flies and flies without antennae. The flies devoid of antennae still sped up in response, but only half as much as the flies whose antennae remained intact. Furthermore, the flies without antennae were not able to keep a constant speed, drastically switching between acceleration and deceleration. The findings indicated that the antennae were providing wind data that was essential for speed control, the researchers said.

To determine the role of eyesight, the study team projected visual images to trick the flies into thinking there was no increase in wind speed, despite what the antennae were sensing. When the researchers delivered strong winds to flies in this situation, the insects slowed and were unable to recover to their original speed.

“We know that vision is important for flying insects, and we know that flies have one of the fastest visual systems on the planet,” said study author Michael Dickinson, a bioengineering professor at Caltech. “But this response showed us that as fast as their vision is, if they’re flying too fast or the wind is blowing them around too quickly, their visual system reaches its limit and the world starts getting blurry.”

“A challenge here is that vision typically takes a lot of computation to get right, just like in flies, but it’s impossible to carry a powerful processor to do that quickly on a tiny robot,” he added. “So they’ll instead carry tiny cameras and do the visual processing on a tiny processor, but it will just take longer. Our results suggest that little flying vehicles would also do well to have fast wind sensors to compensate for this delay.”

Doctors Implant Lab-Grown Vaginas Into Women With Rare Disease

[ Watch the Video: Behind The Research To Engineer Human Vaginal Organs ]
Brett Smith for redOrbit.com – Your Universe Online
Mayer-Rokitansky-Küster-Hauser (MRKH) syndrome is a rare genetic disorder in which a female’s vagina and uterus are underdeveloped or missing. A new technique has now allowed doctors to implant laboratory-grown vaginas into teenage girls, bringing some sense of normalcy into their lives, according to a new report in The Lancet journal.
The laboratory-grown organs were generated using the patients’ own cells – eliminating the complications associated with organ transplants. The team that conducted the procedure said their method could also be used to treat patients with vaginal cancer or injuries.
“This pilot study is the first to demonstrate that vaginal organs can be constructed in the lab and used successfully in humans,” said Dr. Anthony Atala, director of Wake Forest Baptist Medical Center’s Institute for Regenerative Medicine. “This may represent a new option for patients who require vaginal reconstructive surgeries. In addition, this study is one more example of how regenerative medicine strategies can be applied to a variety of tissues and organs.”
Operations described in the study were performed between June 2005 and October 2008 on girls who were between 13 and 18 years old at the time. Follow-up analyses revealed that the organs were functioning normally up to eight years after the surgeries.
“Tissue biopsies, MRI scans and internal exams using magnification all showed that the engineered vaginas were similar in makeup and function to native tissue,” said study author Atlantida Raya-Rivera, director of the HIMFG Tissue Engineering Laboratory at the Metropolitan Autonomous University in Mexico City.
The artificial organs were made using muscle and epithelial tissue derived from a small biopsy of each patient’s external genital area. Cells were extracted from these tissue samples, expanded in culture and then seeded onto a biodegradable scaffold shaped to resemble a vagina. The scaffolds were customized to fit each patient individually.
Next, the specialists created a space in the patient’s pelvis and attached the scaffold to internal structures. Once implanted in the body, the cell-seeded scaffold develops surrounding nerves and blood vessels as the cells multiply and form tissue. Concurrently, the scaffolding material is absorbed by the body while the cells lay down the matrix of a permanent replacement structure.
A follow-up analysis revealed the border between native tissue and the engineered segments was impossible to differentiate and the scaffold had developed into tri-layer vaginal tissue, the researchers said.
Conventional treatments for MRKH include dilation of pre-existing tissue or reconstructive surgery to create new vaginal tissue. However, these reconstructions often lack a normal muscle layer, which can lead to narrowing or contracting of the vagina. The scientists noted that with these treatments the complication rate can be as high as 75 percent in pediatric patients, with vaginal narrowing requiring repeated dilation being the most common complication.
The newly-described work builds on previous research by the same team on lab-generated vaginas in mice and rabbits. In those studies, the researchers demonstrated the importance of seeding cells onto the scaffolds. The team used a comparable procedure for replacement bladders that were implanted in nine children starting in 1998.
The researchers said they plan to determine the overall effectiveness of their newly-developed procedure.

Many Common Household Products Contain DNA-Damaging Nanoparticles: Study

redOrbit Staff & Wire Reports – Your Universe Online

Some nanoparticles commonly added to thousands of consumer products can significantly damage DNA, according to a new study by researchers at MIT and the Harvard School of Public Health (HSPH).

These products, which include cosmetics, sunscreens, clothing and other common items, contain nanoparticles added by manufacturers to, among other things, improve texture, kill microbes, or enhance shelf life.

But the current study suggests these tiny particles can be toxic to cells.

For instance, the researchers found that zinc oxide nanoparticles, often used in sunscreen to block ultraviolet rays, significantly damage DNA. Nanoscale silver, which has been added to toys, toothpaste, clothing, and other products for its antimicrobial properties, also produces extensive DNA damage.

The US Food and Drug Administration (FDA) does not require manufacturers to test nanoscale additives for a given material if the bulk material has already been shown to be safe. However, there is evidence that the nanoparticle form of some of these materials may not be safe due to their extremely small size and differences in physical, chemical, and biological properties. They can also penetrate cells more easily.

“The problem is that if a nanoparticle is made out of something that’s deemed a safe material, it’s typically considered safe. There are people out there who are concerned, but it’s a tough battle because once these things go into production, it’s very hard to undo,” said Bevin Engelward, professor of biological engineering at MIT and lead researcher of the current study.

Engelward and associate professor Philip Demokritou, director of HSPH’s Center for Nanotechnology and Nanotoxicology, used a high-speed screening technology to analyze the DNA damage caused by nanoparticles, allowing them to study the potential hazards at a much faster rate and larger scale than previously possible.

The researchers focused on five types of engineered nanoparticles — silver, zinc oxide, iron oxide, cerium oxide, and silicon dioxide (also known as amorphous silica) — that are used industrially. Some of these nanomaterials can produce free radicals called reactive oxygen species, which can alter DNA. Furthermore, once these particles get into the body, they may accumulate in tissues, causing even more damage.

“It’s essential to monitor and evaluate the toxicity or the hazards that these materials may possess. There are so many variations of these materials, in different sizes and shapes, and they’re being incorporated into so many products,” said Christa Watson, a postdoc at HSPH and the lead author of a paper about the study published in the journal ACS Nano.

“This toxicological screening platform gives us a standardized method to assess the engineered nanomaterials that are being developed and used at present,” she said.

The researchers said they hope the screening technology could also be used to help design safer forms of nanoparticles, and are already working with commercial partners to create safer UV-blocking nanoparticles.

Until now, most studies of nanoparticle toxicity have focused on cell survival after exposure, while few have examined genotoxicity – the ability to damage DNA. Although genotoxicity may not necessarily kill a cell, it can lead to cancerous mutations if the DNA damage is not repaired.

A common way to study DNA damage in cells is something called a “comet assay,” named for the comet-shaped smear that damaged DNA forms during the test. The procedure is based on gel electrophoresis, a test in which an electric field is applied to DNA placed in a matrix, forcing the DNA to move across the gel. During electrophoresis, damaged DNA travels farther than undamaged DNA, producing a comet-tail shape. Measuring how far the DNA can travel reveals how much DNA damage has taken place.
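To make the measurement concrete, here is a minimal Python sketch of one standard comet-assay metric, “percent tail DNA,” computed from a one-dimensional fluorescence intensity profile along the comet axis. The function and the example intensity values are illustrative assumptions, not the software or data used in the study:

```python
# Toy comet-assay scoring: given a 1-D DNA fluorescence intensity
# profile along the direction of electrophoresis, compute the
# fraction of total DNA signal that migrated beyond the comet "head".

def percent_tail_dna(profile, head_end):
    """profile: intensity values along the comet axis.
    head_end: index where the head region ends and the tail begins."""
    total = sum(profile)
    if total == 0:
        return 0.0
    tail = sum(profile[head_end:])
    return 100.0 * tail / total

# Invented example: most signal in the head (undamaged DNA), with
# some signal trailing into the tail (damaged, faster-migrating DNA).
profile = [50, 80, 120, 90, 30, 20, 10, 5]
print(round(percent_tail_dna(profile, head_end=4), 1))  # prints 16.0
```

More damaged cells would show a larger share of intensity past the head, so this single number lets many comets be compared automatically.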

However, this procedure, while highly sensitive, is very tedious. In 2010, Engelward and MIT professor Sangeeta Bhatia developed a much more rapid version of the comet assay.  Dubbed the CometChip, it uses microfabrication technology that allows single cells to be trapped in tiny microwells within the matrix, enabling as many as 1,000 samples to be processed in the time it formerly took to process just 30 samples. This allows researchers to test dozens of experimental conditions at a time, which can be analyzed using imaging software.

In the current study, the researchers used the CometChip to test the nanoparticles’ effects on two types of cells that are commonly used for toxicity studies – lymphoblastoids, a type of human blood cell, and an immortalized line of Chinese hamster ovary cells.

The results showed that zinc oxide and silver produced the greatest DNA damage in both cell lines. At a concentration of 10 micrograms per milliliter — a dose not high enough to kill all of the cells — these generated a large number of single-stranded DNA breaks, the study found.

Silicon dioxide, which is commonly added during food and drug production, generated very low levels of DNA damage, while iron oxide and cerium oxide also showed low genotoxicity.

The researchers said additional studies are needed to determine how much exposure to metal oxide nanoparticles is safe for humans.

“The biggest challenge we have as people concerned with exposure biology is deciding when is something dangerous and when is it not, based on the dose level. At low levels, probably these things are fine,” Engelward said. “The question is: At what level does it become problematic, and how long will it take for us to notice?”

One of the greatest areas of concern is nanoparticle exposure among children and fetuses, who may be more vulnerable to DNA damage because their cells divide more frequently than adult cells do. Occupational exposure to nanoparticles is also an important area of concern.

The researchers said the most common routes that engineered nanoparticles follow into the body are through the skin, lungs, and stomach. They are currently investigating nanoparticle genotoxicity on those types of cells, along with studying the effects of other engineered nanoparticles.

NASA Signs Agreement with German, Canadian Partners to Test Alternative Fuels

WASHINGTON, April 10, 2014 /PRNewswire-USNewswire/ — NASA has signed separate agreements with the German Aerospace Center (DLR) and the National Research Council of Canada (NRC) to conduct a series of joint flight tests to study the atmospheric effects of emissions from jet engines burning alternative fuels.
The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS II) flights are set to begin May 7 and will be flown from NASA’s Armstrong Flight Research Center in Edwards, Calif.
“Partnering with our German and Canadian colleagues allows us to combine our expertise and resources as we work together to solve the challenges common to the global aviation community such as understanding emission characteristics from the use of alternative fuels which presents a great potential for significant reductions in harmful emissions,” said Jaiwon Shin, NASA’s associate administrator for aeronautics research.
NASA’s DC-8 and HU-25C Guardian, DLR’s Falcon 20-E5, and NRC’s CT-133 research aircraft will conduct flight tests in which the DC-8’s engines will burn a mix of different fuel blends, while the Falcon and CT-133 measure emissions and observe contrail formation.
“Cooperation between DLR and NASA is based on a strong mutual appreciation of our research work,” said Rolf Henke, the DLR Executive Board member responsible for aeronautics research. “We are very pleased to be performing joint test flights for the first time, and thus set an example by addressing pressing research questions in global aviation together.”
ACCESS II is the latest in a series of ground and flight tests begun in 2009 to study emissions and contrail formation from new blends of aviation fuels that include biofuel from renewable sources. ACCESS-I testing, conducted in 2013, indicated the biofuel blends tested may substantially reduce emissions of black carbon, sulfates, and organics. ACCESS II will gather additional data, with an emphasis on studying contrail formation.
Understanding the impacts of alternative fuel use in aviation could enable widespread use of one or more substitutes to fossil fuels as these new fuels become more readily available and cost competitive with conventional jet fuels.
Within NASA, ACCESS II is a multi-center project involving researchers at Armstrong, NASA’s Langley Research Center in Hampton, Va., and the agency’s Glenn Research Center in Cleveland. This research supports the strategic vision of NASA’s Aeronautics Research Mission Directorate, part of which is to enable the transition of the aviation industry to alternative fuels and low-carbon propulsion systems.
As part of an international team involved in this research, NASA will share its findings with the 24 member nations that make up the International Forum for Aviation Research (IFAR). DLR and NRC are participating members of IFAR and NASA is the current Chair.
For more information about aeronautics research at NASA, visit:
http://www.aeronautics.nasa.gov

New Worlds For Seafloor Animals Created Using Sunken Logs

MBARI
When it comes to food, most of the deep sea is a desert. Many seafloor animals feed on marine snow—the organic remnants of algae and animals that live in the sunlit surface waters, far above. However, marine snow only falls as a light dusting and doesn’t have much nutritional value. Thus, any other sources of food that reach the deep sea provide a temporary feast. Even bits of dead wood, waterlogged enough to sink, can support thriving communities of specialized animals. A new paper by biologists at the National Evolutionary Synthesis Center and the Monterey Bay Aquarium Research Institute (MBARI) shows that wood-boring clams serve as “ecosystem engineers,” making the organic matter in the wood available to other animals that colonize wood falls in the deep waters of Monterey Canyon.
In 2006, marine biologists Craig McClain and Jim Barry used MBARI’s remotely operated vehicle Tiburon to place 36 bundles of acacia wood on the canyon floor, 3,200 meters below the sea surface. Five years later, they retrieved the bundles and McClain painstakingly picked out every animal that had colonized the logs.
McClain performed a variety of statistical analyses on the different types of animals found in each of the bundles, looking for differences and similarities. The results were surprising. Even though all of the bundles contained exactly the same type of wood and were placed on the seafloor at the same time, within a few tens of meters of each other, there were huge differences in the numbers and types of animals from one bundle to another.
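One common way to summarize such differences in the “numbers and types of animals” is a diversity index. The sketch below is a hypothetical illustration with invented animal counts, not the authors’ actual data or statistical methods; it contrasts a heavily colonized bundle with a sparse one using the Shannon index:

```python
# Toy comparison of two wood-fall bundles using the Shannon diversity
# index, which rises with both the number of taxa and the evenness of
# their abundances. All counts below are invented for illustration.
import math

def shannon_index(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total)
                for c in counts if c > 0)

heavily_colonized = [40, 25, 15, 10, 5, 5]  # six taxa, fairly even
sparsely_colonized = [3, 1]                 # two taxa, few animals

print(round(shannon_index(heavily_colonized), 2))   # prints 1.53
print(round(shannon_index(sparsely_colonized), 2))  # prints 0.56
```

Computed per bundle, an index like this turns each community into a single number that can then be tested for differences across bundles.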
Some of the bundles had been heavily colonized. The seafloor around and under these bundles was carpeted with tiny wood chips and fecal pellets from wood-boring clams, as well as bacteria and fungi that help decompose this organic matter. Other wood bundles had few, if any, animals living in them, and lacked these “halos” of discolored sediment.
McClain’s statistical analysis showed that as the wood got older and more heavily colonized by wood-boring clams (Xylophaga zierenbergi), the types and abundances of other animals also changed. Boring clams are one of the few marine animals that can digest the cellulose in wood. Like termites, these clams have specialized symbiotic bacteria in their bodies that help them digest cellulose. As McClain put it, “Without these wood-boring bivalves, the carbon energy in the wood would not be available to other species.”
McClain calls these clams “ecosystem engineers.” He explains, “Like oysters, beavers, and termites, these boring clams alter the landscape and provide new habitat for other species.”
As they eat their way through a sunken log, boring clams create lots of small holes in which other animals can hide. Their wood chips and feces provide food for a variety of smaller animals. And the clams themselves provide food for specialized predators.
The details of these interactions are still poorly known. For example, after boring clams, the most common animals McClain found in his bundles were tiny, shrimp-like crustaceans known as tanaids. The authors speculate that these tanaids may prey directly on the boring clams. Alternatively, the tanaids may consume the fecal pellets of boring clams and the bacteria associated with them (any organic matter is precious when you’re three kilometers below the ocean surface).
The statistical analyses also showed that large logs supported more diverse communities of animals than small logs. The researchers speculated that larger logs may provide food for a longer time period, which attracts more secondary colonizers, such as deep-sea snails. McClain explained, “Snails, contrary to what you might expect, are metabolically expensive. They may need a lot of energy, such as that found in a large wood fall, to support a viable population.”
One of the most surprising discoveries was that, even after five years, some bundles contained only a few small, young boring clams, and other bundles contained no clams at all. The researchers could find no obvious reasons why the clams would colonize some bundles but not others.
McClain said, “The part that really excites and puzzles me is that not all the wood falls were colonized at the same time. At the end of the experiment some wood was not bored or bored very little. This didn’t reflect the placement, size, cut, or surface area of the logs. Basically it appears that the recruitment of larval clams into the wood fall was a nearly random process, even for logs that were just a few meters from one another.”
The rate at which a wood fall is colonized makes a big difference to deep-sea organisms living nearby. The faster a piece of wood is colonized, the more rapidly its organic matter is made available to non-wood-eating animals. Thus, hard woods, or woods containing clam-repellent compounds, might last much longer than soft woods, and might provide less food for animals on the nearby seafloor.
Whatever the reason, these differences in colonization from one wood fall to another created a wide variety of habitats and animal communities over a relatively small patch of seafloor. Thus, this study adds to the growing body of evidence that, although the deep seafloor may look flat and uniform, its animal communities often vary considerably over both space and time.

Liberia Is Home To The Second Largest Chimpanzee Population In West Africa

Max Planck Institute

When Liberia enters the news it is usually in the context of civil war, economic crisis, poverty or a disease outbreak such as the recent emergence of Ebola in West Africa. Liberia’s status as a biodiversity hotspot, and the fact that it is home to some of the last viable and threatened wildlife populations in West Africa, has received little media attention in the past. This is partly because the many years of violent conflict in Liberia, from 1989 to 1997 and from 2002 to 2003, thwarted the efforts of biologists to conduct biological surveys. An international research team, including scientists of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, has now counted chimpanzees and other large mammals living in Liberia. The census revealed that the country is home to about 7,000 chimpanzees and therefore to the second largest population of the Western subspecies of chimpanzee. As Liberia has opened up large areas of forest for logging, local decision-makers can now use the results of this study to protect the chimpanzees more effectively.

Following the complete war-time collapse of the country’s economy, Liberia’s government has been trying to fuel economic growth by selling large amounts of its rich natural resources, including rubber, timber and minerals. Here, accurate biological datasets on the distribution and abundance of wildlife populations are key for making evidence-based management decisions that balance economic and conservation priorities. In addition, they are important for locating and delineating conservation priority areas, making assessments of anthropogenic threats, and proposing mitigation measures to policy-makers.

To close this data gap, researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, The Wild Chimpanzee Foundation in Abidjan, Côte d’Ivoire and The Royal Society for the Protection of Birds in Bedfordshire, UK, via the Across the River Project, together with experienced rangers from the Forestry Development Authority in Liberia, local research assistants from Liberia and Sierra Leone and graduate students from the University of Liberia, embarked on a remarkably ambitious project: a survey of chimpanzees and other large mammals across the entire country of Liberia. For two years, the survey teams searched for the presence of chimpanzees and other large mammals in more than 100 locations throughout the country.

“This project was logistically very challenging,” says corresponding author Jessica Junker, who also supervised all data collection in the field. “To reach these locations, we occasionally had to drive for two days, then continue on motorbikes for several hours before embarking on a 20 to 30 kilometre hike during which we sometimes had to cross rivers, climb mountains and pass through steep valleys.” But the effort paid off. With an estimated population of more than 7,000 individuals, Liberia now officially holds the second largest population of West African chimpanzees after Guinea. Even more excitingly for conservation, due to its relatively wide and continuous distribution within the country, the chimpanzee population of Liberia is also probably one of the most viable chimpanzee populations in West Africa, making it a regional conservation priority.

Surprisingly, the survey results showed that more than 70 percent of the chimpanzees as well as some of the most species-diverse communities of large mammals occurred outside the fully-protected areas, which currently include only 3.8 percent of the country’s forests. In 2003, the Liberian government agreed to increase the extent of the protected area network to conserve at least 30 percent of the country’s forests. “The results of our study provide crucial information for site prioritization and selection in this ongoing process,” says lead author Clement Tweh of the Wild Chimpanzee Foundation in Liberia. “For example, the shape and location of some of the proposed protected areas might have to be re-considered. Also, it will be necessary to rapidly implement full protection status for proposed conservation priority areas, as future mining and forestry projects are encroaching fast.”

Since the ban on timber exports was repealed in 2006, more than 20,000 square kilometers of forest have already been assigned as forestry concessions and awarded to international and local investors. Additionally, since 2010, logging companies were issued so-called Private Use Permits, which were subject to virtually no sustainability requirements and accounted for almost half of Liberia’s remaining primary forest. Fortunately, President Ellen Johnson Sirleaf recently withdrew almost half of these allocations, thus saving tens of thousands of square kilometers of primary tropical rainforest. “Our survey makes it clear that this action has also saved a large number of West African chimpanzees,” says co-author Menladi Lormie, Max Planck researcher and FDA ecologist, of the President’s decision.

This survey showed that in areas where primary rainforest was still abundant, hunting was the anthropogenic threat most frequently encountered, followed by logging, mining and non-timber forest product extraction. “This combination of large-scale habitat destruction and high hunting rates may seriously jeopardize the long-term survival of Liberia’s wildlife populations,” says Dr. Annika Hillers, co-author of the article and conservation scientist for The Royal Society for the Protection of Birds in the Gola Forests, Sierra Leone and Liberia. “With this study, we provide an accurate and comprehensive data-based platform for local wildlife protection authorities, policy-makers and international conservation agencies to inform effective conservation strategies to best protect what is left of this country’s rich wildlife heritage.”

Predicting The Severity Of MRSA

April Flowers for redOrbit.com – Your Universe Online

A new technique to predict the toxicity of a MRSA (methicillin-resistant Staphylococcus aureus) infection from its DNA sequence has been developed by a team of researchers led by the Universities of Bath and Exeter.

MRSA remains a public health concern for doctors seeking appropriate treatment options. MRSA and other bacterial pathogens are able to cause disease partly because of their toxicity, the bacterium’s ability to damage the host’s tissue. The new findings, published in Genome Research, could potentially lead to personalized treatment of individual MRSA infections.

Dr. Ruth Massey from the University of Bath’s Department of Biology & Biochemistry said, “The standard approach has always been to focus on a single or small number of genes and proteins.” This method has had limited success because toxicity is a complex trait encoded by many genetic loci.

The research team studied the whole genome sequences of 91 MRSA isolates. This allowed them to identify 125 genetic mutations that contributed to an individual isolate being either high- or low-toxicity. One surprising finding was that isolates from the same clone varied hugely in toxicity.

All of the highly toxic strains shared a common genetic signature that, once recognized, allowed the researchers to predict which isolates were the most toxic and therefore would cause severe disease.
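As a hypothetical illustration of how such a signature could be turned into a prediction, the sketch below scores an isolate by how many loci from a signature set carry the high-toxicity variant. The locus names, signature size, and decision threshold are all invented; this is not the authors’ actual model:

```python
# Toy illustration of signature-based toxicity prediction: count how
# many of a set of toxicity-associated loci carry the high-toxicity
# variant in a given isolate. Locus names and threshold are invented.

HIGH_TOXICITY_SIGNATURE = {"locus_12", "locus_47", "locus_88", "locus_103"}

def predict_toxicity(isolate_mutations, threshold=3):
    """isolate_mutations: set of loci where this isolate carries the
    variant associated with high toxicity."""
    hits = len(isolate_mutations & HIGH_TOXICITY_SIGNATURE)
    return "high" if hits >= threshold else "low"

print(predict_toxicity({"locus_12", "locus_47", "locus_88"}))  # prints high
print(predict_toxicity({"locus_47"}))                          # prints low
```

In practice the real signature spans many loci across the whole genome, but the workflow the researchers envision, sequence, match against a signature, predict severity, has this basic shape.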

Massey explained, “In the future as the cost and speed of genome sequencing decreases, it will become feasible to take a swab from a patient, sequence the genome of the bacterium causing the infection, and then use this to predict the toxicity of the infection.”

“Clinicians will then be able to tailor the treatment to the specific infection – this technique can tell them which combination of antibiotics will be most effective, or tell them which drugs to administer to dampen the toxicity of the infection,” she continued. “The standard approach in studying MRSA’s toxicity has always been to focus on a single or small number of genes and proteins. However, this has not always been successful because toxicity is a complex trait encoded by many genetic loci. By looking at whole genome sequences we’ve been able to identify a number of new loci involved in toxicity.

“This work represents a step change in how genome sequencing can help us diagnose and control infections. It has also increased our understanding of how this pathogen causes severe infections by identifying novel regulators of toxicity.”

The findings could provide critical insight into the virulence of MRSA, according to Dr. Mario Recker, Associate Professor in Applied Mathematics at the University of Exeter. He said, “We know that many bacterial pathogens, such as MRSA, are so virulent in part because of their ability to damage a host’s tissue.”

“By using whole genome sequences we have been able to predict which would be most toxic, and so therefore would be more likely to cause severe disease. Having identified these novel genetic loci will also shed more light upon the complex machinery regulating bacterial virulence.”

The team is continuing their research by applying their methodology to other bacterial pathogens such as Streptococcus pneumoniae, a leading cause of death in infants and children under the age of five.

Green Tomatoes May Help Prevent And Treat Muscle Atrophy

April Flowers for redOrbit.com – Your Universe Online

Muscle atrophy, or wasting, is a problem caused by aging and a variety of illnesses and injuries. According to a new study from the University of Iowa, a surprising remedy might be green tomatoes.

The research team, led by Christopher Adams, M.D., Ph.D., UI associate professor of internal medicine and molecular physiology and biophysics, used a screening method that previously helped them identify a compound in apple peel as a muscle-boosting agent. The team has now discovered that tomatidine, a compound found in green tomatoes, is even more potent than apple peel for building muscle and protecting against muscle atrophy.

Muscle atrophy can be caused by such conditions as cancer, heart failure, or orthopedic injuries. People suffering from atrophy become weak and fatigued, their physical activity and quality of life become impaired, and they are predisposed to falls and fractures. More than 50 million Americans are affected annually, including 30 million over the age of 60. People with muscle atrophy are often forced into nursing homes and rehab facilities.

“Muscle atrophy causes many problems for people, their families, and the health care system in general,” said Adams. “However, we lack an effective way to prevent or treat it. Exercise certainly helps, but it’s not enough and not very possible for many people who are ill or injured.”

Adams and his team began the study looking for a small-molecule compound that might be useful in treating muscle atrophy. Using a systems biology tool known as the Connectivity Map — developed at the Broad Institute of MIT and Harvard University — the team homed in on tomatidine. The gene expression changes generated by tomatidine are essentially the opposite of the changes that occur in muscle cells affected by atrophy.
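The core idea, finding a compound whose expression changes run opposite to a disease signature, can be sketched with a simple anti-correlation score. The gene fold-change values below are invented for illustration and do not come from the study or from the actual Connectivity Map algorithm:

```python
# Simplified illustration of the Connectivity Map idea: a compound
# whose gene-expression changes anti-correlate with those seen in
# atrophying muscle is a candidate treatment. Values are invented.

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Log fold-changes for the same (hypothetical) genes under
# muscle atrophy and under compound treatment.
atrophy_signature = [2.1, 1.5, -0.8, -1.2, 0.9]
compound_effect   = [-1.8, -1.1, 0.6, 1.0, -0.7]

score = pearson(atrophy_signature, compound_effect)
print(score < 0)  # prints True: a strongly negative score flags a candidate
```

A compound scoring near -1 pushes the signature genes in the opposite direction from the disease, which is the pattern that flagged tomatidine.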

“That result was important because we are looking for something that can help people,” Adams said.

Next, the team began to feed a diet of tomatidine to mice, finding that healthy mice on this diet grew bigger muscles, became stronger and could exercise longer. More importantly, tomatidine was found to prevent and treat muscle atrophy.

The team was surprised to find that although mice fed tomatidine grew larger muscles, their overall body weight did not change, thanks to a corresponding loss of fat. This suggests that tomatidine might have applications for treating obesity as well.

One of the most attractive qualities of tomatidine for treating muscle atrophy is that it is a natural compound, produced when alpha-tomatine, which is found in tomato plants and in green tomatoes in particular, is digested in the gut.

“Green tomatoes are safe to eat in moderation. But we don’t know how many green tomatoes a person would need to eat to get a dose of tomatidine similar to what we gave the mice. We also don’t know if such a dose of tomatidine will be safe for people, or if it will have the same effect in people as it does in mice,” Adams added. “We are working hard to answer these questions, hoping to find relatively simple ways that people can maintain muscle mass and function, or if necessary, regain it.”

This same research strategy helped the team to identify ursolic acid from apple peels as a compound that promotes muscle growth.

“Tomatidine is significantly more potent than ursolic acid and appears to have a different mechanism of action. This is a step in the right direction,” Adams said. “We are now very interested in the possibility that several food-based natural compounds such as tomatidine and ursolic acid might someday be combined into science-based supplements, or even simply incorporated into everyday foods to make them healthier.”

Adams and his colleagues have formed a biotech company, Emmyon, to accelerate this research and translate it to people. Emmyon recently received funding from the National Institutes of Health (NIH) to develop strategies for preserving muscle mass and function during the aging process. The company is also using tomatidine and ursolic acid as natural leads to create new therapies targeting muscle atrophy and obesity.

The results were published in a recent issue of the Journal of Biological Chemistry.

Study Finds No Scientific Evidence That Animal Visits Help Hospitalized Children

redOrbit Staff & Wire Reports – Your Universe Online

Animals regularly make the rounds at children’s hospitals and other care facilities in an attempt to brighten the days and raise the spirits of the patients, but is there any scientific evidence that such visits are actually beneficial?

Based on a global review of studies analyzing the impact of “animal interventions” in healthcare settings for children conducted by researchers at the University of Adelaide, the answer to that question is no – there is no proof that these visits are beneficial for either the patients or the animals themselves.

Lead author Professor Anna Chur-Hansen, the head of the university’s school of psychology, and her colleagues report that there is a significant gap in scientific knowledge on the benefits of animals for sick youngsters. Their findings appear in a recent edition of the journal Anthrozoös.

“If you speak with most people they’ll say it’s a good thing for animals such as dogs and cats to be taken into hospitals, so that patients can derive some form of therapeutic effect from their association with the animals,” Chur-Hansen explained in a statement Wednesday.

She said that ever since the term “human-animal bond” was first coined during the 1980s, people have just assumed that animals could provide people in healthcare or palliative care with an enhanced sense of wellbeing. This assumption has given rise to multiple animal support organizations designed to assist patients.

“However, the scientific world has done such a poor job of researching this field that no-one can truly say what the benefits are, how they work, or whether such a situation causes problems or distress – or the exact opposite – for the animals themselves,” the professor noted.

“The assumption is that these programs are beneficial – and from the little evidence available they are likely to be – but no-one has yet fully assessed the range of issues associated with the human-animal bond in the healthcare setting,” Chur-Hansen added.

According to the study authors, there are several questions surrounding the practice of animal-hospital visits. For instance, is it better for patients to receive visits from their own pets or from someone else’s pet? Are there any risks associated with having animals visit sick kids on site? Does it raise any disease-control issues? What effect do these types of visits have on the creatures themselves, and are the visits inappropriate for some cultures?

“Given the limited information around AAI [animal-assisted interventions] for hospitalized children, including the risks and benefits and the limitations of existing studies, future research is required,” they wrote. “This should take into account the methodological considerations discussed in this review, so that our knowledge base can be enhanced and if and where appropriate, such interventions be implemented and rigorously evaluated.”

“Our hope is that by better understanding what’s really happening, we’ll be able to develop guidelines that will lead to best practice – guidelines that could be used by animal support groups and healthcare professionals alike,” added Chur-Hansen.

Neanderthals Were No Strangers To Good Parenting

University of York

Archaeologists at the University of York are challenging the traditional view that Neanderthal childhood was difficult, short and dangerous.

A research team from PALAEO (Centre for Human Palaeoecology and Evolutionary Origins) and the Department of Archaeology at York offer a new and distinctive perspective which suggests that Neanderthal children experienced strong emotional attachments with their immediate social group, used play to develop skills and played a significant role in their society.

The traditional perception of the toughness of Neanderthal childhood is based largely on biological evidence, but the archaeologists, led by Dr Penny Spikins, also studied cultural and social evidence to explore the experience of Neanderthal children.

In research published in the Oxford Journal of Archaeology, they found that Neanderthal childhood experience was subtly different from that of their modern human counterparts in that it had a greater focus on social relationships within their group. Investigation of Neanderthal burials suggests that children played a particularly significant role in their society, particularly in symbolic expression.

The research team, which also included Gail Hitchens, Andy Needham and Holly Rutherford, say there is evidence that Neanderthals cared for their sick and injured children for months and often years. The study of child burials, meanwhile, reveals that the young may have been given particular attention when they died, with generally more elaborate graves than older individuals.

Neanderthal groups are believed to have been small and relatively isolated, which has important implications for the social and emotional context of childhood. For groups living in rugged terrain, there would have been little selection pressure to overcome the tendency to avoid outside groups, with a consequent natural emotional focus on close internal connections.

Dr Spikins, whose new book on why altruism was central to human evolutionary origins, How Compassion Made Us Human (Pen and Sword), is due to be published later this year, said: “The traditional view sees Neanderthal childhood as unusually harsh, difficult and dangerous. This accords with preconceptions about Neanderthal inferiority and an inability to protect children epitomizing Neanderthal decline.

“Our research found that a close attachment and particular attention to children is a more plausible interpretation of the archaeological evidence, explaining an unusual focus on infants and children in burial, and setting Neanderthal symbolism within a context which is likely to have included children.

“Interpretations of high activity levels and frequent periods of scarcity form part of the basis for this perceived harsh upbringing. However, such challenges in childhood may not be distinctive from the normal experience of early Palaeolithic human children, or contemporary hunter-gatherers in particularly cold environments. There is a critical distinction to be made between a harsh childhood and a childhood lived in a harsh environment.”

New Process Could Turn Astronaut Urine Into Fuel, Drinking Water

[ Watch the Video: Space Potty: Using The Bathroom In Space ]

redOrbit Staff & Wire Reports – Your Universe Online

It might not be as glamorous as taking the first steps on the moon or sending a rover to explore Mars, but new research appearing in the journal ACS Sustainable Chemistry & Engineering could solve an age-old problem associated with space travel: what to do with all of the astronaut pee produced during extended missions?

Rather than just ejecting the fluid waste into the universe, scientists from the University of Puerto Rico’s Rio Piedras Campus and the NASA Ames Research Center in Mountain View, California are reportedly developing a new technique that would allow astronaut pee to be converted into either fuel or drinking water.

As study authors Eduardo Nicolau, Carlos R. Cabrera and their colleagues explain, human waste on extended space missions comprises approximately half of the mission’s total waste amount. Recycling it is essential to keeping the environment clean for the astronauts, and when properly treated, recycled urine can become an essential source of drinking water when onboard H2O supplies begin to dwindle.

The method would eliminate the need for water to be delivered from Earth, which is tremendously expensive, and previous research has demonstrated that a treatment process known as forward osmosis could be combined with fuel cells to generate power. The findings could also lead to new ways to treat wastewater here on Earth.

The researchers looked to build upon previous studies in order to find a way to not only solve the problem of how to deal with urine during space travel, but to also make it useful to the crew members. They collected urine and shower wastewater and processed it using forward osmosis to filter contaminants from urea.

Using their new Urea Bioreactor Electrochemical system (UBE), Nicolau’s team was able to convert this combination of urine and wastewater into ammonia in a bioreactor. They then converted that ammonia into energy using a fuel cell. While the system was designed with space travel in mind, the investigators said “the results showed that the UBE system could be used in any wastewater treatment systems containing urea and/or ammonia.”

“The results of this research showed the feasibility of interfacing wastewater-recycling processes with bioelectrochemical systems to achieve water recycling while reusing useful resources,” they wrote in their study. They also reported that the UBE systems removed approximately 80 percent of organic carbon from the waste and successfully converted roughly 86 percent of the urea into ammonia.

Technical Tests Of Biodiversity

Physicists play with the genetics of populations

What happens when physicists play (using mathematical instruments) with the genetics of populations? They may discover unexpected connections between migration and biodiversity, for example, as recently done by a group of researchers from the International School for Advanced Studies (SISSA) in Trieste and the Polytechnic University in Turin in a study published in the journal Physical Review Letters.

The effect of migration on biodiversity (understood here as the coexistence of different genetic traits) is an open question: does migration increase or decrease the genetic variability of populations? Or is the relationship more complex than that?

Imagine a population that lives subdivided among several “islands” separated by stretches of sea. On each island live two groups, A and B, which differ in one genetic trait, for example the individuals in group A have blond hair and those in group B have brown hair. If there is no migration between the islands, the biodiversity on each can only vary based on “stochastic” dynamics (i.e., with a random component), related to the progression of generations. However, if a certain degree of mobility is ensured within the group of islands, that is, if some individuals travel and migrate, then the biodiversity comes out of its “isolation” and is influenced by this migratory phenomenon.

The researchers reproduced this situation in a mathematical model and monitored changes in biodiversity with varying rates of migration, exploiting certain analogies with physical phenomena of a totally different nature.
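As an illustration only, the island scenario can be captured in a few lines of code. The sketch below is a hypothetical Moran-style toy model, not the authors’ actual equations: two islands of individuals carrying trait A or B evolve by random drift, a tunable migration rate occasionally draws the replacement individual from the other island, and biodiversity is scored as the average heterozygosity 2p(1-p).

```python
import random

def simulate(migration_rate, N=100, steps=20000, seed=0):
    """Toy two-island drift-plus-migration model (illustrative only)."""
    rng = random.Random(seed)
    # Each island starts with a 50/50 mix of trait 0 (A) and trait 1 (B).
    islands = [[0] * (N // 2) + [1] * (N // 2) for _ in range(2)]
    for _ in range(steps):
        i = rng.randrange(2)          # pick an island
        dead = rng.randrange(N)       # one individual dies...
        if rng.random() < migration_rate:
            parent_island = islands[1 - i]  # ...replaced by a migrant's offspring
        else:
            parent_island = islands[i]      # ...or by a local offspring
        islands[i][dead] = rng.choice(parent_island)
    # Biodiversity: heterozygosity 2p(1-p), averaged over the islands.
    diversity = 0.0
    for isl in islands:
        p = sum(isl) / N
        diversity += 2 * p * (1 - p)
    return diversity / 2

for m in (0.0, 0.05, 0.5):
    print(f"migration={m:.2f}  diversity={simulate(m):.3f}")
```

With no migration, drift erodes each island’s diversity independently; raising the migration rate couples the islands, which is the kind of non-trivial dependence the study explores.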

“We started with simple ‘pen and paper’ calculations which took into account the known ‘rules’ of population genetics. However, as we proceeded with our work, the complexity of the model forced us to use a computer simulation of the system,” explains Pierangelo Lombardo, a SISSA PhD student and first author of the paper. “In actual fact, we expected a different result from what we obtained. Even looking at the data reported in previous studies, the most commonly held view is that the higher the migration rate the lower the biodiversity.”

“Our model, on the other hand, provided a very different result,” says Lombardo. “The function that relates the two variables is a curve, where with higher migration rates biodiversity can be seen to reach a minimum before starting to grow again.”

“This means that if we want to increase a population’s biodiversity, under the conditions described above, we could increase the migration rates above the value that makes biodiversity fall to a minimum,” explains Andrea Gambassi, the SISSA professor who coordinated the study. “Ours is clearly a simplified model, but it does take into account the essential mechanisms underlying the genetics of migrations.”

“Our findings may prove useful in guiding field research,” concludes Gambassi. “Our model can in fact be used to guide the planning of experiments aiming to monitor the relationship between migration and biodiversity. And should experimental observations confirm our model, then we could further refine it and use it to make predictions and control the behavior of simple populations, for example, colonies of bacteria.”

Men Often Go Undiagnosed, Untreated For Eating Disorders: Study

Brett Smith for redOrbit.com – Your Universe Online

Discussions about eating disorders and body image usually revolve around women and a new study from the University of Oxford in England has found that young men are not receiving as much attention as they should regarding these issues.

Published in the open-access journal BMJ Open, the new study revealed that young men are underdiagnosed and undertreated for eating disorders – even though they account for roughly one in four cases.

In the study, researchers talked with 39 individuals between the ages of 16 and 25 about eating disorders such as binge eating, bulimia nervosa and anorexia. All 10 male participants took some time to understand that their activities and behaviors were possible indications of an eating disorder. These behaviors included going days without eating; purging; and obsessing over weight and exercise.

Male participants cited gender stigma as one of the main reasons why it took them so long to see the potential signs. One male participant said eating disorders were a condition for “fragile teenage girls,” while another said bulimia and anorexia were “something girls got.”

Also, friends and family did not see their behaviors as symptoms of an eating disorder, chalking participants’ unhealthy behavior up to personal choice. The male volunteers said only a medical incident or crisis convinced them that something was wrong.

After delaying treatment, then finally seeking it, the men said their treatment experiences were mixed. They said they felt like low-priority patients, found little male-specific documentation and, in one case, were told by a doctor “to man up.”

“I did start Googling it and I came across eventually on Facebook the ‘Men have eating disorders too’, as well, and there was a couple of other websites that I looked at,” one male participant told the researchers. “But there’s still in my opinion there’s still no real information of what you do or where you go.”

“Men with eating disorders are underdiagnosed, undertreated and under researched,” the study authors wrote. “Our findings suggest that men may experience particular problems in recognizing that they may have an eating disorder as a result of the continuing cultural construction of eating disorders as uniquely or predominantly a female problem.”

The researchers added that gender bias on this issue has “also been embedded in clinical practice,” adding that “early detection is imperative” in treating these disorders.

Leanne Thorndyke, of the UK eating-disorder charity Beat, told BBC News that a larger and larger section of society is feeling pressure regarding their body image.

“The pressures on body weight and body image are affecting a much wider range of people, which obviously includes men,” she said. “There is more pressure on men from magazines with celebrities and male models to have the ‘ideal’ body image.”

“Boys and men tend to want to be bigger and more muscular and toned, which is a different ideal to women,” she continued, adding that in modern society, we are “bombarded by images every day from all angles, something that just wasn’t there only a few years ago.”

Why Binge Drinkers Are Slower To Heal From Their Wounds

People who are injured while binge drinking are much slower to heal from wounds suffered in car accidents, shootings, fires, etc.

Now a new study is providing insights into why alcohol has such a negative effect on wound healing. Loyola University Chicago Stritch School of Medicine researchers report that binge alcohol exposure significantly reduced levels of key components of the immune system involved in healing.

The study by senior author Katherine A. Radek, PhD, and colleagues from Loyola’s Alcohol Research Program and the Infectious Disease and Immunology Research Institute is published online ahead of print in the journal Alcoholism: Clinical and Experimental Research.

In the United States, alcohol dependence and/or abuse affects 20 percent to 40 percent of hospitalized patients. Alcohol increases the risk of infections in the hospital, including surgical site infections. Patients with surgical-site infections are hospitalized for twice as long, have a higher rate of re-admission and are twice as likely to die as patients who did not binge drink.

The study showed, for the first time, that binge alcohol exposure reduces the amount of white blood cells called macrophages that chew up bacteria and debris. This defect, in part, makes the wound more likely to be infected by bacteria, such as Staphylococcus aureus.

The study also found that binge alcohol exposure impaired the production of a protein that recruits macrophages to the wound site. (This protein is called macrophage inflammatory protein-1 alpha, or MIP-1α.) Binge alcohol also reduced levels of another key component of the immune system known as CRAMP (cathelicidin-related antimicrobial peptide). CRAMP is a type of small protein present in the outermost layer of the skin, the epidermis. These small proteins, called antimicrobial peptides, kill bacteria and recruit macrophages and other immune system cells to the wound site.

“Together these effects likely contribute to delayed wound closure and enhanced infection severity observed in intoxicated patients,” researchers concluded.

The study involved an in vivo model and a typical pattern of binge drinking: three days of alcohol exposure, followed by four days without alcohol, followed by three more days of binge alcohol exposure. The binge alcohol exposures were equivalent to roughly twice the legal limit for driving.

Napping Helps Children Retain Memory, Learn New Skills

April Flowers for redOrbit.com – Your Universe Online

It seems like babies and young children are constantly learning new things and making giant developmental leaps. Sometimes, as if overnight, they figure out how to recognize certain shapes, or what the word “no” means no matter who says it.

A new study from the University of Arizona and the University of Tübingen shows that, in reality, making those leaps could be just a nap away. The findings, presented at the Cognitive Neuroscience Society (CNS) annual meeting this week, demonstrate that infants who nap are better able to apply lessons to new skills, and preschoolers who nap are better able to retain learned knowledge.

“Sleep plays a crucial role in learning from early in development,” says Rebecca Gómez of the University of Arizona, whose work looks specifically at how sleep enables babies and young children to learn language over time.

“We want to show that sleep is not just a necessary evil for the organism to stay functional,” says Susanne Diekelmann, of the University of Tübingen in Germany, who is chairing the symposium on sleep and memory. “Sleep is an active state that is essential for the formation of lasting memories.”

[ Watch the Video: Good Sleep Equals Good Memory ]

According to a growing body of evidence, memories become reactivated during sleep. New research is illuminating exactly when and how memories are stored and reactivated. “Sleep is a highly selective state that preferentially strengthens memories that are relevant for our future behavior,” Diekelmann says. “Sleep can also abstract general rules from single experiences, which helps us to deal more efficiently with similar situations in the future.”

One of the developmental steps for young children is called generalization: the ability to recognize instances that are similar, but not identical, to something they have already learned, and to apply that knowledge to the new situation. Examples include recognizing the letter “A” in different fonts, or understanding a word regardless of who is speaking it.

“Sleep is essential for extending learning to new examples,” Gómez says. “Naps soon after learning appear to be particularly important for generalization of knowledge in infants and preschoolers.”

To test the relationship between sleep and learning in infants, Gómez played a “training language” over loudspeakers to 12-month-old infants while they played. The babies were then tested to find out whether they recognized novel vocabulary in the artificial language after they had taken a nap or been kept awake.

Gómez and her team found that infants who napped after hearing the new language were able to take the language rules learned before their nap and apply them, recognizing entirely new sentences in the language. To test the infants’ recognition of the linguistic rules, the team measured the length of time infants spent with their heads turned to listen to correctly versus incorrectly structured sentences in the language.

In creating artificial languages for her studies, Gómez mimics structure in natural language that may be useful in language learning. In many languages, for instance, nouns and verbs have subtly different sound patterns.

“If I want to study whether these patterns help infants learn language at a particular age, I build stimuli with similar characteristics into an artificial language,” she says. “I can then test children of different ages to see when they are able to use this information.”

PRESCHOOLERS

The role of napping for preschoolers who were learning words was also examined.

“Infants who nap soon after learning are able to generalize after sleep but not after a similar interval of normal waking time,” Gómez says. “Preschoolers with more mature memory structures do not appear to form generalizations during sleep; however, naps appear to be necessary for retaining a generalization they form before a nap.”

The team suggests that the difference in learning and memory in infants versus preschool children could be the result of different neural mechanisms. Non-human primate research has revealed that although most substructures of the hippocampus are in place in infancy, the substructures that support the replay of memories during sleep do not begin forming until 16-20 months of age. These substructures take several more years to reach maturity.

“Therefore, we hypothesize that the benefits of sleep in infancy stem from different processes than those benefiting preschoolers,” she says.

For infants, sleep might help filter out the less relevant information – for example, the speaker’s voice or the actual words infants hear – in favor of the rhythmic pattern common to all stimuli. For preschool children, however, Gómez says that hippocampally-based replay may begin to contribute to more active integration and retention of sleep-dependent memories.

ADULTS AND SLEEP

It seems that sleep not only helps us to remember our past, but also to remember the things we want to accomplish in the future.

“Whether we make plans for the next holiday or whether we just think about what to have for dinner tonight, all of these plans heavily depend on our ability to remember what we wanted to do at the appropriate time in the future,” says Diekelmann. “The likelihood that we remember to execute our intentions at the appropriate time in the future is substantially higher if we have had a good night’s sleep after having formed the intention.”

Diekelmann says that there are two methods for keeping our intentions in mind.

One method is to keep those intentions constantly in mind and be alert to opportunities to execute them. “For example, if I want to drop a letter at the post office on my way to work, I can look for a post office all the way to my work place and think all the time ‘I have to drop the letter.'” According to Diekelmann, this method is inefficient—using up cognitive resources necessary for other tasks such as driving or walking without stumbling.

“The second way to remember intentions is to store them in the memory network,” she says. “If the memory of the intention is stored well enough, it will come to mind automatically in the appropriate situation.”

Diekelmann and her team study this second method. Participants in one of her studies were asked to remember word pairs, and after learning, were told they would have to detect these words in a different task two days later. Half of the participants were allowed to sleep the first night, while half were kept awake. Both groups were allowed to sleep the second night so they would not be tired during testing.

Participants performed a task during testing that included some of the previously learned word pairs. They were not reminded of their intention to recognize the words. Instead, the researchers just recorded how many of the words each participant found. This allowed them to determine whether participants still succeeded in detecting the words when they had to do an additional task at the same time that required their full attention.

“We expected that, if participants had stored the intention sufficiently strong in their memory, then seeing the words should automatically bring to mind the intention to detect the words,” Diekelmann says.

Those participants allowed to sleep the first night were able to automatically detect the words.

“With sleep, the participants performed perfectly well and detected almost all of the words even when they had to perform two challenging tasks in parallel,” Diekelmann says. The participants who remained awake that first night, however, performed substantially worse in detecting words while performing other tasks.

“Even when we have to do a lot of different things at the same time, sleep ensures that our intentions come to mind spontaneously once we encounter the appropriate situation to execute the intention,” Diekelmann says.

Caring For Grandchildren Can Slow Cognitive Aging In Women, But Only In Smaller Doses

redOrbit Staff & Wire Reports – Your Universe Online

Grandmothers can help prevent cognitive decay and reduce the risk of developing Alzheimer’s disease by taking care of their grandchildren one day per week, according to new research published online Tuesday in the journal Menopause.

However, taking care of their grandchildren five days per week or more actually had some negative impact on the cognitive function of 186 Australian women between the ages of 57 and 68 involved in the study.

“We know that older women who are socially engaged have better cognitive function and a lower risk of developing dementia later, but too much of a good thing just might be bad,” North American Menopause Society (NAMS) executive director Dr. Margery Gass explained in a statement.

The women took three different tests of mental sharpness, and also told the researchers whether or not they felt as if their own children had been especially demanding of them over the past year. Of the 120 grandmothers in the study, those who cared for their grandchildren one day per week performed best on two of those three tests.

However, much to the authors’ surprise, grandmothers who cared for their grandchildren for at least five days per week did significantly worse on a test that measured mental processing speed and working memory. The investigation also revealed that the more time grandmothers spent taking care of grandkids, the more they felt that their own children had been demanding of them, suggesting that mood could be a factor in this finding.

The women’s cognitive abilities were assessed using the Symbol-Digit Modalities Test (SDMT), California Verbal Learning Test, and Tower of London, and the authors say their findings could indicate that highly frequent grandparenting predicts lower mental performance. They are planning to follow up with additional research.

While previous studies have looked at the link between cognitive sharpness and social engagement, this is the first time that experts have looked at grandmothering. Since caring for grandchildren is “such an important and common social role for postmenopausal women,” Dr. Gass said that it was important “to know more about its effects on their future health.” She added that this new study was “a good start” towards accomplishing that goal.

Last August, researchers from Case Western Reserve University reported that grandmothers who were their household’s primary caregivers were more likely to suffer from depression. However, being a full-time caregiver had no negative impact on their resourcefulness or willingness to receive assistance.

Heartbleed Bug Threatens OpenSSL Encryption Service

Peter Suciu for redOrbit.com – Your Universe Online
This week computer security experts warned system administrators to patch a severe flaw in a software library used by millions of websites to encrypt sensitive data and communications. The flaw, known as “Heartbleed,” is reportedly present in several versions of OpenSSL, a cryptographic library.

OpenSSL is one of those things that most people use every day, even if they don’t know it. OpenSSL enables SSL (Secure Sockets Layer) or TLS (Transport Layer Security) encryption, which is used by most websites. The bug is officially referenced as CVE-2014-0160, and it makes it possible for attackers to recover up to 64 kilobytes of memory from a server or client computer running a vulnerable OpenSSL version.

An OpenSSL Security Advisory was issued on Monday, after Neel Mehta of Google Security discovered the bug.

On Tuesday, security firm Codenomicon posted the following information on a website bearing the “Heartbleed Bug” moniker:

“The Heartbleed Bug is a serious vulnerability in the popular OpenSSL cryptographic software library. This weakness allows stealing the information protected, under normal conditions, by the SSL/TLS encryption used to secure the Internet. SSL/TLS provides communication security and privacy over the Internet for applications such as web, email, instant messaging (IM) and some virtual private networks (VPNs).

“The Heartbleed bug allows anyone on the Internet to read the memory of the systems protected by the vulnerable versions of the OpenSSL software. This compromises the secret keys used to identify the service providers and to encrypt the traffic, the names and passwords of the users and the actual content. This allows attackers to eavesdrop on communications, steal data directly from the services and users and to impersonate services and users.”

Researchers for Codenomicon tested their own services from an attacker’s perspective and were able to do so without leaving any trace. They made this “attack” without utilizing any privileged information or credentials, and yet were able to “steal” the secret keys used for their X.509 certificates, user names and passwords, as well as instant messages, emails, business critical documents and other communications.

According to Codenomicon, “Operating systems that may have a vulnerable version of OpenSSL include Debian Wheezy, Ubuntu 12.04.4 LTS, CentOS 6.5, Fedora 18, OpenBSD 5.3, FreeBSD 8.4, NetBSD 5.0.2 and OpenSUSE 12.2.”

As PC World reported, “OpenSSL also underpins two of the most widely used Web servers, Apache and nginx. The code library is also used to protect email servers, chat servers, virtual private networks and other networking appliances.”

TechCrunch also reported that this bug has been in OpenSSL for more than two years – “since December 2011, OpenSSL versions 1.0.1 through 1.0.1f.”

“It appears that exploiting this bug leaves no trace in the server’s logs,” Greg Kumparak wrote for TechCrunch on Tuesday. “So there’s no easy way for a system administrator to know if their servers have been compromised; they just have to assume that they have been.”

Last week Nick Sullivan, a systems engineer at content delivery network CloudFlare, patched the security flaw – but waited until Monday to announce the findings.

“This bug fix is a successful example of what is called responsible disclosure,” Sullivan said via a blog post on Monday. “Instead of disclosing the vulnerability to the public right away, the people notified of the problem tracked down the appropriate stakeholders and gave them a chance to fix the vulnerability before it went public. This model helps keep the Internet safe. A big thank you goes out to our partners for disclosing this vulnerability to us in a safe, transparent, and responsible manner. We will announce more about our responsible disclosure policy shortly.”
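The mechanics of the over-read can be sketched in miniature. The snippet below is a hypothetical toy model, not OpenSSL’s actual C code: the server’s memory is represented as a flat byte string in which the request payload sits next to unrelated secrets, and the vulnerable heartbeat handler trusts the payload length claimed by the client instead of checking the payload it actually received.

```python
# Toy illustration of the Heartbleed over-read (CVE-2014-0160).
# SECRET is a stand-in for key material that happens to sit in memory
# next to the incoming request payload.

SECRET = b"private-key-material"

def build_memory(payload: bytes) -> bytes:
    # The request payload lands in memory adjacent to unrelated secrets.
    return payload + SECRET

def heartbeat_vulnerable(payload: bytes, claimed_len: int) -> bytes:
    memory = build_memory(payload)
    # BUG: echoes claimed_len bytes without checking len(payload),
    # so a short payload with a large claimed length leaks extra memory.
    return memory[:claimed_len]

def heartbeat_patched(payload: bytes, claimed_len: int) -> bytes:
    memory = build_memory(payload)
    # FIX: discard requests whose claimed length exceeds the real payload.
    if claimed_len > len(payload):
        return b""
    return memory[:claimed_len]

# A 4-byte payload with a claimed length of 24 leaks 20 bytes of secret.
leak = heartbeat_vulnerable(b"ping", claimed_len=24)
print(SECRET in leak)                          # True
print(heartbeat_patched(b"ping", claimed_len=24))  # b''
```

The actual fix shipped in OpenSSL 1.0.1g amounts to the same bounds check: heartbeat requests whose claimed payload length exceeds the message actually received are silently discarded.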

The Market For Renewable Energy Grew In 2013, Despite Less Investing

redOrbit Staff & Wire Reports – Your Universe Online

Renewable energy accounted for 43.6 percent of the world’s newly installed electricity-generating capacity last year, despite a 14 percent drop in investments, according to the latest Global Trends in Renewable Energy Investment report.

The report, produced by the Frankfurt School, the United Nations Environment Program (UNEP) and Bloomberg New Energy Finance, has become the standard reference for global renewable energy investment figures in recent years.  The data revealed that global investment in renewable energy fell $35.1 billion to $214.4 billion last year, due mainly to the falling cost of solar photovoltaic systems and policy uncertainty in many countries – an issue that also reduced investment in fossil fuel generation last year.
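As a quick arithmetic check, the figures quoted above are mutually consistent (values in billions of US dollars, taken from the report as cited):

```python
# A $35.1bn fall to $214.4bn implies a 2012 total of $249.5bn,
# and the drop works out to roughly the 14 percent the report cites.
fall, total_2013 = 35.1, 214.4
total_2012 = total_2013 + fall
drop_pct = 100 * fall / total_2012
print(f"2012 total: ${total_2012:.1f}bn, drop: {drop_pct:.1f}%")
# prints "2012 total: $249.5bn, drop: 14.1%"
```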

However, despite this drop in investment, renewable energy’s share of global electricity generation continued its steady climb.

In fact, were it not for renewables, global energy-related CO2 emissions would have been an estimated 1.2 gigatonnes higher in 2013, the researchers said. This would have caused a 12 percent increase in the gap between where emissions are heading and where they need to be in 2020 in order to have a realistic chance of keeping under a two-degree Celsius temperature rise, the authors of the report wrote.

“A long-term shift in investment over the next few decades towards a cleaner energy portfolio is needed to avoid dangerous climate change, with the energy sector accounting for around two thirds of total greenhouse gas emissions,” said Achim Steiner, UN Under-Secretary-General and Executive Director of UNEP. “The fact that renewable energy is gaining a bigger share of overall generation globally is encouraging. To support this further, we must re-evaluate investment priorities, shift incentives, build capacity and improve governance structures.”

“While some may point to the fact that overall investment in renewables fell in 2013, the drop masks the many positive signals of a dynamic market that is fast evolving and maturing,” he added.

“This should give governments the confidence to forge a new robust climate agreement to cut emissions at the 2015 climate change conference in Paris.”

Ulf Moslener, Head of Research of the Frankfurt School-UNEP Collaborating Centre for Climate & Sustainable Energy Finance, called the overall decline in investment dollars in renewable energy disappointing, but added that the “foundations for future growth in the renewable energy market fell into place in 2013.”

Michael Liebreich, Chairman of the Advisory Board for Bloomberg New Energy Finance, pointed to several hopeful signs for renewable energy after years of painful shake-out in the sector, including “lower costs, a return to profitability on the part of some leading manufacturers, the phenomenon of unsubsidized market uptake in a number of countries, and a warmer attitude to renewables among public market investors.”

The report notes the end of a four-and-a-half year, 78 percent decline in clean energy stocks, which bottomed out in July 2012 and then gained 54 percent in 2013. That improvement took place as many companies in the solar and wind manufacturing chains regained profitability after a tough period of over-capacity and corporate distress.

Large hydroelectric projects were another important area of investment, with at least 20 gigawatts (GW) of capacity estimated to have come online in 2013, equivalent to approximately $35 billion of investment.

Although investment in renewable energy capacity (including all hydro) in 2013 was once again below the gross investment in fossil-fuel power – at $227 billion compared with $270 billion – it was roughly double the net figure for investment in fossil-fuel power excluding replacement plant.

Last year also marked a deepening involvement of long-term investors – pension funds, insurance companies, wealth managers and private individuals – in the equity and debt of wind and solar projects. Part of their new engagement was through clean energy bond issuance, which set a new record of $3.2 billion raised in 2013, as well as through new types of financing vehicles including North American “yield companies” and real estate investment trusts.

But the standout performer among investment types in 2013 was public market equity raising by renewable energy companies, which soared 201 percent to $11 billion, the highest since 2010. This was largely due to the rally in clean energy share prices and institutional investors’ appetite for funds offering solid yields.

Additional highlights from the report include:

•    2013 was the first year that China invested more in renewable energy than Europe. China’s total investment was down by six percent to $56 billion, while Europe’s fell 44 percent to $48 billion. Meanwhile, US investment fell 10 percent to $36 billion, while India’s dropped 15 percent to $6 billion and Brazil’s fell 54 percent to $3 billion, the lowest since 2005.

•    The Americas, excluding the US and Brazil, increased investment in renewables by 26 percent, to $12 billion, in 2013. Japan’s solar boom helped to drive an 80 percent increase in renewable energy investment to $29 billion during that time.

•    Installed solar was up 26 percent, from 31 GW in 2012 to a record 39 GW in 2013, despite a 23 percent decline in solar capacity investment from $135.6 billion to $104.1 billion.

•    A handful of significant wind and solar projects in Latin America, the Middle East and Africa are proceeding without any subsidy support, or in preference to more expensive fossil-fuel options.

•    Renewables, excluding large hydro, accounted for 8.5 percent of global electricity generation in 2013, up from 7.8 percent in 2012, and have seen cumulative investment of over $1.5 trillion since 2006.
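The headline figures above are internally consistent, which can be verified with a few lines of arithmetic (a quick sketch; all dollar amounts are the article’s):

```python
# Consistency check on the report's figures (billions of US dollars).
invest_2013 = 214.4                        # 2013 global renewables investment
drop = 35.1                                # reported year-on-year decline
invest_2012 = invest_2013 + drop           # implied 2012 total
pct_drop = drop / invest_2012 * 100
print(f"2012 total ${invest_2012:.1f}B, decline {pct_drop:.0f}%")  # ~14 percent

# Solar capacity investment: $135.6B down to $104.1B.
solar_pct = (135.6 - 104.1) / 135.6 * 100
print(f"solar investment decline {solar_pct:.0f}%")  # ~23 percent
```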

Key findings from the report will be showcased at the Bloomberg New Energy Finance “Future of Energy Summit” in New York April 4-7.

New Prostate Cancer Test May Remove The Need For Painful Biopsies

April Flowers for redOrbit.com – Your Universe Online

Globally, over one million men a year have prostate tissue samples taken to check for signs of prostate cancer. The procedure can involve up to 12 large biopsy needles and has been called barbaric by Mayo Graduate School of Medicine Professor Matthew Gettman. Moreover, it reveals that 70 percent of the men undergoing it do not have cancer. Beyond being unnecessarily painful and risky for the patient, the procedure is prohibitively costly—approximately $2,500.

A team of researchers from Eindhoven University of Technology (TU/e) collaborated with the Academic Medical Center (AMC) Amsterdam to create a patient-friendly examination which has the potential to drastically reduce the need for biopsies. The findings will be presented later this month at the European Association of Urology Congress in Stockholm.

Every year, prostate cancer kills hundreds of thousands of men worldwide. At the beginning of any prostate cancer exam, clinicians measure the Prostate Specific Antigen (PSA) value in the blood. If the PSA is high, samples of prostate tissue are collected through the rectum at six to sixteen points for pathological testing. Despite high PSA values, 70 percent of those biopsied show no signs of cancer.

This doesn’t mean a high PSA level is a false alarm, however. It is possible the biopsy was taken at the wrong site.

Among those with negative results, 30 percent are later found to have cancer. Among patients whose biopsies do find cancer, the exact size of the tumor remains unknown; in many cases, subsequent surgery reveals tumors so small that the operation was not necessary. The biopsy procedure also leads to inflammation in up to five percent of patients.

Massimo Mischi, TU/e Associate Professor of Engineering, led the research team, which has developed a new technique for determining whether and where men have prostate cancer using existing ultrasound scanners. Ultrasound scanners use sound waves to create images of the body’s internal organs, but they are usually unable to show the difference between healthy and tumor tissue. Tumor tissue produces a large number of small blood vessels with a characteristic pattern in order to grow, and Mischi’s team used this pattern to make the tumor visible.

A single injection of contrast medium that contains tiny bubbles can be seen by the ultrasound scanner, right down to the smallest blood vessels. The computer uses advanced image-analysis techniques that can recognize the characteristic blood vessel pattern to generate an image showing the location of the tumor. The exam takes a total of one minute, and results are available moments later—without costly biopsy analysis.

The team compared the results of their “tumor images” with the actual prostates removed by surgery from 24 patients from three hospitals in the Netherlands. The comparison confirmed that the locations and sizes of the tumors were well indicated by the images.

TU/e has patented the new technique, which could help prevent the need for biopsies for millions of men around the world. For a large portion of the 70 percent of men who currently have biopsies performed unnecessarily, the images will eliminate the need altogether. For the 30 percent who do have cancer, fewer biopsies will be needed because the location and size of the tumor will already be clear. The researchers hope that the painful and costly biopsies can be eliminated altogether once sufficient clinical practice has validated the imaging method.

Next year, AMC Amsterdam and the two other participating Dutch hospitals will conduct a major comparative study between the old and new methods. Both methods will be used on at least 250 men, and if all goes well, the new technique will be made available in 2016 to more patients. The introduction of the new method will be quite simple, as most hospitals already have the ultrasound equipment necessary.

Stegodyphus lineatus

Stegodyphus lineatus is the only European species belonging to the spider genus Stegodyphus.

The males of this species are up to 12 millimeters long, while the females are up to 15 millimeters long. The coloration can range from whitish to almost black. In the majority of individuals, the opisthosoma is whitish with two broad black longitudinal stripes. The males and females look similar, but the male is usually richer in contrast and has a bulbous forehead. The species name is in reference to the black lines on the back of these spiders.

They construct a web between twigs, mostly of low thorny shrubs. The web has a diameter of about 30 centimeters and is attached to a retreat consisting of silk and covered with debris and food remains. The retreat is a cone-shaped structure roughly 5 centimeters long, which has an entrance at one end. The spiderlings hatch in this retreat, being released from their cocoon and then protected by the mother for an additional two weeks. The adult males can be found during the spring.

Offspring are matriphagous, meaning they eat their own mother. The females can mate with several males. While the females may gain some benefits from multiple matings, overall polyandry is costly and mated females are frequently aggressive towards the males. A male may encounter only one or two mates during his lifetime. Egg sacs are lost quite often because of predation by ants, and a female that loses her first clutch is unable to lay another. This represents an opportunity for the males, however, who can secure themselves a mate by simply disposing of her offspring. The males perform this action by detaching the egg sac with their chelicerae, moving it to the entrance of the retreat, then simply tossing it to the ground.

However, this behavior, known as infanticide, is not so straightforward for the females. Their fitness is greatly reduced by the loss of their young: a female that loses her clutch is less likely to survive until her next brood can hatch, and she lays fewer eggs the second time. Her offspring also hatch later in the season and are less likely to prosper. With this in mind, it comes as little surprise that the females aggressively defend their nests, chasing off around half of the males that come near. The males can sustain injuries in these encounters with the larger females, and in some cases the female not only kills but eats the male trespasser. For the males, however, there is little option but to take this risk, resulting in a sexual conflict that sees around 8 percent of egg sacs destroyed.

Image Caption: Stegodyphus lineatus – taken in La Azohia, Cartagena, Spain, April 2007. Credit: JoaquinPortela/Wikipedia (CC BY-SA 3.0)

Tracking Ecology Of Sperm Whales Through Stomach Contents

University of Massachusetts at Amherst

In the largest regional study of its type to date, marine ecologists offer a better understanding of the feeding ecologies of two very rare sperm whale species in waters off the southeast US coast, adding baseline data they say are important as climate change, fishing and pollution alter the animals’ environment and food sources.

“Understanding what resources support populations of these incredibly rare animals is important to conservation,” Staudinger, adjunct assistant professor in environmental conservation at the University of Massachusetts Amherst, says of the pygmy and dwarf sperm whales she studied. “If there are changes in the environment or their prey, we can now hope to better anticipate the potential impacts. There had been quite a knowledge gap about these animals, but this work gives us an idea of their ecological niche and requirements in the current environment.”

For the investigation, which used two complementary methods to characterize whale foraging ecology, Staudinger and colleagues at the University of North Carolina Wilmington (UNC) analyzed stomach contents collected by the marine mammal stranding network from 22 pygmy and nine dwarf sperm whales found dead on the mid-Atlantic coast between 1998 and 2011. Study results appear in the April issue of Marine Mammal Science.

These whales in the genus Kogia feed almost entirely on beaked squid, cephalopods whose bodies are digested in whale stomachs except for the hard beaks made of chitin, a fingernail-like substance. Staudinger explains, “All deceased stranded marine mammals are necropsied, and scientists save and evaluate the stomach contents. So the stranding network had a stockpile of stomachs collected over 13 years from two of the most commonly stranded whales along the southeast and mid-Atlantic coast.”

She adds, “Here I have to confess that I have a kind of unusual ability I learned in earlier research: I can identify cephalopod species by their beaks, a characteristic similar to birds. So when I heard about this study, I jumped at the chance to study these whales.”

Some cephalopod species she couldn’t recognize from her own reference samples, the marine ecologist noted, “so I went to the Smithsonian Institution’s collection, where there are hundreds of species in collections of whalers and fishermen dating back to the 1800s.”

Specifically, Staudinger and colleagues hoped to identify differences, if any, in ecological niches occupied by pygmy and dwarf sperm whales. These smaller cousins of the sperm whale were once thought to be a single species until modern analyses showed they are genetically distinct.

Beak analysis from cephalopod remains showed the diet of pygmy sperm whales to be more diverse than that of the dwarf species, the researchers report, and prey sizes were slightly larger for the pygmy than for the dwarf, but not statistically significantly so.

In the second analysis, they evaluated ratios of carbon and nitrogen isotopes in whale muscle samples, an indicator of which habitats the whales were feeding in: that is, the eco-zone (e.g., mesopelagic or bathypelagic) and approximate depths where the whales were feeding, and whether their diets contained prey high on the food chain, such as fish, or lower, such as small crustaceans. Staudinger says, “As far as we know, this is the first time the isotopic signatures have been published for dwarf sperm whales.”

She adds that isotopic tracer data suggest these two rare species, while not exactly the same, showed no significant differences in foraging parameters. “We found the ecologic niche of the two species is very similar in U.S. Atlantic waters, which is consistent with other global studies,” Staudinger summarizes. “The pygmy sperm whale consumes a greater diversity and size of prey, which means they may be diving deeper than dwarf sperm whales to feed, this makes sense because pygmy sperm whales grow to larger sizes than dwarf sperm whales, however, this could also be an artifact of small sample sizes.”

This is important information, Staudinger says, because if these two species show no evidence of resource partitioning there are likely enough food resources to support both their populations in the region. Though if resources shifted or became limiting, pygmy sperm whales would likely have an advantage over dwarf sperm whales as they show evidence of being able to exploit a wider range of food resources and habitats.

In the future, Staudinger plans to expand her expertise in deep-sea squid ecology through additional studies of marine mammals with UNC Wilmington, and a new investigation of cephalopod biodiversity on the Bear Seamount in the North Atlantic with the Smithsonian Institution, and the National Marine Fisheries Service National Systematics Laboratory.

The current work was funded by the National Oceanic and Atmospheric Administration and the Center for Marine Science at UNC Wilmington.

Genetic Screening For Endometriosis-Associated Ovarian Cancer

April Flowers for redOrbit.com – Your Universe Online

Endometriosis is a chronic inflammatory disease that affects more than 176 million women and girls worldwide, according to the Endometriosis Foundation of America. Despite being one of the most common gynecological disorders, there is no definitive consensus on the cause of endometriosis. To add insult to injury, some women who have endometriosis are also predisposed to ovarian cancer.

A new study from the University of Pittsburgh Cancer Institute (UPCI) and Magee-Womens Research Institute (MWRI) reveals that genetic screening could someday help clinicians to know which women are most at risk.

The research team will present their results on the first comprehensive immune gene profile exploring endometriosis and cancer on Monday at the American Association for Cancer Research (AACR) Annual Meeting 2014.

“A small subset of women with endometriosis go on to develop ovarian cancer, but doctors have no clinical way to predict which women,” said Anda Vlad, MD, PhD, assistant professor of obstetrics, gynecology and reproductive sciences at MWRI. “If further studies show that the genetic pathway we uncovered is indicative of future cancer development, then doctors will know to more closely monitor certain women and perhaps take active preventative measures, such as immune therapy.”

Endometriosis is a painful condition that is often misdiagnosed for years before some form of correct treatment is attempted. As redOrbit reported in February, it is called a disease of theories, because so little is known about how it works, or who it will strike.

“We know there is a genetic component, we know there is an environmental component, and we know there is an inflammatory component. But it’s very difficult to say for individual patients what particular sequence of events led to particular symptoms,” Michael Beste, a postdoc in MIT’s Department of Biological Engineering, said.

It is the genetic component, and its association to cancer, that Vlad and her team are focused on finding.

Vlad and her team screened tissue samples from women with benign endometriosis, women with precancerous lesions and women with endometriosis-associated ovarian cancer. This allowed the researchers to identify the complement pathway, which refers to a series of protein interactions that trigger an amplified immune response, as the most prominent immune pathway that is activated in both endometriosis and endometriosis-associated ovarian cancer.

“If, as our study indicates, a problem with the immune system facilitates cancer growth through chronic activation of the complement pathway, then perhaps we can find ways to change that and more effectively prime immune cells to fight early cancer, while controlling the complement pathway,” said Swati Maruti Suryawanshi, PhD, a post-doctoral research fellow at MWRI.

NASA’s LADEE Satellite Prepares For Planned Lunar Impact

April Flowers for redOrbit.com – Your Universe Online
NASA’s Lunar Atmosphere and Dust Environment Explorer (LADEE) launched in September 2013 on a mission to gather detailed information about the atmosphere of the Moon. LADEE, pronounced “laddie,” was also supposed to report back on conditions near the surface, as well as environmental influences on lunar dust. LADEE’s science mission is nearly over, with an impact on the lunar surface planned for April 21 of this year. But LADEE isn’t done yet, according to a recent NASA report.
Ground controllers at NASA’s Ames Research Center are gradually lowering the spacecraft’s orbital altitude to fly approximately one to two miles above the surface of the Moon. This will allow LADEE to gather science measurements at the lowest altitude possible before it runs out of fuel, forcing an orbital decay.
The ground control team is planning a final maneuver to ensure that LADEE’s trajectory will cause the impact to occur on the far side of the Moon, which is not in view of Earth or near any previous lunar mission landing sites. The margin for error, however, is small. Given the uncertainties in LADEE’s navigation system and the low orbital altitudes involved, even a small error could make the difference between an impact and the spacecraft remaining in orbit. For this reason, the team is not attempting to target a specific impact location.
“The moon’s gravity field is so lumpy, and the terrain is so highly variable with crater ridges and valleys that frequent maneuvers are required or the LADEE spacecraft will impact the moon’s surface,” said Butler Hine, LADEE project manager at Ames. “Even if we perform all maneuvers perfectly, there’s still a chance LADEE could impact the moon sometime before April 21, which is when we expect LADEE’s orbit to naturally decay after using all the fuel onboard.”
Until mid-April, LADEE’s altitude control thrusters will be fired once a week to keep the spacecraft in its target orbit. The final orbital maneuver will be performed on April 11, just ahead of the total lunar eclipse on April 15. During the four-hour eclipse, when Earth’s shadow passes over the Moon causing it to turn blood-red, LADEE will be exposed to conditions at the limits of what it was designed to withstand.
The eclipse will be easily observable with the naked eye over most of North America, CBS San Francisco reports. The color of the moon is a result of the sun’s light filtered through Earth’s atmosphere and projected onto the moon.
“If LADEE survives the eclipse, we will have nearly a week of additional science at low altitudes before impact,” said Rick Elphic, LADEE project scientist at Ames. “For a short mission like LADEE, even a few days count for a lot – this is a very exciting time in the mission.”
The ground control team will assess LADEE’s functionality after the eclipse to see if it is healthy enough to continue acquiring and transmitting data. If the spacecraft is able, this will continue as long as its altitude and contact with ground controllers allow.
“We’re very eager to see how LADEE handles the prolonged exposure to the intense cold of this eclipse, and we’ve used flight data to predict that most of the spacecraft should be fine,” said Hine. “However, the eclipse will really put the spacecraft design through an extreme test, especially the propulsion system.”
LADEE launched from NASA’s Wallops Island Flight Facility and reached lunar orbit on October 6, 2013. On November 10, the vending-machine sized spacecraft began gathering science data. LADEE began its primary 100-day science phase orbit on November 20 and has been in extended mission operations since February 28, 2014.
“Because the LADEE team has flawlessly performed every maintenance maneuver, they’ve been able to keep the spacecraft flying in its proper orbit and have enabled this amazing mission extension and science to continue up until the very end,” said Joan Salute, LADEE program executive at NASA Headquarters in Washington.
Aboard LADEE, three science payload instruments have taken more than 700,000 measurements in an effort to unravel the mysteries of the moon’s atmosphere. Previously, LADEE’s closest approach to the surface of the moon was between 12.5 and 31 miles, while the farthest was between 47 and 93 miles. This positioning allowed LADEE to pass from lunar day to lunar night every two hours, providing a unique vantage point on the full range of changes and processes occurring within the moon’s tenuous atmosphere.
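The two-hour day-to-night cycle mentioned above is simply the period of a low circular lunar orbit, which can be estimated from Kepler’s third law (the constants below are standard lunar values, not from the article, and the altitude is a representative figure from LADEE’s quoted range):

```python
import math

# Period of a low circular lunar orbit: T = 2*pi*sqrt(a^3 / GM)
GM_MOON = 4.9028e12   # m^3/s^2, Moon's gravitational parameter (standard value)
R_MOON = 1737.4e3     # m, mean lunar radius (standard value)
altitude = 50e3       # m, a representative LADEE altitude (~31 miles)

a = R_MOON + altitude                       # radius of the circular orbit
period = 2 * math.pi * math.sqrt(a**3 / GM_MOON)
print(f"orbital period: {period / 60:.0f} minutes")  # ~113 minutes
```

At just under two hours per orbit, the spacecraft crosses from lunar day to lunar night on roughly the cadence the article describes.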
The essential question driving LADEE’s data acquisition was whether lunar dust, electrically charged by sunlight, is responsible for the pre-sunrise glow detected above the lunar horizon during several Apollo missions. The science instruments are also gathering data about the structure and composition of the thin lunar atmosphere. Understanding these characteristics more thoroughly will help scientists understand other bodies in the solar system, such as large asteroids, Mercury, and the moons of outer planets.

Want to get involved? On Friday, NASA announced that it wants to hear your best guess on when LADEE will actually impact the lunar surface through the “Take the Plunge: LADEE Impact Challenge.” Winners will be announced after impact and will be e-mailed a commemorative, personalized certificate from the LADEE program. The submission deadline is 3 p.m. PDT Friday, April 11.

Hubble Data Shows True Size Of El Gordo Monster Galaxy

April Flowers for redOrbit.com – Your Universe Online | Source Material Provided by NASA
El Gordo is Spanish for “the fat one,” and is the nickname of the largest known galaxy cluster in the distant universe, ACT-CL J0102-4915. A new study from NASA’s Hubble Space Telescope reveals that the galaxy cluster, which is 9.7 billion light-years from Earth, definitely lives up to its nickname.
According to a NASA report, a team of astronomers measured how much the cluster’s gravity warps images of galaxies in the distant background. This allowed the scientists to calculate the cluster’s mass to be at least 3 million billion times the mass of our sun—roughly 43 percent more massive than was previously thought.
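As a quick check, the “43 percent more massive” figure pins down the earlier estimate (a sketch using only the numbers quoted above):

```python
# Recover the pre-Hubble mass estimate implied by the article's figures.
new_mass = 3.0e15    # solar masses: "at least 3 million billion times" the sun
increase = 0.43      # "roughly 43 percent more massive than previously thought"
old_mass = new_mass / (1 + increase)
print(f"implied earlier estimate: {old_mass:.2e} solar masses")  # ~2.1e15
```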
To measure how strongly the mass of the cluster warped space, the research team used Hubble’s high resolution, which allowed for measurements of “weak lensing.” Much like a funhouse mirror, weak lensing is seen when the cluster’s immense gravity subtly distorts space, warping the images of background galaxies. The more the images are warped, the greater the mass contained in the cluster.
“What I did is basically look at the shapes of the background galaxies that are farther away than the cluster itself,” explained James Jee of the University of California at Davis. “It’s given us an even stronger probability that this is really an amazing system very early in the universe.”
El Gordo’s mass is divided three ways. First, a small fraction of the mass is represented in the several hundred galaxies that make up the cluster. Second, a larger fraction of the mass is made up of the hot gas filling the entire volume of the cluster. The third and largest portion of the mass is found in dark matter, an invisible form of matter that makes up the bulk of the mass of the universe.
In the closer, and younger, parts of the Universe, there are equally massive galaxy clusters — for example, the Bullet Cluster. In the far distant reaches of the Universe, however, nothing like this has been discovered to exist so far back in time — approximately half the current estimated age of the Universe, 13.8 billion years. Based on current cosmological models, the researchers suggest that such massive galaxy clusters are very rare in the early Universe.
El Gordo’s impressive size was first reported in January 2012. Using observations from NASA’s Chandra X-Ray Observatory, and galaxy velocities measured by the European Southern Observatory’s (ESO) Very Large Telescope (VLT) array in Chile, astronomers estimated the mass of the galaxy cluster using the motions of the galaxies moving inside the cluster and the temperatures of the hot gas between those galaxies.
El Gordo presented a challenge, however. The cluster looks as if it might have been formed as a result of a titanic collision between a pair of galaxy clusters. Astronomers describe such an event as two cosmic cannonballs hitting each other.
“We wondered what happens when you catch a cluster in the midst of a major merger and how the merger process influences both the X-ray gas and the motion of the galaxies,” explained John Hughes of Rutgers University. “So, the bottom line is because of the complicated merger state, it left some questions about the reliability of the mass estimates we were making.”
This is where the team decided to include the data from Hubble.
“We were in dire need for an independent and more robust mass estimate given how extreme this cluster is and how rare its existence is in the current cosmological model. There was all this kinematic energy that was unaccounted for and could potentially suggest that we were actually underestimating the mass,” Felipe Menanteau of the University of Illinois at Urbana-Champaign said.
The scientists expected to find the “unaccounted energy” because the merger of the galaxy clusters is tangential to the observers’ line-of-sight, meaning that there is a possibility that the observers are missing a good fraction of the kinetic energy of the merger because their spectroscopic measurements only track the radial speeds of the galaxies.
The team intends to continue their observations, using Hubble data to compile an image of the cluster. This image will be a stitched together mosaic rather than a single image because El Gordo is too large to fit into Hubble’s field of view.
“We can tell it’s a pretty big El Gordo, but we don’t know what kind of legs he has, so we need to have a larger field of view to get the complete picture of the giant,” said Menanteau.

International Space Station Moves Away From Orbital Debris

NASA

On Thursday, flight controllers conducted a Debris Avoidance Maneuver to steer the International Space Station clear of orbital debris. Aboard the orbiting complex, the Expedition 39 crew prepared for the departure of a cargo craft Thursday and tackled a variety of experiments, including the checkout of a device that uses electrical impulses to keep muscles fit in the absence of gravity.

Playing it conservatively, flight controllers conducted a Pre-Determined Debris Avoidance Maneuver (PDAM) Thursday to raise the altitude of the International Space Station by a half-mile and provide an extra margin of clearance from the orbital path of a spent payload deployment mechanism from an old European Ariane 5 rocket.

NASA and Russian flight controllers tracked the Sylda Adapter for the past few days before jointly deciding to perform the maneuver, which used the ISS Progress 53 thrusters at the aft end of the Zvezda Service Module for a 3 minute, 40 second firing at 4:42 p.m. EDT that provided a reboost for the orbital laboratory.

The Ariane 5 payload deployment mechanism was forecast to pass within two-tenths of a mile of the station at 7:02 p.m. EDT had no action been taken. The six-man Expedition 39 crew was informed of the maneuver but was never in any danger and did not have to take shelter in their respective Soyuz return vehicles. The maneuver will have no impact on the upcoming launch of a new Russian Progress resupply vehicle on April 9 from the Baikonur Cosmodrome in Kazakhstan, which will bring almost three tons of supplies to the outpost, or on the pending launch of the SpaceX Dragon commercial cargo vehicle later this month from Cape Canaveral Air Force Station, Fla., to the station.


Russian Space Official Says NASA Statement To Suspend Contracts Was ‘Too Harsh’

Lawrence LeBlond for redOrbit.com – Your Universe Online
A senior space official with the Russian space program has criticized NASA’s move to suspend contracts with Roscosmos on several projects, stating that it will provoke a global backlash against space research and may harm future partnerships with the Americans.
Ivan Moiseyev, Director of the Space Policy Institute told RIA Novosti on Thursday that despite harming global space partnerships, this will not be the end of the Russian space program.
On Wednesday, April 2, NASA’s Michael O’Brien issued a statement announcing that the US space agency would be suspending several contracts it has with Russia due to the ongoing crisis between them and the Ukraine. He maintained that the suspension would not affect operations and activities pertaining to the International Space Station.
In light of that statement, Moiseyev went on the offensive:
“The statement was way too harsh,” he told RIA Novosti, warning that the move would have a “rather significant” impact on space exploration projects on a global scale.
“Modern space science is a global phenomenon that benefits all countries,” Moiseyev noted. “It means that many large-scale projects require an international effort. A freeze on cooperation will spur a serious backlash against the international space program.”
He added that this move was something Russia had not seen coming and would never want for space science. But, he maintained, the space agency would simply adjust to the new reality and move forward, despite the possibility of catastrophic repercussions that may follow.
Moiseyev told RIA Novosti in an earlier interview that Russia didn’t depend much on the US when it comes to the space industry.
As far as dependence goes, the US currently relies on Russia to maintain an American presence in space. When the Space Shuttle program retired in 2011, NASA’s only way of sending astronauts to the ISS was to pay Russia tens of millions of dollars for a seat on the Soyuz. NASA is now building partnerships with several privatized companies to bring manned space launches back to American soil.
Leading the way in the manned space race is SpaceX, which plans on having a manned space launch system ready as early as 2015. The company’s founder and CEO, Elon Musk, also dreams of one day sending humans to Mars. That reality is still a dozen or more years away.
Returning human spaceflight to American soil and ending reliance on Russia is a major goal for the White House as well.
“This has been a top priority of the Obama Administration’s for the past five years, and had our plan been fully funded, we would have returned American human spaceflight launches — and the jobs they support — back to the United States next year,” according to a statement from NASA, as cited by CNN.
“With the reduced level of funding approved by Congress, we’re now looking at launching from U.S. soil in 2017. The choice here is between fully funding the plan to bring space launches back to America or continuing to send millions of dollars to the Russians. It’s that simple,” continues the statement.
Russia’s annexation of Crimea from Ukraine has been condemned as a violation of international law by Western leaders, who have called on Moscow to pull back its tens of thousands of troops from around Ukraine’s eastern border.
Russia maintains the forces are there for military exercises and will return to base when those are finished.

The Trix Rabbit Has Eyes For Your Child: How Eye Contact Influences Consumer Cereal Choices

Lawrence LeBlond for redOrbit.com – Your Universe Online

The next time you go shopping you may want to stay out of the cereal aisle if you have your kids in tow. A new study from Cornell University has discovered that cereals marketed to children are shelved about 23 inches (on average) off the ground and feature spokes-characters that catch kids’ attention with their gazes.

Researchers Aner Tal and Brian Wansink from Cornell Food and Brand Lab worked with Aviva Musicus from Yale to determine if spokes-characters on cereal make eye contact and if the eye contact influences a child’s choice.

For the study, published in the journal Environment and Behavior, the team first conducted an experiment to determine whether the gaze angle of spokes-characters on children’s cereals would create eye contact with most children. The researchers surveyed 65 cereal brands in 10 different grocery stores in New York and Connecticut and found that, of 86 different spokes-characters evaluated, 57 were marketed to children and gazed downward at an average angle of 9.67 degrees.

In contrast, cereals geared toward adult shoppers were placed higher on the store shelves and had spokes-characters that most often looked straight forward or slightly upward at a 0.43-degree angle.

In the stores studied, most of the cereal brands geared toward children were placed on the bottom two shelves (avg. height of 20.21 inches), while the top two shelves were reserved for adult cereals (avg. height of 53.99 inches), a finding that correlates with previous studies on cereal placement.

In a second experiment, the researchers examined the extent to which eye contact influences feelings of trust and connection with the brand. The team recruited 63 individuals from a private northeastern university who were asked to view a box of Trix cereal and rate their feelings of trust and connection to that brand. The participants were randomly shown one of two versions of the spokes-character on the cereal: one with the rabbit looking straight ahead at the viewer and the other with the rabbit looking down.

When the rabbit on the box made eye contact, the team found that brand trust was 16 percent higher and connection to the brand was 28 percent higher. Participants also reported liking Trix better, compared to other cereals, when the rabbit made eye contact. This finding indicates that spokes-characters that make eye contact may increase positive feelings toward a product and encourage children and, as seen in this study, adults to buy that particular brand.

The findings in this study indicate that spokes-characters that make eye contact are part of a package design that can be used as a powerful advertising tool to influence consumer habits toward a brand.

“There are some cool things happening in grocery stores, many based on psychology, that impact how and what people purchase,” said Tal in a statement.

“If you are a parent who does not want your kids to go ‘cuckoo for Cocoa Puffs,’ avoid taking them down the cereal aisle. If you are a cereal company looking to market healthy cereals to kids, use spokes-characters that make eye contact with children to create brand loyalty,” said Wansink in a statement.

NASA Releases Image Of M-class Solar Flare

Karen C. Fox, NASA’s Goddard Space Flight Center
On April 2, 2014, the sun emitted a mid-level solar flare, peaking at 10:05 a.m. EDT, and NASA’s Solar Dynamics Observatory captured imagery of the event. Solar flares are powerful bursts of radiation. Harmful radiation from a flare cannot pass through Earth’s atmosphere to physically affect humans on the ground; however, when intense enough, flares can disturb the atmosphere in the layer where GPS and communications signals travel.
To see how this event may impact Earth, please visit NOAA’s Space Weather Prediction Center at http://spaceweather.gov, the U.S. government’s official source for space weather forecasts, alerts, watches and warnings.
This flare is classified as an M6.5 flare. M-class flares are one-tenth as powerful as the most intense flares, which are labeled X-class. The number after the M provides more information about the flare’s strength: an M2 is twice as intense as an M1, an M3 is three times as intense, and so on.
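The arithmetic of the scale is simple enough to sketch in a few lines of code. The sketch below is purely illustrative (the helper function is our own, not NASA tooling); the per-letter baselines are the standard GOES 1-8 Ångström peak X-ray flux conventions:

```python
# Peak X-ray flux baselines (W/m^2) for each GOES flare class letter.
# Each letter class is 10x the previous; the trailing number is a
# linear multiplier, so an M2 is twice as intense as an M1.
CLASS_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def flare_flux(label: str) -> float:
    """Convert a flare label such as 'M6.5' to peak X-ray flux in W/m^2."""
    letter, multiplier = label[0].upper(), float(label[1:])
    return CLASS_FLUX[letter] * multiplier

# The M6.5 flare described above sits well up the M range;
# an X1 flare would still be roughly 1.5x stronger.
print(flare_flux("M6.5"))                     # peak flux in W/m^2
print(flare_flux("X1") / flare_flux("M6.5"))  # how much stronger an X1 is
```

By this reckoning, an M6.5 flare peaks at 6.5 × 10⁻⁵ W/m², about two-thirds of the way to the X-class threshold.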
Updates will be provided as needed.

FDA Approves Sublingual Pill To Treat Hay Fever Symptoms

redOrbit Staff & Wire Reports – Your Universe Online

In a move that could replace allergy shots for some men and women, the US Food and Drug Administration (FDA) announced on Wednesday that it had approved the first tablet for the treatment of hay fever symptoms.

The product, known as Oralair, will only work against certain types of grass pollen and will take several months before it starts working, according to USA Today’s Kim Painter – meaning that it will not help people with other types of allergies and will not come in time to provide relief for early symptoms this summer.

Oralair is manufactured by the French pharmaceutical company Stallergenes and is the first under-the-tongue (sublingual) allergen extract to be approved in the US, the FDA said. It contains a cocktail of freeze-dried pollen extracts from five different grasses (Kentucky Blue Grass, Orchard, Perennial Rye, Sweet Vernal and Timothy) and can be used to treat allergic rhinitis with or without eye inflammation.

The medication is a once-daily, rapidly-dissolving tablet that is to be started four months prior to the grass pollen season and continued throughout the entire hay fever season, the American federal health agency added. The first dose is to be administered at a healthcare provider’s office, where medical professionals can observe patients for 30 minutes for potential side effects. Subsequent doses can be taken at home.

“While there is no cure for grass pollen allergies, they can be managed through treatment and avoiding exposure to the pollen,” said Dr. Karen Midthun, director of the FDA’s Center for Biologics Evaluation and Research. “The approval of Oralair provides an alternative to allergy shots that must be given in a health care provider’s office. Oralair can be taken at home after the first administration.”

Studies evaluating the safety and effectiveness of Oralair were conducted both in the US and Europe and involved about 2,500 people. Some of those patients received the new allergy tablet, while others were given a placebo.

Patients then reported their symptoms and the additional medications required to make it through the grass pollen season. During the course of one season, patients who took Oralair experienced a 16 to 30 percent reduction in symptoms and in the need for additional medications compared to those who received placebos, the FDA reported.

Dr. James Li, chairman of the division of allergy and immunology at Mayo Clinic in Rochester, Minnesota, told Painter that the success rate was slightly lower than allergy shots in studies, though the two treatment programs have yet to be directly compared to each other.

“The pills can cause some side effects: In studies, one third of patients developed itchy mouths and some reported throat irritation,” Painter said. She also notes that experts believe the fact that the pill is only effective against one type of allergy is a drawback, as most patients suffering from these types of symptoms are also affected by other types of pollens and environmental allergens such as pet dander and dust mites.

The Associated Press (AP) reports that Oralair has been approved for patients between the ages of 10 and 65. The medication was first approved for use in Europe six years ago, and is currently available in 31 different countries, including Australia, Canada and Russia.

Touch Makes A Female Cockroach Reproduce Faster

North Carolina State University

To speed up reproduction, there’s no substitute for the tender touch of a live cockroach.

That’s the major takeaway from a North Carolina State University study examining whether artificial antennae – in this case, duck feathers – can mimic a cockroach antenna’s capacity to hasten reproduction in cockroach females.

Female cockroaches that get “touched” – by other female cockroaches and, under certain conditions, even by duck feathers that mimic roach antennae – reproduce faster than female roaches that live in isolation or without tactile stimulation.

Pairing two cockroaches together – even roaches of different species – speeds up reproduction the most.

“To understand the mechanisms behind tactile stimulation and reproduction, we devised a motor-driven system using duck feathers as stand-ins for cockroach antennae. We found that these artificial antennae worked to stimulate certain hormones that speed up reproduction in the female German cockroach,” says Dr. Coby Schal, Blanton J. Whitmire Professor of Entomology at NC State and the senior author of a paper describing the research. “We also found that the shape of the artificial antenna doing the ‘touching’ and the speed and duration of the stimulation were key factors that influenced reproduction speed.”

Female cockroaches gain the capacity to produce eggs when they become adults, so reproduction speed can be described as the duration of time between the onset of adulthood and the first bout of egg-laying. Stimulating juvenile hormone production in adult female cockroaches accelerates growth of their eggs. When the eggs grow to a certain size, female cockroaches lay the eggs. Faster egg growth, therefore, translates into faster egg-laying. Differences between speedy reproduction and slower reproduction could be several days. And speedy reproduction leads to bigger infestations.

To elucidate the role that tactile sensation plays in reproduction, the researchers, led by first author Adrienn Uzsak, a former Ph.D. student in Schal’s lab, performed a number of experiments. Throughout all the experiments, the researchers showed that isolating female roaches, or exposing them to dead roaches, slowed the reproductive process.

Isolating a female cockroach in a Petri dish while allowing another cockroach’s antennae to protrude into the dish sped up reproduction greatly, but cutting the interloper’s antennae down to nubs stifled reproduction speed to that of roaches in isolation.

The researchers also developed a motorized chamber that rotated a duck feather inside a Petri dish to act as a surrogate roach antenna. The researchers changed both the speed of the rotating feather and the duration of stimulation. Short bursts of stimulation with slow motor speeds led to faster reproduction, while longer stimulation bouts with a fast-moving feather slowed reproduction.

Finally, the researchers used different types of duck feathers in the motorized chamber, contrasting longer, barbed feathers with shorter, unbarbed feathers. Longer, barbed feathers stimulated faster reproduction.

“In studies over the years, we’ve learned how challenging it is to pinpoint the role of tactile stimulation in reproduction, although we knew from the start that it was important,” Schal said. “Now we’ve learned that, under the right conditions, artificial antennae – including some that are quite different from cockroach antennae – can speed up reproduction, clarifying the importance of antennal contact and the role of antennal movement. Now that we have an experimental approach for consistent tactile stimulation, our next challenge is to understand how these stimuli cause females to produce more of the hormone that accelerates reproduction.”

The study appears online in Proceedings of the Royal Society B. James Dieffenderfer and Dr. Alper Bozkurt, from NC State’s Department of Electrical and Computer Engineering, co-authored the paper. Funding from the U.S. Department of Agriculture, the National Science Foundation and the Blanton J. Whitmire endowment supported the work.

Most Children Are Confused By The Message In Healthy Fast Food Ads

[ Watch the Video: Fast Food Ads For Healthier Kids Meals Don’t Send Right Message ]

Brett Smith for redOrbit.com – Your Universe Online

Fast food companies have been making a push over the past few years to include healthier options for their kids’ meals, but that message may not be getting across – especially to the kids themselves.

A new study published on Monday in the journal JAMA Pediatrics found that between one-third and one-half of children do not properly identify the healthier milk and apple-slice options shown in freeze-framed television advertisements.

In one ad image used in the study, Burger King apple slices were depicted inside an oval container typically used for french fries – and identified as such by kids 90 percent of the time.

“And I see some…are those apple slices?” one child asked the researcher showing her the images.

“I can’t tell you…you just have to say what you think they are,” the researcher replied.

“I think they’re french fries,” the child said.

“Burger King’s depiction of apple slices as ‘Fresh Apple Fries’ was misleading to children in the target age range,” said study author Dr. James Sargent, co-director of the Cancer Control Research Program at Dartmouth College’s Norris Cotton Cancer Center. “The advertisement would be deceptive by industry standards, yet their self-regulation bodies took no action to address the misleading depiction.”

The study team said their research, which included 99 children ages 3 to 7 years old, was designed to look at the perception of McDonald’s and Burger King campaigns to advertise apples and milk in their kids’ meals. The team looked at ads from these companies targeted at children from July 2010 through June 2011. In this study, scientists used “freeze frames” of Kids Meals displayed in TV ads that were frequently shown on Cartoon Network, Nickelodeon, and other cable channels that feature children’s programming.

Of the four healthy food depictions examined, only McDonald’s display of apple slices was recognized as an apple product by a majority of the target audience (52 percent), regardless of the child’s age. The scientists concluded that the other three representations were simply examples of poor communication from the companies.

The new study builds on an earlier analysis from the same team, which discovered that McDonald’s and Burger King children’s advertising highlighted free gifts like toys to cultivate children’s brand awareness for a particular fast food restaurant. These campaigns persist despite self-imposed guidelines from the fast food industry designed to deter the practice.

While the Food and Drug Administration and the Federal Trade Commission have critical regulatory roles over food labels and advertising, the Better Business Bureau runs a self-regulatory program for children’s advertising. Two distinct programs offer directions to make children’s advertising centered on the food, not toys – and, more explicitly, on foods with significant nutritional value.

“The fast food industry spends somewhere between $100 and $200 million a year on advertising to children, ads that aim to develop brand awareness and preferences in children who can’t even read or write, much less think critically about what is being presented,” Sargent said.

Your Hand Soap Could Be Doing You More Harm Than Good

Rebekah Eliason for redOrbit.com – Your Universe Online

Antimicrobial household products are widely used, but scientists are wondering if they are doing more harm than good. A new study from Arizona State University has assembled evidence suggesting that antimicrobial use brings consumers no measurable benefit.

Even more troubling, according to recent research, the lax regulation of these products has allowed toxic compounds to spread throughout wildlife and human populations, contaminating the environment.

After leaving the issue unresolved for 40 years, the Food and Drug Administration (FDA) has reevaluated the safety of additives in the most common antibacterial household products. Specifically, the FDA studied triclocarban (TCC) and triclosan (TCS), chemicals commonly used in soaps and toothpaste.

Rolf Halden, who has tracked the topic for years, said, “It’s a big deal that the FDA is taking this on.” Halden is the director of the Center for Environmental Security, which is a joint research hub created with support from Arizona State University’s Biodesign Institute, Fulton Schools of Engineering and the Security and Defense Systems Initiative.

To address safety and environmental concerns, the FDA has stipulated that manufacturers demonstrate the safety of these additives within one year or remove them from their products entirely. The FDA rule is open for public comment until June.

“The FDA’s move is a prudent and important step toward preserving the efficacy of clinically important antibiotics, preventing unnecessary exposure of the general population to endocrine disrupting and potentially harmful chemicals, and throttling back the increasing release and accumulation of antimicrobials in the environment,” said Halden.

TCC was introduced to the market in 1957, followed not long after by TCS in 1964.

“This multi-billion dollar market has saturated supermarkets worldwide and vastly accelerated the consumption of antimicrobial products,” wrote Halden in a paper published in Environmental Science & Technology. “Today, TCC and more so TCS can be found in soaps, detergents, clothing, carpets, paints, plastics, toys, school supplies, and even in pacifiers, with over 2,000 antimicrobial products available.”

When used properly in healthcare settings, antimicrobial soaps are effective products. The same products, however, are largely ineffective in households because most people do not use them as intended. To be effective, public health officials recommend scrubbing the hands with soap for about 20-30 seconds – roughly the time it takes to sing a verse of “Row, row, row your boat.”

Halden estimated that consumers wash their hands for a short and ineffective six seconds on average. All this accomplishes is spreading TCC and TCS throughout the environment and exposing wildlife to their harmful effects.

Halden has designed sophisticated detection methods using modern research technology to study the consequences of antimicrobial use on human health and the environment. Halden’s research has added to the increasing amount of worldwide scientific evidence regarding the damage of TCC and TCS.

The research team discovered the following facts about antimicrobial compounds:

TCC and TCS account for 60 percent of all the drugs detected in wastewater treatment plant sludge. Because the chemicals do not easily degrade, they have persisted in US sediments for more than 50 years. Both chemicals have endocrine-disrupting and immunotoxic effects, and they contaminate lakes and rivers, exposing aquatic organisms over their entire lifetimes.

Approximately 310,000 lbs/yr of TCC and 125,000 lbs/yr of TCS are inadvertently applied to US agricultural land as a result of sewage sludge disposal, presenting a pathway for the contamination of food with antimicrobials and drug-resistant microbes. Traces of toxic dioxin are present in commercial-grade TCS, and additional dioxins are known to form upon disposal down the drain and during sludge incineration.

These facts cover merely the environmental and wildlife consequences. Dangers to human health include promoting the development of drug-resistant infections and altering hormone levels in developing children, which can possibly lead to the early onset of puberty.

According to the Centers for Disease Control (CDC), the chemicals are found in the urine of three-quarters of Americans. Even more troubling, an industry funded study detected TCS in the breast milk of 97 percent of US women tested.

Regulating TCS and TCC in the United States has been challenging. A single umbrella guidance document from 1974 – the FDA’s topical antimicrobial drug products Over-the-Counter (OTC) Drug Monograph – attempted to regulate all uses and best practices.

Although 2014 marks the 40th anniversary of the monograph’s issuance, the document has still not been finalized to protect consumers from the toxic effects of TCC and TCS.

According to Halden, the ultimate solution to current antimicrobial issues is innovation. He envisions ‘green’ next-generation antimicrobials that offer broad-spectrum effectiveness against pathogens while possessing low toxicity and a lower potential for fostering antimicrobial drug resistance. It would also be important that they degrade rapidly in wastewater treatment plants to limit unwanted exposure and contamination of the environment.

Development of such a product has the potential to be a multi-billion dollar and highly competitive industry.

“Sustainability considerations already are informing the design of green pharmaceuticals and adopting this approach for antimicrobials promises to yield important benefits to people and the planet,” he concludes in the ES&T paper.

In the interim, Halden will travel to Washington, D.C. to exchange information with scientists and lawmakers at the FDA and the U.S. Environmental Protection Agency.

Australia’s Dingo Is One-of-a-Kind

April Flowers for redOrbit.com – Your Universe Online

When I think of a dingo, I think of a sad scene in the Tom Selleck movie, “Quigley Down Under” where the dingoes are attacking and Cora has to decide if she will repeat her mistakes, or let the aboriginal child cry. Like many of you, I’m sure, I considered them to be just another version of the coyote.

A new study from the University of New South Wales (UNSW) and the University of Sydney, however, reveals that the dingo is a distinctly Australian animal. The findings, published in the Journal of Zoology, shed new light on the dingo’s defining physical characteristics. The research team also resurrects the name Canis dingo, which was first given to the species by German naturalist Friedrich Meyer in 1793.

The research team, led by UNSW’s Dr. Mike Letnic and Dr. Mathew Crowther of the University of Sydney, explain that the confusion over whether the dingo is a distinct species began, in part, with the scientific classification of the Australian dingo, which was based on a simple drawing and description in the journal of Australia’s first governor, Arthur Phillip. This classification had no reference to a physical specimen.

Finding a specimen of a dingo that was unlikely to have bred with domestic dogs was a challenge. The team searched museum collections across Europe and the US, looking for specimens that were known, or were likely, to pre-date 1900, including those from archaeological sites.

To create a benchmark description of the dingo, the team examined 69 skull specimens and six skin specimens. They found that a relatively broad head, a long snout, and erect tail and ears were the defining physical characteristics of the species.

“Now any wild canid – dingo, dog, or hybrid of the two – can be judged against that classification,” says Dr. Crowther, from the University of Sydney’s School of Biological Sciences.

“We can also conclusively say that the dingo is a distinctive Australian wild canid or member of the dog family in its own right, separate from dogs and wolves. The appropriate scientific classification is Canis dingo, as they appear not to be descended from wolves, are distinct from dogs and are not a subspecies.”

Dr. Letnic, from the UNSW School of Biological, Earth and Environmental Sciences, comments, “Many Australians like to think that dingoes are always yellow and that animals with any other coloration are not dingoes. This is untrue.”

“One of our insights is that coat color does not define an animal as a dingo, dog or a hybrid. We found that dingoes can be tan, dark, black and tan, white, or can have the sable coloration typical of German Shepherd dogs.”

Dingoes are Australia’s largest land predator, and as such, have an important role to play in conservation. Dingoes regulate the populations of species such as kangaroos, wallabies and invasive red foxes. The clearer identification provided by this study will allow researchers to develop a sounder understanding of dingo numbers. In turn, this will improve the understanding of the dingo’s role in biodiversity.

“Distinguishing dingoes from their hybrids (cross-breeds) with feral dogs is a practical concern. Current policies in parts of Australia support the conservation of dingoes but the extermination of ‘dingo-dogs’, which are considered a major pest because they kill livestock,” says Dr. Crowther.

Genetic evidence suggests that the dingo, introduced to Australia around three to five thousand years ago, originated from East Asian domestic dogs. Until the arrival of domestic dogs with the European settlement of Australia, the dingo bred in isolation, becoming a distinct breed.

“That made distinguishing dingoes from dogs problematic, as the DNA tests and analyses of their physical structure were based on dingoes whose ancestry was not known. They were either captive animals or wild animals of uncertain ancestry,” concluded Dr. Crowther.

Customers Prefer Restaurants That Offer Nutritional Info, Healthier Menu Options

redOrbit Staff & Wire Reports – Your Universe Online

Despite concerns that disclosing dietary information would drive away customers, chain restaurants are actually more likely to be frequented by patrons when they disclose the nutritional content of their products and offer healthier menu choices, according to a recently published study.

Researchers from Penn State University and the University of Tennessee recruited 277 frequent restaurant patrons and presented them with a variety of different scenarios to determine whether or not those individuals preferred access to nutritional information and/or healthier food options. Their findings were published in last month’s edition of the International Journal of Hospitality Management.

Each participant was asked to read sample menus that presented each possible scenario. Next, they answered a series of questions related to how they perceived the restaurant’s attitude and corporate social responsibility, as well as their own willingness to eat at each establishment and how health conscious they tended to be.

The study authors discovered that when participants were provided with a scenario in which a restaurant offered nutritional information and included healthier food products on its menu, those people were significantly more likely to view that restaurant as a socially responsible entity.

“The Affordable Care Act has mandated that chain restaurants – those with more than 20 restaurants – provide nutrition information to customers,” said associate professor of hospitality management David Cranage. “Many restaurants had been fighting this legislation because they thought they would lose customers if the customers knew how unhealthy their food was.”

“In this study, we found that customers perceive restaurants to be socially responsible when they are provided with nutrition facts and healthful options and, therefore, are more likely to patronize those restaurants,” he added. “In other words, the participants developed a favorable attitude toward the restaurant and wanted to visit it more frequently.”

Furthermore, Cranage and his colleagues found that men and women who said that they were health conscious were more likely than non-health conscious people to view a restaurant that provided healthy food options as socially responsible. However, when a restaurant allowed them to see nutritional information, both health conscious and non-health conscious people were more likely to view that establishment as socially responsible.

“These results suggest that highly health-conscious people are more sensitive to being able to obtain healthful foods at restaurants than less health-conscious people, regardless of whether or not nutrition information is provided,” said Cranage. “We believe that providing healthful foods and nutrition information can improve a restaurant’s image.”

“Often, managers must choose between profitability and social responsibility when making decisions,” he added. “However, results of this study indicate that deciding to provide nutrition information and healthful food items yields benefits from both perspectives. Based on results of this study, restaurateurs may make an easy decision to increase more healthful items on their menu while simultaneously increasing the image of their business.”

Twitter Mapping Project Reveals Regional Preferences Of US Beer Drinkers

redOrbit Staff & Wire Reports – Your Universe Online
Thanks to a recent Twitter mapping project, experts from the University of Kentucky have managed to determine that adults living in the Eastern half of the United States prefer to drink Bud Light, while Coors Light is the preferred brand of beer for those living in the Western half – especially in and around the state of Colorado.
In addition, associate professor of geography Matthew Zook and PhD student Ate Poorthuis used recent social media posts and geotagging to discover that residents of the Midwest and Great Plains preferred Miller Lite, and those living in the southern border regions appeared to favor the flavor of brands such as Corona and Dos Equis.
Zook and Poorthuis looked at a million location-tagged tweets containing keywords such as “beer” and “wine.” The tweets, which were sent between June 2012 and May 2013, mentioned several best-selling, well-known and/or inexpensive brands of primarily light and pale lagers.
“The Twitter maps quite accurately reflect various regions’ history and cultural practices surrounding beer production and consumption, and show just how much reality and cyberspace overlap,” Zook said in a statement Tuesday. The findings were included in a chapter of the new book The Geography of Beer.
“Beer, like many other social practices, may be millenniums old but the socio-spatial practices associated with it – checking into a brewery, posting a review, geotagging a photo – continue to evolve and therefore our approaches to data and research must also evolve to capture these geographies,” added Poorthuis.
While the researchers said they were not surprised by the popularity of Bud Light and Coors Light on the microblogging service, they noted that the geographic concentration of the latter brand in the Western US showed how tweets can reveal overall preferences in different states or regions. They found this pattern was even more pronounced when examining Yuengling, Grain Belt, Busch Light and other brands with smaller market shares.
The tweet-mapping project also revealed a regional variation in preference for beer or wine. The majority of wine-related tweets were sent from wine-growing regions such as northern and central California, Oregon and Washington. Residents of both US coastal regions were also found to tweet more often about wine, while people from several Midwestern states, Kansas, Oklahoma and Texas posted primarily about beer.
The Geography of Beer, which was edited by Mark Patterson and Nancy Hoalst Pullen, also contains sections on the geography of beer in ancient Europe, American breweries between 1776 and 2012, the geography of microbreweries and brewpubs in the US, the economic and cultural craft-brewing explosion that has taken place since 1985, and efforts by the beer-making industry to become more environmentally friendly.
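
As a rough illustration of the kind of brand tally that underlies such maps, the sketch below counts brand mentions per state using simple substring matching; the tweets, states and counts are invented examples, not data from the study.

```python
from collections import Counter

# Hypothetical geotagged tweets as (state, text) pairs -- invented examples.
tweets = [
    ("KY", "Enjoying a cold Bud Light at the game"),
    ("CO", "Nothing beats a Coors Light after a hike"),
    ("CO", "Coors Light on the patio tonight"),
    ("KS", "Miller Lite with the neighbors"),
]

BRANDS = ["bud light", "coors light", "miller lite"]

def tally_brands(tweets):
    """Count brand mentions per (state, brand) pair via substring matching."""
    counts = Counter()
    for state, text in tweets:
        lowered = text.lower()
        for brand in BRANDS:
            if brand in lowered:
                counts[(state, brand)] += 1
    return counts

counts = tally_brands(tweets)
print(counts[("CO", "coors light")])  # 2
```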

New Tool Helps Young Adults With Sickle Cell Disease In The Transition To Adult Care

Child and adolescent hematologists at Boston Medical Center (BMC) have developed a tool to gauge how ready young adults with sickle cell disease are for a transition into adult care. In a new article for the Journal of Pediatric Hematology/Oncology, Amy Sobota, MD, MPH, and her collaborators have shown that a questionnaire geared to the needs of young adults with sickle cell disease can pinpoint areas of need before the patient goes into an adult clinic.
BMC’s sickle cell disease transition clinic, which is unique in Boston, was established in 2008 and serves approximately 45 patients.
Sickle cell disease is a disorder of hemoglobin, the molecule in red blood cells that carries oxygen to the tissues. Due to a genetic mutation, sickle cell patients make red blood cells that are shaped like a crescent or “sickle.” These patients are often anemic and can suffer bouts of extreme pain when sickled red blood cells become caught in small vessels of the body. Sickle cell disease traditionally has had a high mortality rate; however, children with sickle cell disease are now living longer, healthier lives thanks to early diagnosis and effective treatment.
These welcome changes have given new importance to the young patient’s point of transition into adult care.
Previous studies have shown that patients with SCD who are transitioning from pediatric to adult care have more admissions and emergency department visits. “We saw that these patients had specific needs, and that is why we started the transition clinic at BMC,” said Sobota, who is an attending in pediatric hematology/oncology at BMC and an assistant professor of pediatrics at Boston University School of Medicine.
To determine the tool’s efficacy, the researchers looked at the answers provided by 33 patients between the ages of 18 and 22 who completed the assessment. A majority, 97 percent, of the respondents said they could explain sickle cell disease to another person and that they understood “how they got” the genetic disease, and 94 percent understood that sickle cell disease might be passed on to their children.
All of the patients said that they planned to attend college or obtain post-high-school training, but only 70 percent knew where to find information about job training and opportunities. Sixty-four percent of transitioning patients said they understood the various types of health insurance available to them, but only 13 percent had drawn up a portable medical history form that they could give to adult healthcare providers. Encouragingly, 97 percent of young sickle cell patients reported having a good social support system.
Finally, patients were asked about their ability to manage independent living: 73 percent had some job experience, full- or part-time. Although all of the patients were 18 or over, only 79 percent said they were already going to doctor’s appointments on their own. Notably, few mentioned having anxiety about transitioning to adult care.
“Our study indicates that this assessment tool – the only one of its kind – provides important information to physicians of patients with sickle cell disease who are transitioning from pediatric to adult care,” said Sobota. “Caregivers can use this information from patients in order to effectively tailor and guide their treatment and education through this transition.”

Heart Health As Young Adult Linked To Mental Function In Mid-Life

Study Highlight:
-Having blood pressure, blood sugar and cholesterol levels slightly higher than the recommended guidelines in early adulthood is associated with lower cognitive function in mid-life.
Being heart healthy as a young adult may increase your chances of staying mentally sharp in mid-life, according to new research in the American Heart Association journal Circulation.
In a 25-year study of 3,381 people aged 18 to 30, those with blood pressure, blood sugar and cholesterol levels slightly higher than the Association’s recommended guidelines scored lower on cognitive function tests in their 40s and 50s. Standardized scores on three cognitive tests were between 0.06 and 0.30 points lower, on average, for each standard deviation increase in cumulative exposure to these risk factors, which the researchers considered significant for this age group. (Standard deviation is a measure of variation from the average.)
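
To make “per standard deviation increase” concrete, here is a minimal, self-contained illustration of expressing a risk-factor reading in standard-deviation units; the blood pressure values are invented, not taken from the study.

```python
import statistics

# Invented systolic blood pressure readings for a small cohort (mm Hg).
readings = [110, 115, 120, 125, 130]

mean = statistics.mean(readings)   # 120.0
sd = statistics.pstdev(readings)   # population standard deviation, ~7.07

def z_score(value):
    """Express a reading in standard-deviation units above/below the mean."""
    return (value - mean) / sd

# A reading exactly one SD above the mean has a z-score of 1.0. In the study,
# each such unit of cumulative risk-factor exposure corresponded to cognitive
# test scores 0.06-0.30 points lower on average.
print(round(z_score(mean + sd), 2))  # 1.0
```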
“It’s amazing that as a young adult, mildly elevated cardiovascular risks seem to matter for your brain health later in life,” said Kristine Yaffe, M.D., study author and a neuropsychiatrist, epidemiologist and professor at the University of California-San Francisco. “We’re not talking about old age issues, but lifelong issues.”
This is one of the first comprehensive long-term studies looking at key heart disease and stroke risk factors’ effects on cognitive function in this age group. Prior research showed similar effects of mid-life and late-life cardiovascular health on brainpower in late life.
The study was part of the ongoing multi-center Coronary Artery Risk Development in Young Adults (CARDIA) Study. Participants had their blood pressure, fasting blood sugar and cholesterol levels checked every two to five years. Researchers analyzed each person’s cumulative cardiovascular health over 25 years. The American Heart Association defines ideal cardiovascular health as systolic blood pressure <120 mm Hg, diastolic blood pressure <80 mm Hg, blood sugar <100 mg/dL, and cholesterol <200 mg/dL.
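
The Association’s cutoffs quoted above translate directly into a simple check. This is an illustrative sketch only; the function and argument names are my own, not part of the study.

```python
def ideal_cardiovascular_health(systolic, diastolic, glucose, cholesterol):
    """Return True if all readings meet the AHA 'ideal' cutoffs cited above.

    systolic/diastolic in mm Hg; fasting glucose and total cholesterol in mg/dL.
    """
    return (systolic < 120 and diastolic < 80
            and glucose < 100 and cholesterol < 200)

print(ideal_cardiovascular_health(118, 76, 92, 180))  # True
print(ideal_cardiovascular_health(122, 76, 92, 180))  # False (systolic too high)
```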
At the end of the study, participants took three tests measuring memory, thinking speed and mental flexibility.
Elevated blood pressure, blood sugar and cholesterol are three major risk factors for atherosclerosis, the slow narrowing of arteries caused by a build-up of plaque in the artery walls leading to the brain and heart.
The narrowing of the arteries leading to and in the brain is the most likely explanation for the link between cardiovascular health and cognitive function, Yaffe said.
“Our study is hopeful, because it tells us we could maybe make a dent in the risks of Alzheimer’s and other forms of dementia by emphasizing the importance of controlling risk factors among younger people,” she said.
Co-authors are Eric Vittinghoff, Ph.D.; Mark Pletcher, M.D., M.P.H.; Tina Hoang, M.S.P.H.; Lenore Launer, Ph.D.; Rachel Whitmer, Ph.D.; Laura Coker, Ph.D.; and Stephen Sidney, M.D. Author disclosures are on the manuscript.
The study is funded in part by the National Heart, Lung, and Blood Institute and Kaiser Foundation Research Institute.

Movies Show Black Police Officers Good For Entertainment Only

Depictions of African-American police officers in television and movies may have a real-world impact on police officers and the citizens that they serve, according to a new research project by Sam Houston State University associate professor of criminal justice Howard Henderson and Franklin T. Wilson, from Indiana State University.

While the presence of African-American officers has been shown to increase the perceived legitimacy of police departments, the criminologists believe that media depictions of African-American officers may play a role in delegitimizing African-American officers, both in the eyes of the general public and the African-American community, in particular.

The recent box office success of the comedy “Ride Along,” starring Ice Cube and Kevin Hart, and the 2013 cancelation of the television drama “Ironside,” starring Blair Underwood, represent the most recent examples of an established trend, according to the criminologists.

In their recently released study, “The Criminological Cultivation of African-American Municipal Police Officers: Sambo or Sellout,” Henderson and Wilson reported that African-American city police officers have rarely been depicted as leading characters in theatrically released films over the first 40 years of the cop film genre.

When depicted, African-Americans are overwhelmingly portrayed as comedic entertainment while white officers are not.

Published in Race and Justice, the official journal of The American Society of Criminology Division on People of Color and Crime, the article explores the impact such depictions might have on the recruitment, retention, and public perceptions of African-American city police officers.

“Given the racially charged nature of this past year with instances like the Paula Deen case, the Trayvon Martin verdict, the recent ‘Loud Music Case’ of Michael Dunn, among others along with the profit driven nature of entertainment media, I fear the pattern we have discovered may not be a matter of negligence on the part of Hollywood,” said Wilson, an assistant professor of criminal justice, who led the research. “Instead, it may be a reflection that many United States citizens are not ready to accept an African-American in a serious authoritative role.”

The study of 112 films revealed that white officer depictions dominated the genre, appearing in the lead or joint leading roles in 89 percent of the films; African-American officers were depicted in 19 percent of the films, while other minorities only appeared in 3 percent of the films.

While the study examined films released after 1971, arguably the start of the modern cop film genre with “Dirty Harry,” 95 percent of African-American depictions did not occur until after 1987.

White officers were teamed with an African-American in only 9 percent of the films that depicted a white officer in a leading role. In contrast, of the films that depicted an African-American officer in a leading role, 52 percent paired the officer with another officer, and in all but one case that partner was white.

In addition to the 40-year domination of white officer depictions and the apparent requirement of a white costar to justify an African-American in a leading role, the study revealed that 52 percent of African-American officer depictions portrayed the officer as comedic entertainment. White officers were portrayed comedically in only 17 percent of depictions, a figure that falls to just 3 percent when films in which the white officer is teamed with a minority officer or minority civilian are excluded.

“Most people view films one at a time and do not consider depictions as a whole; this is what sets our study apart,” Henderson said.

“When Roger Murtaugh (Danny Glover) from the ‘Lethal Weapon’ series is depicted stranded on his toilet due to a bomb being placed under it or being asked to strip down to his heart-covered boxers and cluck like a chicken to distract a criminal so that the white officer can save the day, we see individual scenes that may make us laugh,” Henderson said. “What we do not see is how overall these depictions are eerily similar to, if not the continuation of, the presentation of African-Americans as comedic outlets that date back to the slavery experience. Minstrel shows of the mid-1830s as well as the Stepin Fetchit character of film in the 1920s and 1930s regularly used derogatory comedic depictions of African-Americans.”

The study also revealed that while several films touched on the struggle many African-American officers face in maintaining a balance of loyalty to the African-American community and to the police force, only four films directly addressed the issue: “In Too Deep” (1999), “Shaft” (2000), “Training Day” (2001) and “Dirty” (2005).

The researchers note that in “Shaft,” Samuel L. Jackson’s character captures the dilemma by stating that he is “too black for the uniform, too blue for the brothers.”

The study reveals that most often an officer is faced with either selling out the African-American community or quitting the police force.

“Even if we look to television programs, we are hard pressed to find African-American police officers depicted in leading roles,” Henderson said. “We see many supporting roles but no non-comedic city police officer leading roles that take place in the United States in the past or the present.”

Wilson points out that there have been a few attempts, such as the 2005 revival of the 1970s television series “Kojak” with Ving Rhames in the leading role and, more recently, the 2013 revival of the 1960s-70s show “Ironside,” starring Blair Underwood; however, “Kojak” was canceled after approximately one season and “Ironside” after only four episodes.

“We do not know if such portrayal patterns have an impact on recruitment, retention and public perceptions of African-American city police officers yet but it certainly points to a need for a closer examination,” Henderson said.

Eat Up! Fresh Vegetables And Fruits Can Help You Live Longer

Brett Smith for redOrbit.com – Your Universe Online
An apple a day may keep the doctor away, but what about seven? Or eight?
According to a new study published in the Journal of Epidemiology & Community Health, eating seven or more servings of fruit and vegetables per day reduces a person’s risk of death at any point in time by 42 percent.
The study team, from University College London, discovered that eating fresh vegetables was linked with the most powerful protective effect, with an everyday portion lowering overall likelihood of death by 16 percent. Salad contributed a 13 percent risk reduction per portion, and each portion of fresh fruit was connected with a smaller, but still substantial, 4 percent drop.
“We all know that eating fruit and vegetables is healthy, but the size of the effect is staggering,” said Oyinlola Oyebode, an epidemiologist at UCL. “The clear message here is that the more fruit and vegetables you eat, the less likely you are to die at any age. Vegetables have a larger effect than fruit, but fruit still makes a real difference. If you’re happy to snack on carrots or other vegetables, then that is a great choice but if you fancy something sweeter, a banana or any fruit will also do you good.”
To reach their conclusion, the UK scientists used data from the Health Survey for England to examine the diets of over 65,000 people between 2001 and 2013. They discovered that the more fruit and vegetables people ate, the less likely they were to die at any age. Consuming seven or more portions was found to reduce the specific risks of death from cancer and heart disease by 25 percent and 31 percent, respectively. The study also showed that vegetables have considerably greater health benefits than fruit.
When compared to consuming less than one portion of fruit or vegetables, the risk of death by any cause is cut by 14 percent when a person consumes one to three portions daily, 29 percent for three to five portions, 36 percent for five to seven portions and 42 percent for seven or more.
The team said their study is the first to connect fruit and vegetable intake with all-cause, cancer and heart disease deaths in a nationally-representative population, the first to measure health benefits per-portion, and the first to look at the kinds of fruit and vegetable with the most benefit. The researchers said they had considered a wide range of confounding factors, such as cigarette smoking and body mass index, in reaching their conclusion.
While the benefits of fresh fruit and vegetables were clearly evident, regular consumption of frozen or canned fruit seemed to increase the risk of death by 17 percent, a finding that public health doctors from the University of Liverpool called “intriguing” in an accompanying editorial. They said the added sugars in these processed foods could be to blame and suggested that dietary guidelines should be revised to reflect this finding.
“150 ml of freshly squeezed orange juice (sugar 13 g); 30 g of dried figs (sugar 14 g); 200 ml of a smoothie made with fruit and fruit juice (sugar 23 g) and 80 g of tinned fruit salad in fruit juice (sugar 10 g)…contain a total of some 60 g of refined sugar,” they wrote. “This is more than the sugar in a 500 ml bottle of cola.”
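
The editorial’s tally can be verified with quick arithmetic over the quoted figures:

```python
# Sugar content (grams) of the four example servings quoted in the editorial.
sugar_grams = {
    "150 ml fresh orange juice": 13,
    "30 g dried figs": 14,
    "200 ml fruit smoothie": 23,
    "80 g tinned fruit salad": 10,
}

total = sum(sugar_grams.values())
print(total)  # 60 -- "some 60 g of refined sugar", as the editorial states
```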
“Most canned fruit contains high sugar levels and cheaper varieties are packed in syrup rather than fruit juice,” Oyebode said. “The negative health impacts of the sugar may well outweigh any benefits. Another possibility is that there are confounding factors that we could not control for, such as poor access to fresh groceries among people who have pre-existing health conditions, hectic lifestyles or who live in deprived areas.”

Japanese Scientist Falsified Claims On Groundbreaking Stem Cell Research

Lawrence LeBlond for redOrbit.com – Your Universe Online

Less than three months after a rising Japanese scientist made claims of a significant stem cell breakthrough, new evidence has come forward that confirms the findings of the groundbreaking study were falsified.

Haruko Obokata, the lead author of a study published in the journal Nature in January, claimed that stem cells could be created by dipping blood cells into acid, raising hopes of growing Stap (stimulus-triggered acquisition of pluripotency) cell tissue to treat illnesses such as diabetes and Parkinson’s disease. But scientists at the Riken Centre for Developmental Biology in Kobe, Japan, the same center where Obokata conducted her research, are now saying the researcher’s claims were falsified.

The latest news comes after criticism of the research mounted last month, when researchers from around the world could not replicate the team’s findings using the same approach. The growing body of criticism led some on the research team to consider retracting their paper, which in turn led to a formal investigation within the Riken Centre.

Despite the growing controversy, Charles Vacanti, a tissue engineer at Harvard Medical School and Brigham and Women’s Hospital in Boston and a coauthor of the study, said at the time that he would stick by the results.

“It would be very sad to have such an important paper retracted as a result of peer pressure, when indeed the data and conclusions are honest and valid,” Vacanti told the Wall Street Journal in March.

But it has now come to light that nearly all of the claims made by Obokata rested on manipulated data, and that she falsified images of DNA fragments used in her research, according to the Riken Centre investigative team.

“The manipulation was used to improve the appearance of the results,” Shunsuke Ishii, the head of the committee set up to investigate allegations that the research was fraudulent, told The Associated Press.

When the news broke in January that Obokata, who had received her PhD just three years earlier, appeared to have created a simple new method of turning blood cells into stem cells, she became an instant hero, with many calling it the third most significant breakthrough in stem cell research, according to a report in The Washington Post.

Obokata said in a news conference at the time that her research was an emotional roller coaster ride.

“There were many days when I wanted to give up on my research and cried all night long,” she said. “But I encouraged myself to hold on just for one more day.”

Even as the investigation has found significant discrepancies in her study, Obokata said in a statement to the AFP that she denies the allegations and “will file a complaint against Riken as it’s absolutely impossible for me to accept this.”

The Riken Centre investigators maintained they will punish those who are involved in this bogus study.

“Those involved will be strictly dealt with as per the provisions of RIKEN’s internal regulations, and RIKEN will deliberate and implement measures to ensure that this does not happen again,” the investigators said in a statement.

The investigative team noted that three coauthors of the paper had not falsified the data but were still “gravely responsible” for failing to verify the findings of the study.

The investigative panel would not comment on whether Stap cells actually exist. Ishii noted that determining the existence of Stap cells was not part of the investigators’ mission.

Which Social Networks Are Best For Drug And Medical Information?

April Flowers for redOrbit.com – Your Universe Online

The Internet abounds with a wealth of information — true and otherwise — about healthcare. So where should you turn to find the latest information about medications or conditions? That was the focus of a recent study from the University of California, Riverside’s Bourns College of Engineering and Department of Political Science.

The team found, among other things, that general social networks such as Twitter or Pinterest were the best places for information on medications such as Viagra or ibuprofen. If, on the other hand, you want information on sleep disorders or depression, you should start with specialized social networks such as WebMD or Drugs.com.

The findings, which will be published in an upcoming issue of The Journal of Biomedical Informatics, have implications for a wide range of stakeholders, such as healthcare providers and healthcare marketing professionals. Providers would have more knowledge for recommending social network sites to their patients, or for creating forums themselves for particular healthcare conditions or drugs. Marketing professionals would be able to use the information to focus their resources, while researchers of healthcare content in social networks could find the data useful when selecting sources.

The research team included Vagelis Hristidis, an associate professor in the computer science and engineering department at UC Riverside, and his graduate student Matthew T. Wiley; Kevin M. Esterling, a professor of political science at UC Riverside; and Canghong Jin, a former visiting graduate student at UC Riverside currently at Zhejiang University in China. The team studied 10 social networks, charting the following factors for each: whether they were health-focused or general, whether they were moderated, whether they required registration, and whether they used a question-and-answer format.

They selected Twitter, Google+ and Pinterest as the general social networks. Facebook was excluded because its data is not public. DailyStrength, Drugs.com, DrugLib.com, EverydayHealth.com, MediGuard.org, WebMD and Medications.com were selected as the specialized social networks.

A list of the 200 most commonly prescribed drugs was cut down to 122 by eliminating such things as variants of the same drug at different dosages. They then created data collection tools to aggregate references to those drugs from the sites. Filters were created that eliminated duplicate posts and non-English posts. Each site’s Terms of Use were followed during data collection, and messages that could be considered spam were not removed because the definition of spam is subjective.
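
A minimal sketch of the duplicate-removal step described above; the non-English filter shown here is a naive ASCII heuristic standing in for whatever method the team actually used.

```python
def filter_posts(posts):
    """Drop exact duplicates and (naively) non-English posts.

    Duplicates are detected by whitespace- and case-normalized text. The
    language check is a crude stand-in: it keeps posts whose characters are
    mostly ASCII, which is NOT a real language-identification method.
    """
    seen = set()
    kept = []
    for post in posts:
        key = " ".join(post.lower().split())
        if key in seen:
            continue  # duplicate of an earlier post
        ascii_chars = sum(1 for c in post if ord(c) < 128)
        if ascii_chars / max(len(post), 1) < 0.9:
            continue  # treat heavily non-ASCII text as non-English
        seen.add(key)
        kept.append(post)
    return kept

posts = [
    "Ibuprofen helped my headache",
    "ibuprofen  helped my headache",    # duplicate after normalization
    "Ибупрофен помог от головной боли", # removed by the ASCII heuristic
]
print(filter_posts(posts))  # ['Ibuprofen helped my headache']
```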

Depending on when each site was created, data collection began between 2001 and 2012 and ended in 2013. The team analyzed the posts in three ways. First, they determined whether each post was positive, negative or objective. Second, they broke the posts into six drug categories: coagulation modifiers, genitourinary tract agents, nutritional products, cardiovascular agents, psychotherapeutic agents and others. Finally, they searched for mentions of five types of medical concepts: anatomy, physiology, procedures, chemicals and drugs, and disorders.

The differences in the content of discussions on the social networks, based on whether there was moderation, registration required, or a question-and-answer format, were taken into consideration.

Key findings of the study include:

• Negative sentiments were twice as likely to be found on health social networks—such as WebMD and Drugs.com—as on general social networks—such as Google+ and Twitter.

• The same basic group of drugs was popular across the board on the general social networks, while different drugs were popular on the health sites.

• On health network sites, posts about psychotherapeutic agents, such as Abilify and Cymbalta, are about five times more common than on general networks. In contrast, posts about genitourinary tract agents, such as Viagra and Cialis, are 16 times more common on general social networks.

• On non-moderated health network sites, they observed an 87 percent increase in discussion of psychotherapeutic agents, while on moderated sites, discussions of gastrointestinal agents, hormones, anti-infectives, and respiratory agents all increased.

Researchers Reveal Potential Death Risks From Certain Anxiety And Sleep Medications

April Flowers for redOrbit.com – Your Universe Online

Millions of people around the world find help from drugs that treat anxiety and sleep disorders. A new study from the University of Warwick, however, reveals that anti-anxiety and sleeping medicines have been linked to an increased risk of death.

The large study, which tracked 34,727 people for an average of seven and a half years, demonstrates that the risk of death doubles with several anxiolytic (anti-anxiety) drugs or hypnotic (sleeping) drugs. The research team, led by Professor Scott Weich, emphasized that the results are based on routine data and need to be interpreted cautiously; however, they believe that a better understanding of the impact of these drugs is essential.

“The key message here is that we really do have to use these drugs more carefully. This builds on a growing body of evidence suggesting that their side effects are significant and dangerous. We have to do everything possible to minimize over reliance on anxiolytics and sleeping pills,” said Weich, Professor of Psychiatry at the University of Warwick.

“That’s not to say that they cannot be effective. But particularly due to their addictive potential we need to make sure that we help patients to spend as little time on them as possible and that we consider other options, such as cognitive behavioral therapy, to help them to overcome anxiety or sleep problems,” Weich continued.

Where possible, the researchers adjusted for factors such as age, smoking and alcohol use, socioeconomic status and other prescription medications. Most importantly, factors such as sleep disorders, anxiety disorders and all psychiatric illness were controlled for in all participants, who were tracked from the time they received their first prescription for either an anxiolytic or hypnotic drug.

The most commonly prescribed drug class was benzodiazepines, including diazepam and temazepam, according to the researchers. The effects of two other groups of drugs — the so-called “Z” drugs, and all other anxiolytic and hypnotic drugs — were also examined by the scientists. Over the course of the study, many participants received more than one drug, and five percent received prescriptions from all three groups.

The findings of this study were published in a recent issue of BMJ.

Satellite Shows High Productivity From The United States Corn Belt, But Drought Risks Still Remain

April Flowers for redOrbit.com – Your Universe Online

During the Northern Hemisphere’s growing season, data returned from satellite sensors shows that the Midwest has more photosynthetic activity than any other region of the planet, according to a recent NASA report.

Schoolchildren are taught that plants convert light to energy through the process of photosynthesis. What is less well known, however, is that chlorophyll also emits a fraction of the absorbed light as a fluorescent glow, which is invisible to the naked eye. The research team has found that this glow is an excellent indicator of the gross productivity of a given region.

Prior research by Joanna Joiner, of NASA’s Goddard Space Flight Center, demonstrated that the fluorescent glow of plants could be extracted from the data of existing satellites, which were designed and built for other purposes. For the current study, Joiner worked with Luis Guanter of the Freie Universität Berlin and Christian Frankenberg of NASA’s Jet Propulsion Laboratory to use that satellite data to estimate photosynthesis from agriculture for the first time.

“The paper shows that fluorescence is a much better proxy for agricultural productivity than anything we’ve had before. This can go a long way regarding monitoring – and maybe even predicting – regional crop yields,” Frankenberg said.

On an annual basis, the tropics are the most productive. During the Northern Hemisphere’s growing season, however, the research team noticed that the US Corn Belt had a significant advantage. “Areas all over the world are not as productive as this area,” said Frankenberg.

They analyzed data from the Global Ozone Monitoring Experiment 2 (GOME-2) on Metop-A, a European meteorological satellite, finding that fluorescence in the Corn Belt — from Ohio to Nebraska and Kansas — peaks in July. During this peak, the levels are 40 percent higher than those observed in the Amazon.

The researchers confirmed these findings by comparing them with ground-based measurements from carbon flux towers and yield statistics. The satellite measurements have a resolution of more than 1,158 square miles, while the resolution of the ground-based measurements is approximately 0.4 square miles. The study shows that even at this coarse resolution, the satellite method was able to estimate the photosynthetic activity occurring inside plants at the molecular level for areas with relatively homogeneous vegetation, like the Corn Belt.

The new method still faces challenges in estimating the productivity of fragmented agricultural areas, which are not properly sampled by current satellite instruments. Future missions with better resolution, such as NASA’s Orbiting Carbon Observatory-2, scheduled to launch in July 2014, could help. The findings, published in the Proceedings of the National Academy of Sciences, could also help researchers improve the computer models used to simulate Earth’s carbon cycle: Guanter found that the models underestimate crop photosynthesis by 40 to 60 percent.

Drought Risks Remain

Perhaps these findings could also help with another Corn Belt challenge. A new study from Columbia University’s Earth Institute reveals that increasing heat is expected to extend dry conditions to more cities and farmland by the end of the century. The findings, published in Climate Dynamics, demonstrate that higher evaporation rates may play an important role in future drought conditions, even though most studies focus on rainfall projections. The researchers say that warmer temperatures will wring more moisture from the soil through evaporation, affecting even those regions predicted to receive more rainfall.

In one of the first studies to model the effects of both changing rainfall and evaporation rates on future drought, the research team used the latest climate simulations to estimate that 12 percent of land will be subject to drought by 2100 through reduced rainfall alone. If higher evaporation rates from the added energy and humidity in the atmosphere are considered, the percentage of land affected by drought rises to 30 percent. The study, which excludes Antarctica, demonstrates that even areas expected to get more rain, including important wheat, corn and rice belts in the western United States and southeastern China, will be at risk of drought because of increased evaporative drying.

“We know from basic physics that warmer temperatures will help to dry things out,” said Benjamin Cook, a climate scientist with joint appointments at Columbia University’s Lamont-Doherty Earth Observatory and the NASA Goddard Institute for Space Studies. “Even if precipitation changes in the future are uncertain, there are good reasons to be concerned about water resources.”

The latest climate report from the Intergovernmental Panel on Climate Change (IPCC) cautions that soil moisture is expected to decline worldwide, leading to a greater risk of agricultural drought in already dry regions. Consistent with the current study, the IPCC report predicts a strong change in evaporation rates in the Mediterranean, the southwestern United States and southern Africa.

The researchers used two drought metric formulations to analyze projections of both rainfall and evaporative demand from the climate model simulations completed for the IPCC’s 2013 climate report. The two metrics agree that increased evaporative drying will probably tip marginally wet mid-latitude regions, like the Great Plains and a swath of southeastern China, into aridity, which would not be the case if rainfall were the only consideration. Dry zones in Central America, the Amazon and southern Africa will grow larger, while the summer aridity of Greece, Turkey, Italy and Spain is expected to extend northward into continental Europe.

Currently, when one area’s crop yields are temporarily lowered by bad weather, other regions are typically able to compensate. According to the study, however, the warmer weather of the future could simultaneously wither crops in multiple regions: “If rain increases slightly but temperatures also increase, drought is a potential consequence.”

The Psychology Behind Internet Trolls

On almost any webpage you visit, you will find negative and ugly comments; some people just love to say ugly things. Canadian psychologists examined this behavior in a paper titled “Trolls Just Want to Have Fun,” trolls being the people who post such comments. Through a survey, they found that trolls most often showed signs of sadism, psychopathy, and Machiavellianism, traits that, along with narcissism, make up the so-called “Dark Tetrad.” What researchers are not sure of is whether these people were sadists and psychopaths before the Internet, or whether the Internet turned them into sadists and psychopaths.
[ Read the Article: The ‘Dark Tetrad’ Makes Up The World Of The Internet Troll ]