Why The Mirror Lies

In people with body dysmorphic disorder, distorted self-image could be the result of the brain’s abnormal processing of visual input

Everyone checks themselves in the mirror now and then, but that experience can be horrifying for individuals suffering from body dysmorphic disorder, or BDD, a psychiatric condition that causes them to believe, wrongly, that they appear disfigured and ugly. These people tend to fixate on minute details – every tiny blemish looms huge – rather than viewing their face as a whole.

Now researchers at UCLA have determined that the brains of people with BDD have abnormalities in processing visual input, particularly when examining their own face. Further, they found that the same systems of the brain are overactive in both BDD and obsessive-compulsive disorder, suggesting a link between the two. The research appears in the February issue of the journal Archives of General Psychiatry.

“People with BDD are ashamed, anxious and depressed,” said Dr. Jamie Feusner, an assistant professor of psychiatry and lead author of the study. “They obsess over tiny flaws on their face or body that other people would never even notice. Some refuse to leave the house, others feel the need to cover parts of their face or body, and some undergo multiple plastic surgeries. About half are hospitalized at some point in their lifetimes, and about one-fourth attempt suicide.”

Despite its prevalence – BDD affects an estimated 1 to 2 percent of the population – and its severe effects, little is known about the underlying brain abnormalities that contribute to the disease.

To better understand its neurobiology, Feusner and colleagues examined 17 patients with BDD and matched them by sex, age and education level with 16 healthy people. Participants underwent functional magnetic resonance imaging (fMRI) while viewing photographs of two faces – their own and that of a familiar actor – first unaltered, and then altered in two ways to parse out different elements of visual processing.

One altered version included only high-spatial-frequency information, which would allow detailed analysis of facial traits, including blemishes and hairs. The other showed only low-spatial-frequency information, conveying the general shape of the face and the relationship between facial features.

Compared to the control participants, individuals with BDD demonstrated abnormal brain activity in visual processing systems when viewing the unaltered and low-spatial-frequency versions of their own faces. They also had unusual activation patterns in their frontostriatal systems, which help control and guide behavior and maintain emotional flexibility in responding to situations.

Brain activity in both systems correlated with the severity of symptoms. In addition, differences in activity in the frontostriatal system varied with participants’ reports of how disgusting or repulsive they found each image. In essence, how ugly individuals judged themselves to be appeared to explain the abnormal brain activity in these systems.

The abnormal activation patterns, especially in response to low-frequency images, suggest that individuals with body dysmorphic disorder have difficulties perceiving or processing general information about faces.

“This may account for their inability to see the big picture – their face as a whole,” Feusner said. “They become obsessed with detail and think everybody will notice any slight imperfection on their face. They just don’t see their face holistically.”

Some of the patterns, said Feusner, also appear to be similar to those observed in patients with obsessive-compulsive disorder, supporting hypotheses that the two conditions share similar neural pathways. However, future studies are needed to further elucidate the causes and development of body dysmorphic disorder.

Other authors on the paper include Teena Moody, Emily Hembacher, Jennifer Townsend, Malin McKinley, Hayley Moller and Susan Bookheimer, all of UCLA.

The research was supported by the National Institute of Mental Health, the Obsessive-Compulsive Foundation, a research grant from UCLA, the National Center for Research Resources, the National Institutes of Health, the Brain Mapping Medical Research Organization, the Brain Mapping Support Foundation, the Pierson-Lovelace Foundation, the Ahmanson Foundation, the William M. and Linda R. Dietel Philanthropic Fund at the Northern Piedmont Community Foundation, the Tamkin Foundation, the Jennifer Jones-Simon Foundation, the Capital Group Companies Charitable Foundation, the Robson Family and the Northstar Fund.


New Research On Type 2 Diabetes By TCD Researchers Could Benefit Young Adults With The Condition

New research on Type 2 diabetes by Trinity College Dublin researchers could benefit young adults (aged 18-25 years) with the condition. The research, led by Professor John Nolan of Trinity College Dublin and St James’s Hospital, Dublin, has just been published online in the leading international journal Diabetes Care.

The study findings demonstrate new mechanisms in muscle cells that may explain severe insulin resistance – the body’s decreased ability to respond to insulin – and a reduced response to aerobic exercise in young obese patients with Type 2 diabetes. These important findings will contribute in the long term to the development of more specific treatments for young people with Type 2 diabetes.

Type 2 diabetes is the most common form of diabetes. It occurs because the body produces too little insulin and is unable to properly use the insulin that is secreted. It usually occurs in older people, although it is becoming more common among younger people, partly due to lifestyle factors such as diet, lack of physical activity and obesity. The highest rates occur in countries with modern lifestyles. Type 2 diabetes accounts for approximately 85%-90% of all cases of diabetes in European countries. It is estimated that 129,052 people in the Republic of Ireland have adult Type 2 diabetes, or 4.3% of the adult population.

Commenting on the significance of the research, Professor John Nolan of the Department of Clinical Medicine, TCD, who led the Metabolic Research Group, said: “Type 2 diabetes is presenting in much younger people, usually because of early onset obesity and a strong family background of diabetes. These studies provide us with important new insights into the way diabetes develops and progresses in these young patients. In this study, we have shown that obese young patients with Type 2 diabetes, in contrast to equally obese young people without diabetes, have abnormal function of key mitochondrial genes and proteins. Mitochondria are the energy centers in cells and these abnormalities contribute to insulin resistance and a severely blunted response to physical exercise. Aerobic exercise is very effective in preventing and treating Type 2 diabetes in middle-aged and older people.”

“Type 2 diabetes is the major chronic disease of modern societies,” continued Professor Nolan, “and threatens the health of populations, most dramatically in Asia and developing countries. Designing specific treatments for Type 2 diabetes in young people depends on a more exact understanding of the cellular mechanisms of this disease. Our studies of muscle mitochondrial function have allowed us to focus intervention studies on these important new mechanisms.”

The research was carried out by the Metabolic Research Unit at Trinity College Dublin based at St James’s Hospital. These studies are part of an ongoing research program by Professor Nolan’s team into the causes and treatment of Type 2 diabetes and severe insulin resistance in young people. The investigations were done in collaboration with Professor Antonio Zorzano at the Institute for Research in Biomedicine, Barcelona. The studies were funded by grants from the European Foundation for the Study of Diabetes and from the EU Commission, as well as grants from the Ministerio de Educación y Cultura in Spain.


Child-Specific Doses For Pediatric PET Patients

Study seeks clear dose guidelines for PET exams on children

Studies have shown positron emission tomography’s (PET) value as a minimally invasive, painless and safe diagnostic tool for many pediatric conditions. In a study published in the February issue of The Journal of Nuclear Medicine (JNM), researchers at the Children’s Hospital of Philadelphia (CHOP) and the University of Pennsylvania (Penn) gathered data that may provide clinicians with new formulas – specific to pediatrics – to calculate the amount of radiotracer that should be injected based on the patient’s weight.

“These findings mean that PET – a very common nuclear medicine procedure – can be used in children with methods that are even more patient-specific than those currently employed,” said Roberto Accorsi, Ph.D., former research assistant professor of radiology at CHOP-Penn and lead author of the study.

This study is one more contribution to the medical imaging community’s overall efforts to reduce radiation dose to children. Nuclear medicine specialists are continuously refining methodologies in order to preserve image quality and minimize radiation exposure during pediatric PET exams. Since medical research published in recent years highlights the health risks of exposure to ionizing radiation, many have looked to the medical community for ways to curb exposure during medical imaging exams. Although the nuclear medicine exam’s benefits to the patient far outweigh any potential risks associated with radiation, the nuclear medicine community seeks to uphold practices that are consistent and mindful of patients’ concerns.

In nuclear medicine, there are well-established guidelines for administering radiopharmaceutical doses to adults, but there is little guidance on pediatric doses. The CHOP-Penn study therefore set out to examine how nuclear medicine physicians can take a child’s lighter weight and smaller body size into consideration and adjust the dose and scan time accordingly, while maintaining high-quality imaging for the best diagnosis possible.

Image quality in PET depends strongly on the patient’s weight and body build: the larger and heavier the patient, the higher the injected dose, or possibly the longer the scan time, needed to obtain a quality image. For patients who are lighter and have less body mass – as pediatric patients do – a lower injected dose or a reduced scan time may still allow for high-quality images.

“The results of this study show that, due to children’s relatively small size and light weight, it is possible to reduce radiological dose (or scan time) while preserving image quality as compared to PET imaging in adults,” said Dr. Accorsi, whose research was supported through a Research Fellow Award by the Society for Pediatric Radiology Research and Education Foundation. “Minimizing exposure to radiation is important to all patients, but especially for young children.”

CHOP-Penn researchers acquired and analyzed data from 73 patients whose weights ranged from 25 pounds to 200 pounds. They report that, when following an injection protocol proportional to weight, the data quality of PET images improved with decreasing weight. The study provides practical injection protocols to trade this advantage for decreased scan time or dose at constant image quality.
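
To make the weight-proportional idea concrete, here is a minimal sketch in Python. The adult reference activity (370 MBq at 70 kg) and the example weight are illustrative assumptions, not figures from the JNM study:

    def pediatric_activity_mbq(weight_kg, adult_activity_mbq=370.0, adult_weight_kg=70.0):
        # Injected activity scaled linearly with body weight; the adult
        # reference values here are illustrative assumptions, not study figures.
        return adult_activity_mbq * (weight_kg / adult_weight_kg)

    # A 20 kg (about 44 lb) child would receive roughly 106 MBq rather than the
    # full adult activity. Since image quality was found to improve at lower
    # weights under such a protocol, that surplus could instead be traded for a
    # shorter scan or a further dose reduction at constant quality.
    print(round(pediatric_activity_mbq(20.0), 1))  # -> 105.7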

Studies such as the one published in JNM are helping physicists and physicians gather new data about improving dose regimens to get the highest-quality diagnostic image while using the lowest amount of radiation practical, adhering to the “As Low As Reasonably Achievable” (ALARA) principle.


Children More Apt To Visit The Dentist If Their Parents Do

Whether or not children receive regular dental care is strongly associated with their parents’ history of seeking dental care. A new report to appear in the journal Pediatrics, which has been released online, is the first to analyze the relationship between parents’ and children’s dental visits in a nationally representative sample.

“When parents don’t see the dentist, their children are much less likely to see the dentist,” says Inyang Isong, MD, MPH, of the MassGeneral Hospital for Children (MGHfC) Center for Child and Adolescent Health Policy, the study’s lead author. “We also found that the children of parents who have put off their own dental care for financial reasons are more likely to have their care deferred due to cost as well. It looks like strategies to promote oral health should focus on the whole family.”

The study’s authors note that dental caries – tooth decay – is one of the most prevalent childhood diseases and is particularly common among minority and low-income children. Previous studies have associated factors including insurance coverage, parents’ income and education, and the availability of dental care in the local community with the likelihood that children will have regular dental visits.

Earlier investigations of the impact of parents’ accessing dental care focused on particular demographic groups. In order to see whether associations from those studies applied more broadly, the current investigation analyzed data from the 2007 National Health Interview Survey and its Child Health Supplement, which are designed to collect basic health and demographic information, along with answers to questions on health topics of current interest, from a cross section of the U.S. population.

Survey responses that included dental-visit data for both a child and a parent in the same household were available for around 6,100 matched pairs. Among parents who reported seeing a dentist during the preceding year, 86 percent of children had also seen a dentist; but only 64 percent of the children of parents with no recent dental visit had seen a dentist during the previous 12 months. In addition, among parents who put off their own dental care because of financial considerations, 27 percent of their children also had dental care deferred. In contrast, only 3 percent of children whose parents had not put off their own care had their dental care deferred.

“Even when children are covered by medical insurance, it appears that financial barriers are influencing parents’ decisions about accessing dental care for their children,” says Isong, a clinical fellow at MGHfC. “We’re now in the process of looking at the impact of dental insurance – something not addressed by the NHIS – and other enabling resources on the relationship between parents’ and children’s receipt of dental care.”

James Perrin, MD, of the MGHfC Center for Child and Adolescent Health Policy is senior author of the Pediatrics paper. Additional co-authors are Karen Kuhlthau, PhD, and Jonathan Winickoff, MD, MPH, MGHfC; Katharine Zuckerman, MD, MPH, Oregon Health and Science University; and Sowmya Rao, PhD, MGH Biostatistics Center.


Help In Sight For Ageing Vision

International scientists today reported encouraging results from human trials of the herb saffron as a treatment for one of the most common causes of loss of sight in old age.

Professor Silvia Bisti and Prof. Benedetto Falsini with colleagues from The Vision Centre, the University of L’Aquila and Catholic University of Rome, Italy, told an international vision conference in Sydney that a clinical trial had confirmed earlier hopes that saffron could improve the function of the retina in patients suffering from dry Age-related Macular Degeneration (AMD).

AMD affects the majority of humans as they age and is a major cause of blindness in the elderly, increasing dependency and the cost of caring for people in an ageing population.

“The trial on human patients with dry AMD was a success,” she said. “The parameters we chose to document the effect showed that saffron treatment improved retinal function. On top of that, the majority of patients recognized that their vision improved when they took the treatment.”
Prof. Bisti says the main findings from the animal experiments and the clinical trial so far are that saffron:
• Protects vision cells from cell death in cases where the retina is under stress
• Improves the structure and function of vision cells
• Modulates gene expression in the eye
• Shows potential for protecting the retina in cases where AMD is detected early
• Can be used to prevent or slow the neurodegeneration of the retina

Details of the research, which was carried out in a double-blind placebo-controlled experiment using patients who had experienced vision loss due to AMD, have been submitted for publication in a leading scientific journal. Participants in the trial received pills containing concentrated saffron at controlled rates along with their normal diet.

“Chemically, saffron is known to contain volatile and aroma-yielding compounds and many non-volatile biologically active components, such as carotenoids and various alpha- and beta-carotenes. The peculiar characteristics of saffron and our experimental results suggest that it has a number of very different actions on the cells of the retina, ranging from antioxidant activity to direct control of gene expression,” she says.

The promising results in human patients follow extensive trials of saffron in an animal model, which pointed to its likely beneficial effects. “These also showed that a saffron diet will help to protect the eye from the damaging effects of bright light – something we all suffer whenever we go out in the sun.”

Prof. Bisti says the research has also established that saffron is active in affecting genetic diseases of the eye in animal models of retinitis pigmentosa, a condition which can cause life-long blindness in young people.

“We are encouraged by the results we have achieved in this research to date but, to be candid, it has raised more questions than we have answers for at the moment, so a lot more work is required before we can say for certain that dietary treatment with saffron can be used as a therapy to treat certain eye diseases. These are still early days,” she says.

Professor Jonathon Stone from The Vision Centre and Sydney University commented: “The outcome of this experiment was remarkable – significant improvement in vision after several weeks of taking saffron in pill form, which reversed when the patients were taken off it. This is very encouraging for a non-invasive way to treat certain important eye diseases.”

The Australian Neuroscience Society Satellite Meeting 2010: “From Photoreceptors to Behaviour” is being held at the Save Sight Institute, 8 Macquarie St, Sydney on January 29 and 30, 2010. Media are welcome to attend and interview participants.


Oxygen: Key To Common Eye Diseases

The key to treating eye diseases like diabetic retinopathy lies in understanding the distribution of oxygen in our retina, Professor Dao-Yi Yu from The Vision Centre and University of WA will tell an international scientific conference on vision in Sydney.

The Australian Neuroscience Society Satellite Meeting 2010: “From Photoreceptors to Behaviour” is being held at the Save Sight Institute, 8 Macquarie St, Sydney on January 29 and 30, 2010.

Oxygen is vital to the health of the visual cells in our eyes. “It’s like fuel to a car – we can’t operate without it,” Professor Yu says. “The retina, which is responsible for the first stages of the visual process, has a limited blood supply, so there’s a delicate balance between the high oxygen demand of our retinal cells and a limited supply. Oxygen is the most supply-limited substance in the human retina. When a slight imbalance occurs in the eye due to a lack of oxygen, our retina is stressed and this can lead to various eye diseases.

“By finding out where the oxygen is most needed in our eyes, we can treat eye diseases better by shifting the distribution of oxygen towards those regions,” Professor Yu explained. “This can actually preserve vision in some diseases.”

One example is diabetic retinopathy. Diabetics are often at risk of vision loss as the blood vessels in their eyes become blocked, choking off the delivery of oxygen to the retina.

“For decades, the way to prevent vision loss caused by diabetic retinopathy has been to destroy the peripheral retinal cells. This is thought to allow oxygen to reach the cells in the central part of the retina – where it’s needed most, because this area provides us with sharp, clear vision,” Professor Yu said.

However, the destruction of these retinal cells often causes the loss of peripheral vision, night vision and color vision.

“Our research aims to restore people’s sight without destroying as many peripheral cells,” Professor Yu said.

The proposed procedure involves using less damaging lasers to treat the peripheral retina while allowing sufficient oxygen to support the central area of vision.

“The retina has a highly layered structure. By using special lasers with selective wavelengths and power modulation we can selectively thin out the cells in the outer retinal layers, provoking a natural shift of oxygen towards the inner retina.”

A number of groups of people are particularly at risk of loss of vision caused by oxygen starvation in the retina. The elderly are a major group affected, but people with cardiovascular diseases, hypertension or diabetes are also at risk.

The Australian Neuroscience Society Satellite Meeting 2010: “From Photoreceptors to Behaviour” is sponsored by The Vision Centre, which is funded by the Australian Research Council as the ARC Centre of Excellence in Vision Science. Media are welcome to attend and interview the participants.


Breakthrough In HIV Research

In a study published today in the journal Nature, researchers report a breakthrough in HIV research that had eluded scientists for over 20 years and could lead to better treatments for HIV.

The researchers, from Imperial College London and Harvard University, have grown a crystal that reveals the structure of an enzyme called integrase, which is found in retroviruses like HIV. When HIV infects someone, it uses integrase to paste a copy of its genetic information into their DNA.

Prior to the new study, which was funded by the Medical Research Council and the US National Institutes of Health, many researchers had tried and failed to work out the three-dimensional structure of integrase bound to viral DNA. New antiretroviral drugs for HIV work by blocking integrase, but scientists did not understand exactly how these drugs were working or how to improve them.

Researchers can only determine the structure of this kind of molecular machinery by obtaining high quality crystals. For the new study, researchers grew a crystal using a version of integrase borrowed from a little-known retrovirus called Prototype Foamy Virus (PFV). Based on their knowledge of PFV integrase and its function, they were confident that it was very similar to its HIV counterpart.

Over the course of four years, the researchers carried out over 40,000 trials, out of which they were able to grow just seven kinds of crystals. Only one of these was of sufficient quality to allow determination of the three-dimensional structure.

Dr Peter Cherepanov, the lead author of the study from the Department of Medicine at Imperial College London, said: “It is a truly amazing story. When we started out, we knew that the project was very difficult, and that many tricks had already been tried and given up by others long ago. Therefore, we went back to square one and started by looking for a better model of HIV integrase, which could be more amenable for crystallization. Despite initially painstakingly slow progress and very many failed attempts, we did not give up and our effort was finally rewarded.”

After growing the crystals in the lab, the researchers used the giant synchrotron machine at the Diamond Light Source in South Oxfordshire to collect X-ray diffraction data from these crystals, which enabled them to determine the long-sought structure. The researchers then soaked the crystals in solutions of the integrase inhibiting drugs Raltegravir (also known as Isentress) and Elvitegravir and observed for the first time how these antiretroviral drugs bind to and inactivate integrase.

The new study shows that retroviral integrase has quite a different structure from the one predicted on the basis of earlier research. Availability of the integrase structure means that researchers can begin to fully understand how existing drugs that inhibit integrase work, how they might be improved, and how to stop HIV developing resistance to them.


UCLA Cancer Researchers Perform Complete Genomic Sequencing Of Brain Cancer Cell Line

Study may result in new and more effective therapies

Researchers at UCLA’s Jonsson Comprehensive Cancer Center have performed the first complete genomic sequencing of a brain cancer cell line. The discovery may lead to personalized treatments based on the unique biological signature of an individual’s cancer, and it may unveil new molecular targets for which more effective and less toxic drugs can be developed.

The study also may lead to new and better ways to monitor for brain cancer recurrence, allowing for much earlier diagnosis and treatment when the cancer returns. Clinicians also could use the finding to develop a test to determine when the brain cancer has been killed, preventing overtreatment with harmful drugs that can later cause debilitating health problems.

Using the latest technology, the sequencing was done in less than a month and cost about $35,000. By comparison, the sequencing of the human genome took years, required huge teams of scientists and cost more than $1 billion, said Dr. Stan Nelson, a professor of human genetics, a researcher at UCLA’s Jonsson Comprehensive Cancer Center and senior author of the study.

“This is very exciting because we, as scientists, can now move forward with revealing complete cancer genomes,” said Nelson, who directs the cancer center’s Gene Expression Shared Resource. “Cancer is at its heart a genetic disease. Cancer cells have acquired mutations that allow them to invade tissues and to not live by the normal rules. The changes from normal (mutations) that have given the cancer these special properties are encoded in DNA, and the entire DNA sequence has just been too complex and costly to decode until now.”

The study appears in the Jan. 29, 2010, issue of PLoS Genetics, a peer-reviewed journal of the Public Library of Science.

The sequencing was done on a much-studied glioblastoma cell line called U87, which is being used in more than a dozen UCLA cancer laboratories and studied in more than 1,000 laboratories worldwide, Nelson said. They picked the cell line, he said, because it has been so thoroughly examined. The sequencing will allow scientists who have studied the cell line to reinterpret their findings and may prompt researchers to move in new directions going forward.

The sequencing revealed virtually all potentially cancer-causing chromosomal translocations and genetic deletions and mutations that may have resulted in this cancer’s development. The study involved taking the very long strands of genetic material from the cancer cells and shearing them, or cutting them up randomly. Billions of different DNA fragments from this cancer were simultaneously read with next-generation sequencing technology. The genetic material was analyzed more than a billion times to ensure the results would be both highly sensitive and accurate, Nelson said.

“This was the most thorough sequencing analysis of an individual cancer cell line that has been performed to date,” Nelson said. “We developed specific informatics tools to help with the analysis and used the most powerful technology available. As scientists, we previously didn’t know most of the mutations that occur within a given cancer – we’re blind to them. Now this new technology allows us to look at every single cancer and decode that cancer genome completely so there’s no chance we’re missing a mutation that may be causing the disease.”

Knowing the genes that are mutated and driving the cancer’s growth could allow clinicians to choose therapies most suited to attack the specific molecular signature of that patient’s disease to provide more effective treatment. The sequencing also could reveal a molecular abnormality that is driving the cancer, unveiling a target that could lead to the development of new therapies that home in on cancer cells and leave the healthy cells alone.

Patient-specific diagnostics also could be developed to monitor for cancer recurrence, Nelson said.

“Sometimes it’s difficult to tell if a cancer is coming back or if what you’re seeing is scar tissue,” Nelson said. “Scientists could develop a sensitive molecular assay that looks for a unique mutation found only in the cancer cells and not in the healthy cells. If that mutation is found by the assay, the cancer has returned and patients could be promptly treated when the recurrence is at its earliest stage and easiest to treat. Conversely, such an assay could be used to determine when the cancer has been effectively eliminated and it’s safe to discontinue what are harmful treatments.”

Just that one simple assay, Nelson said, would have an amazing impact on how cancers are treated.

“Oncologists would be able to know, definitively, when they can stop giving chemotherapy because it’s no longer needed or when they have to resume chemotherapy because the cancer has returned,” he said.

Nelson and his team created a web site where researchers can access and retrieve the sequencing data for use in their own experiments, a sort of mini human genome project. Nelson believes sequencing all cancer genomes will result in a significant paradigm shift in the way cancers are treated.

The team of scientists within Nelson’s lab has set up a process at UCLA to sequence other cancer cell lines in a highly accurate and cost effective way. His goal is to be able to sequence a patient’s individual cancer and turn the data around quickly enough to provide oncologists with the information they need to make immediate treatment decisions.


Restaurants Need Better Milk Handling

One-third of samples of milk and dairy products analyzed in various restaurants exceed the microbe contamination limits set by the European Union, according to a study carried out by researchers from the University of Valencia (UV). The experts advise against keeping milk in jugs and suggest that these foodstuffs need to be better handled.

“Out of all the dairy products we analyzed, 35% of the samples exceeded the maximum contamination levels established by EU law for enterobacteriaceae, and 31% exceeded the limits set for mesophilic aerobic microorganisms (which grow at an optimum temperature of between 30 and 45 °C)”, Isabel Sospedra, a researcher at the Department of Preventive Medicine and Public Health of the UV and one of the authors of the study, tells SINC.

The scientists examined 265 batches of milk and ready-to-use milk derivatives in a range of bars and restaurants in Valencia, and checked whether their microbial quality fell into line with European Union regulations. The results, which have been published recently in the journal Foodborne Pathogens and Disease, show that one-third of the samples had some kind of microorganism contamination and were not fit for human consumption.

“Luckily none of the batches we analyzed tested positive for Staphylococcus aureus, Listeria monocytogenes or Salmonella spp., which are pathogenic microorganisms that cause both food poisoning and toxoinfections”, the study’s authors say.

The researchers found differences according to the source of the sample (hot milk, products at room temperature or homemade dairy products). According to the study, 2% of the samples of hot milk (kept in jugs or stainless steel thermos flasks) tested positive for the bacteria Escherichia coli.

The team detected unsuitable practices, such as reheating milk over and over again, even in a microwave, and then pouring it back into the thermos, which increases the risk of microbial contamination. The study shows that there is a greater contamination risk from milk kept in jugs, meaning this type of container is not suitable for storing milk.

Focus more on cleaning utensils

The experts advise that, when using milk in any way, it is important to clean jugs, thermos flasks and the steamers of coffee machines thoroughly and frequently, using the right kind of hygienic sponges or cloths, which is not always the case. “Kitchen cloths are not suitable for disinfecting because of their microstructure, which means they transfer even greater levels of contamination”, the scientists explain.

Milk served cold or at room temperature is usually kept in its original container in restaurants and bars – a plastic bottle or tetrabrick. The study shows that containers with a lid are better, since tetrabricks opened with scissors are more exposed to microbial proliferation, and are especially vulnerable to enterobacteriaceae.

Among dairy products prepared in the restaurants themselves (custards, mousses, puddings and crème caramels), custards (natillas) had the highest levels of contamination with microorganisms. This may be because custard was the only foodstuff analyzed that undergoes further processing after being heated, say the scientists. Cross-contamination could come from the hands of the person preparing the product, particularly when he or she places the biscuit on top of the dish.

In line with previous studies, the researchers also showed that adding cinnamon to dairy products led to reduced microorganism contamination, since this substance helps to eliminate microorganisms such as Escherichia coli, Listeria monocytogenes and bacteria from the Salmonella family.

Reference: Isabel Sospedra, Josep V. Rubert, Carla Soler, Jose M. Soriano, Jordi Mañes. “Microbial Contamination of Milk and Dairy Products from Restaurants in Spain”. Foodborne Pathogens and Disease 6(10): 1269-1272, December 2009.

Image Caption: A study recommends better handling of milk in restaurants. Credit: SINC


Symptoms Have Little Value For Early Detection Of Ovarian Cancer

Use of symptoms to trigger a medical evaluation for ovarian cancer does not appear to detect early-stage ovarian cancer earlier and would likely result in diagnosis in only 1 out of 100 women in the general population with such symptoms, according to an article published online January 28 in the Journal of the National Cancer Institute.

Researchers at Fred Hutchinson Cancer Research Center in Seattle assessed the predictive value of certain symptoms, including abdominal pain or bloating and urinary frequency, which were cited in a recent consensus statement as a way to diagnose ovarian cancer earlier.

Mary Anne Rossing, Ph.D., of the Program in Epidemiology at Fred Hutchinson, and colleagues conducted in-person interviews with 812 patients aged 35-74 years who had epithelial ovarian cancer that was diagnosed from 2002 through 2005. They compared the results from these case patients with results from interviews with 1,313 population-based control subjects – women who did not have ovarian cancer. The researchers assessed the sensitivity, specificity, and positive predictive value of a proposed symptom index and of symptoms included in the consensus recommendation.

Symptoms appeared in most case patients only about 5 or fewer months before diagnosis. Women with early-stage ovarian cancer were somewhat less likely to have symptoms (except nausea) than those with late-stage cancer. The estimated positive predictive value of the symptoms was 0.6% to 1.1% overall and less than 0.5% for early-stage disease.

The authors conclude that 100 symptomatic women would need to be evaluated to detect one woman with ovarian cancer.
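
That one-in-100 figure is simply the reciprocal of the positive predictive value. A quick sketch, using only the PPV values reported above:

    def number_needed_to_evaluate(ppv):
        # Symptomatic women who must be evaluated per cancer detected:
        # the reciprocal of the positive predictive value (PPV).
        return 1.0 / ppv

    # At the reported overall PPV of about 1%, roughly 100 evaluations per
    # detected case; at the early-stage PPV of under 0.5%, more than 200.
    print(number_needed_to_evaluate(0.01))   # -> 100.0
    print(number_needed_to_evaluate(0.005))  # -> 200.0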

“The low positive predictive value of symptoms to detect ovarian cancer – particularly at an early stage – argues for a cautious approach to the use of symptom patterns to trigger extensive medical evaluation for ovarian cancer,” the authors write.

In an accompanying editorial, Beth Y. Karlan, M.D., and Ilana Cass, M.D., of the Division of Gynecologic Oncology at Cedars-Sinai Medical Center in Los Angeles, note the strengths of the study, including its in-person interviews and large number of patients, but also point out its limitations: inherent recall bias and survival bias in case patients and control subjects. Recall bias is always a possibility in case-control studies, in that case subjects may be more likely to remember symptoms than control subjects.

“Importantly, these findings remind us that wide recognition of symptoms alone will not incrementally improve the overall survival from ovarian cancer,” the editorialists write. “Rather, they highlight the urgent need to develop better molecular markers and improved imaging modalities for ovarian cancer screening.”


Alli Proven To Reduce Visceral Fat, A Dangerous Fat Linked To Many Life-Threatening Diseases

According to new data released today

New studies show that overweight and obese people using alli® (orlistat 60 mg) with a reduced calorie, lower-fat diet can significantly reduce weight, visceral fat, and waist circumference and therefore may reduce their risk of type 2 diabetes, hypertension, heart disease and stroke. The studies were presented at the 1st International Congress on Abdominal Obesity in Hong Kong earlier today.

alli is the only FDA-approved OTC weight loss aid that is clinically proven to boost weight loss by 50 percent and significantly reduce excess visceral fat. Working in the digestive tract, alli prevents about 25 percent of the fat that a person eats from being absorbed.

Visceral fat is a dangerous type of fat that surrounds the vital organs in the abdomen and, when present in excess, disrupts the normal functioning of organs, increasing the risk of life-threatening diseases. Even modest weight loss can result in significant reductions in visceral fat and substantially improve health. In fact, when losing weight, visceral fat is among the first fat lost, which is associated with noticeable health benefits such as reductions in total cholesterol and low-density lipoprotein (LDL). This helps reduce the risk of type 2 diabetes, hypertension, heart disease and stroke. It is these health complications that exact a high personal and societal toll and drive the impact of the global obesity epidemic.

“Although most individuals try to lose weight to improve their appearance, it’s important to help them understand that losing excess fat reduces their risks of life-threatening diseases,” said Jeanine Albu, M.D., Senior Attending in Medicine, Associate Chief of the Division of Endocrinology, Diabetes and Nutrition and the Chief of the Metabolism and Diabetes Clinic at the St. Luke’s-Roosevelt Hospital Center in New York.

“We need to raise awareness of the direct link between visceral fat on the inside and heart disease and diabetes,” said Dr. Albu. “Through healthy eating, keeping active and treatments such as alli, people can lose 5 to 10 percent of total body weight – including visceral fat – and achieve and maintain their healthy weight.”

In two of the studies presented at the congress, alli was evaluated to determine its effect on excess visceral fat. This new body of evidence proves that alli significantly reduces weight and dangerous visceral fat to help people improve their health.

The Visceral Fat Imaging Study

The three-month Visceral Fat Imaging Study demonstrated that alli reduced total body weight by 5.6 percent and visceral fat by 10.6 percent versus amounts at the start of the study in overweight and obese adults on a reduced calorie, lower-fat diet (P < 0.0225). Carried out at Europe’s largest imaging center, the Clinical Imaging Centre at Hammersmith Hospital, UK, the study used MRI technology to show, in a unique way, the changes taking place inside people’s bodies as they lost weight.

Twenty-six study participants were counseled to follow a reduced calorie, lower-fat diet, and then took alli three times per day for 12 weeks. Results also showed that at week 12 alli significantly reduced waist circumference (the measurement around the waistline), the best practical way to assess visceral fat, by 5 cm (2 inches).

The Visceral Fat Multi-Center Study

In the six-month Visceral Fat Multi-Center Study, overweight and obese adults receiving alli while on a reduced calorie, lower-fat diet had significantly greater improvements in visceral fat than those treated with diet alone.

In this study, 123 participants were randomly assigned to receive either alli three times per day or a placebo, along with recommendations to follow a reduced calorie, lower-fat diet, for 24 weeks. At week 24, statistically significant reductions in visceral fat and body weight were observed in both groups; however, the reduction was significantly greater among patients taking alli. Mean reductions in visceral fat were 15.66 percent for alli versus 9.39 percent for placebo (P < 0.0001); mean reductions in body weight were 5.96 kg versus 3.91 kg, respectively (P < 0.05).

Overweight and obese people enrolled in the Visceral Fat Imaging Study and Visceral Fat Multi-Center Study had a body mass index (BMI) of 25-35 kg/m2, with a waist circumference greater than 88 cm (34.64 inches) for women or 102 cm (40.16 inches) for men at the start of the studies. Use of alli in both studies was shown to be generally well tolerated and consistent with the known safety profile.


A New Way To Treat Autism

Children with autism would likely receive better treatment if supporters of the two major teaching methods stopped bickering over theory and focused on a combined approach, a Michigan State University psychologist argues in a new paper.

For years, the behavioral and developmental camps have argued over which theory is more effective in teaching communication and other skills to preschool-aged children with autism. Basically, behaviorists believe learning occurs through reinforcement or reward while developmental advocates stress learning through important interactions with caregivers.

But while the theories differ, the actual methods the two camps ultimately use to teach children can be strikingly similar, especially when the treatment is naturalistic, or unstructured, said Brooke Ingersoll, MSU assistant professor of psychology.

In the January issue of the Journal of Positive Behavior Interventions, Ingersoll contends that advocates of the behavioral and developmental approaches should set aside their differences and use the best practices from each to meet the needs of the student and the strengths of the parent or teacher.

“We need to stop getting so hung up on whether the behavioral approach is better than the developmental approach and vice versa,” Ingersoll said. “What we really need to start looking at is what are the actual intervention techniques being used and how are these effective.”

An estimated one out of every 110 children in the United States has autism and the number of diagnosed cases is growing, according to the Centers for Disease Control and Prevention. Symptoms typically surface by a child’s second birthday and the disorder is four to five times more likely to occur in boys than in girls.

Ingersoll said the behavioral and developmental treatment methods both can be effective on their own. But historically, advocates for each have rarely collaborated on treatment development for children with autism, meaning it’s unknown whether a combined approach is more effective.

Ingersoll expects it is. She is trained in both methods and has created a combined curriculum on social communication that she’s teaching to preschool instructors in Michigan’s Ottawa, Livingston and Clinton counties. Through the MSU-funded project, the instructors then teach the method to parents of autistic children.

Ingersoll said the combined method works, but it will probably take a few years of research to determine if it’s more effective than a singular approach.

“I’m not necessarily advocating for a new philosophical approach – the reality is that neither side is likely to change their philosophy,” Ingersoll said. “What I am advocating is more of a pragmatic approach that involves combining the interventions in different ways to meet the needs of the child or the caregiver. I think that will build better interventions.”

Image Caption: Assistant psychology professor Brooke Ingersoll is calling for a new method of treating children with autism. Photo by G.L. Kohuth


Herbal Use Popular Amongst Pregnant Women

According to a new study, roughly 1 in 10 pregnant women in the U.S. expose their unborn baby to herbal products.

Researchers say that these findings are potentially concerning because the data on the safety of herbal use during pregnancy is lacking.  Also, the prevalence of the exposure was at its peak in the first 3 months of pregnancy, which is a critical period of development.

“If we assume that our study sample was representative of the 4.2 million births each year in the United States, our findings project that 9.4 percent, or potentially 395,000 U.S. births annually, will involve exposure to at least one herbal product during pregnancy,” lead author Dr. Cheryl S. Broussard, from the Centers for Disease Control and Prevention in Atlanta, told Reuters Health via email.
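
The projected count follows directly from the two figures quoted; a quick arithmetic check:

    # Check of the projection quoted above, using only the stated figures.
    births_per_year = 4_200_000  # annual U.S. births
    prevalence = 0.094           # 9.4 percent exposed to at least one herbal product
    print(round(births_per_year * prevalence))  # -> 394800, i.e. roughly 395,000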

The study was based on data from 4,239 mothers in the National Birth Defects Prevention Study who delivered live born infants with no major birth defects from 1998 to 2004. 

Of the mothers studied, 462 reported using herbal products in the 3 months before or at some point during pregnancy. The prevalence of herbal use during pregnancy was highest during the first trimester.

When the 86 mothers whose only use of herbs was herbal teas were excluded, the prevalence of herb exposure before or during pregnancy was 8.9 percent.

According to Reuters, the most commonly reported products used during early pregnancy were ginger, which helps ease morning sickness, and ephedra, an herbal stimulant that was banned in the U.S. in 2004 after reports linked it to heart attacks, strokes and at least 155 deaths.

Overall, the most commonly used products were herbal teas and chamomile, which is also supposed to ease morning sickness.

The report said that herb exposure was most common amongst women older than 30 with more than 12 years of education.

Iowa had the lowest rate of herbal use among the states studied, while Utah had the highest, at 16.5 percent.

“The fact that use of herbal products was greatest during the first trimester raises concerns about fetal safety, because this is a critical period of fetal organ development,” Broussard noted.

“Providers should inform patients that it would be prudent to err on the side of caution regarding use of herbal products just before and during pregnancy because little is known about their potential risks.”


Astronomers Catch Supernova, Observe Relativistic Expansion

An international team of scientists, including several astronomers from the Joint Institute for VLBI in Europe (JIVE) and the Netherlands Institute for Radio Astronomy (ASTRON), both located in Dwingeloo, have observed a supernova with peculiar radio emission. In a paper to be published in the January 28, 2010 issue of Nature, the team led by JIVE’s Zsolt Paragi reports, for the first time ever, detection of a relativistic outflow in a Type Ic supernova, thus supporting the link with the even more energetic Gamma Ray Bursts, some of the most energetic explosions in the Universe.

At the end of its life, the central region of a massive star collapses while its outer layer is expelled in a gigantic explosion. This phenomenon is known as a supernova. Supernova SN 2007gr was discovered in 2007 with the Katzman Automatic Imaging Telescope in California, USA. Optical observations showed that it was Type Ic, known to result from the most massive stars. Supernovae are very distant sources, and the radio emissions they produce fade quickly. Therefore, the highest-resolution imaging technique, called Very Long Baseline Interferometry (VLBI), is required to receive the extremely faint emission and reveal the details of the explosion process. Because SN 2007gr was located in a relatively nearby galaxy, closer than any other Type Ic supernova detected in the radio spectrum, it offered a unique opportunity to study this phenomenon.

With the VLBI technique, radio telescopes located up to thousands of kilometers from each other carry out measurements synchronously. Paragi’s team exploited the electronic, real-time VLBI (e-VLBI) capabilities of the European VLBI Network (EVN), by which the data are streamed in real-time from the telescopes to the central data processor at JIVE. Rapid analysis of the SN 2007gr data, obtained 22 days after the initial discovery, showed that the source was still visible in the radio spectrum, and confirmed the technical feasibility of radio observations. Based on this result, the team carried out further observations with the EVN and the Green Bank Telescope in West Virginia, USA. For the first time ever, they were able to show mildly relativistic expansion in such a source.

Although it showed peculiar radio properties, SN 2007gr was otherwise a normal Type Ic supernova. It appears that only a small fraction of the matter that was ejected in the explosion reached a velocity at least half the speed of light. According to the emerging picture, this mildly relativistic matter was collimated into a bipolar narrow cone, or jet. The team concludes that it is possible that all, or at least most, Type Ic supernovae produce bipolar jets, but the energy content of these mildly-relativistic outflows varies dramatically, while the total energy of the explosions is much more standard.

“At least a fraction of Type Ic supernovae have been thought for a long time to produce highly collimated relativistic jets,” says Paragi. “Our observations support this and provide new clues for the understanding of how supernovae explode, and how some of them may be related to the even more energetic gamma ray bursts.”

The Westerbork Synthesis Array Telescope, operated by ASTRON, played an important role in obtaining this result due to its large collecting area, which significantly improved the sensitivity of the VLBI observations. Moreover, it provided an independent measurement of the total flux density, or brightness, of the source.

These observations also showcase how the new e-VLBI services of the EVN empower astronomers to react quickly when transient events occur. “Organizing VLBI observations on a short timescale with the most sensitive radio telescopes on Earth is a challenging task,” notes JIVE director Huib Jan van Langevelde. “Using the electronic-VLBI technique eliminates some of the major issues. Moreover, it allows us to produce immediate results necessary for the planning of additional measurements. The scientific outcome from the SN 2007gr observations shows the impact of the technological development in our field in the last few years, which allows highly efficient collaboration between radio telescopes within and even outside of Europe.”

Image 1: Initial e-VLBI detection of SN 2007gr with the EVN on September 6-7, 2007 (colors). The EVN and Green Bank Telescope VLBI image obtained on November 5-6, 2007 is overlaid (contour representation). By the time of the second observation the source had expanded and was no longer consistent with an unresolved object as bright as indicated by the independent WSRT measurements. At the distance of the supernova this is consistent with an expansion velocity higher than half of the speed of light.

Image 2: Telescopes that participated in the e-VLBI observations of SN2007gr, and the network paths by which data is streamed to the correlator at JIVE.

Image 3: Westerbork Synthesis Radio Telescope, located near Westerbork, the Netherlands.


Baby Boomers’ Hearing Better Than Expected

Despite warnings over loud rock music damaging their ears, the baby boomer generation appears to have better hearing than their parents did.

A new study suggests that the rate of hearing problems at ages 45 to 75 has been dropping for years, at least among white Americans.

“I’m less likely to have a hearing loss when I get to be 70 years old than my grandmother did when she was 70,” Karen Cruickshanks of the University of Wisconsin-Madison, told the Associated Press.

Cruickshanks, the author of the study, is a baby boomer who said she remembers taking criticism from her mother for listening to loud music.

Her research has shown that what people do and experience may help them prevent or delay hearing loss as they get older. 

Experts have theorized that there might be several reasons for the finding, like fewer noisy jobs and better ear protection at worksites, as well as immunizations and antibiotics that prevented certain diseases.

Although experts praise the work, they agree that scientists must now study whether the pattern holds up outside its largely white participants.

They said the results do not mean it is safe to listen to music loudly from an iPod for hours.

Cruickshanks, along with colleagues, reported their findings in the American Journal of Epidemiology.

The team analyzed results of hearing tests given to about 5,300 people who were at least 45 years of age. The tests were taken between 1993 and 2008, and many participants were tested at five-year intervals.

The researchers looked at how many tests showed at least mild hearing loss, and then checked whether the rate of impairment at given ages was affected by when the person was born.

Men showed an average 13 percent drop in the risk of impairment for every five-year increase in their birth date. In women, the decrease was about 6 percent.
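
If those per-cohort declines compound multiplicatively – an assumption made for this sketch, not a calculation from the paper – the cumulative effect across a generation is striking:

    def relative_risk(drop_per_5yr, years_later):
        # Relative risk of hearing impairment at a given age for someone born
        # `years_later` years later, assuming the per-5-year drop compounds.
        return (1.0 - drop_per_5yr) ** (years_later / 5.0)

    # Men (13% drop per five years): a man born 25 years later carries about
    # half the risk at the same age; for women (6% drop), about three-quarters.
    print(round(relative_risk(0.13, 25), 2))  # -> 0.5
    print(round(relative_risk(0.06, 25), 2))  # -> 0.73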

The team is now trying to uncover reasons for the decline.

Cruickshanks said the explanation would probably be complex and hard to find because the pattern has been going on for decades.

However, she said factors might include fewer people with long-term exposure to very loud noises at work, as well as a decline in smoking.  She added that changes in health care, including immunizations and use of antibiotics could also play a role.

Elizabeth Helzner, an epidemiologist who studies age-related hearing loss at the State University of New York Downstate Medical Center in Brooklyn, said the study is “very impressive.”

She said the findings make sense in light of declines in long-term exposure to loud noise without ear protection in the workplace, and perhaps in hunting and battle.

According to Helzner, those exposures would have happened more to men than women, which would help explain why the results were more dramatic in men.

She added that another factor could be better control of diabetes and heart disease, which are both linked to hearing loss.

Helzner questions whether the decline will hold for today’s youth, whose ears are exposed to sound for long stretches through earbuds. She said that such chronic exposure might prove more hazardous than the briefer bouts baby boomers had.


1-in-10 Children Hear Voices

According to a new study, nearly 1 in 10 seven- to eight-year-olds hear voices that are not actually there.

However, the team’s results showed that most children who hear voices do not find them troubling or disruptive to their thinking.

“These voices in general have a limited impact in daily life,” Agna A. Bartels-Velthuis of University Medical Center Groningen in The Netherlands wrote in an email to Reuters Health.

She added that parents whose children hear voices should not be too concerned.

“In most cases the voices will just disappear. I would advise them to reassure their child and to watch him or her closely.”

The researchers wrote in the British Journal of Psychiatry that up to 16 percent of mentally healthy children and teens might hear voices. They said that although hearing voices can signal a heightened risk of schizophrenia and other psychotic disorders later in life, a large majority of young people who have these experiences never become mentally ill.

The researchers studied 3,870 Groningen primary schoolers.  Each child was asked whether they had heard “one or more voices that only you and no one else could hear” in the past.

Nine percent of the children answered yes to the question. Only 15 percent of those children said the voices caused them to suffer, and 19 percent said the voices interrupted their thinking.

Although girls were more likely to report suffering and anxiety due to the voices, boys and girls were equally likely to report having heard voices.

Bartels-Velthuis and her team found no link between complications in the womb or in early infancy and the likelihood of hearing voices. She and her colleagues had expected that hearing voices would be more common among urban children than among their rural peers, “but to our surprise, the contrary was the case in our sample. We have no explanation for this finding.”

The researchers found that although urban children were less likely to hear voices, those who did were more troubled by them.

The team said that this greater severity suggests that the urban children who heard voices might be at higher risk of going on to develop psychotic illness.

The researchers are now conducting a five-year follow-up study of the children to see how the voice-hearing plays out and what effect it has on behavior.

Female Athletes Injured More Than Male Athletes

Cause: Many athlete training programs based on research using male athletes

Female athletes experience dramatically higher rates of specific musculoskeletal injuries and medical conditions compared to male athletes, according to exercise physiologist Vicki Harber in the Faculty of Physical Education and Recreation at the University of Alberta.

According to her paper, depending on the sport, there can be a two- to sixfold difference in these types of injuries between male and female athletes. That’s because many training programs developed for female athletes are built on research using young adult males and don’t take the intrinsic biological differences between the sexes into account.

Harber has authored a comprehensive guide for coaches, parents and administrators, entitled The Female Athlete Perspective, and published by Canadian Sport for Life (CS4L), which addresses these and other medical issues known to influence women’s participation in sport.

The paper is based on a thorough review of the current literature on the subject, Harber’s extensive knowledge as a researcher in female athlete health and her work in the development of female athletes.

Musculoskeletal injuries, particularly knee and shoulder injuries, are most prevalent, with increased probability of re-injury, says Harber, noting that many of these injuries are preventable. Building awareness about appropriate support for young female athletes and changes to training programs are critical to help them reach their athletic and personal potential, injury-free.

Harber found the risk of the Female Athlete Triad, three separate but interrelated conditions of disordered eating, amenorrhea and osteoporosis, is another area that urgently needs attention for young female athletes.

For female athletes to thrive injury-free, attention must be paid to proper nutrition to ensure both athletic performance and the healthy reproductive function associated with bone health and overall wellbeing, Harber found.

Allergy-Related Disorders In Children

Allergies and asthma are a continuing health problem in most developed countries, but just how do these ailments develop over the course of a childhood? In a population-based study designed to help answer this question, researchers at the Norwegian University of Science and Technology (NTNU) found that 40 per cent, or two of five, of nearly 5,000 two-year-olds had at least one reported allergy-related disorder. The most common symptom was wheezing, which was reported in 26 per cent of all children in the study, says Ingeborg Smidesang, a PhD candidate in the university’s Faculty of Medicine and the primary author of the study.

Researchers are careful to point out that there is no guarantee that children who wheeze at two years old will grow up with asthma. “One of the challenges here is that we don’t know which wheezers will develop asthma”, Smidesang says.

The findings are among the first to illustrate the scope of allergy-related problems in such a young group of children, and the challenges that these problems pose for both families and for public health systems overall. “If you think about something like moderate atopic eczema, which can involve quite a few doctor’s visits, and a lot of work on the part of parents, it is quite a big deal”, she says. “This can be quite a burden.”

The study has been published in an online version of Pediatric Allergy and Immunology, a peer-reviewed academic journal. Among the findings reported is that fully 21 per cent of the 5,000 children in the study, or about 1,000 children, had been tested for allergies. Roughly 60 per cent of these 1,000 children were reported by their parents to have had a positive allergy test. However, when researchers randomly selected 390 children for allergy testing, only eight per cent had a positive test. The allergy-related disorders studied were eczema, asthma, asthma-like symptoms and hay fever. Researchers found that boys were more likely than girls to have an allergy-related disorder, Smidesang said.
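
The gap between parent-reported and directly measured allergy rates is easier to appreciate as raw counts. A rough back-of-envelope sketch in Python, using only the percentages quoted above:

    # Rough arithmetic from the figures in the text; illustrative only.
    tested = 1000                    # about 21 per cent of the ~5,000 children
    parent_reported = 0.60 * tested  # parents reported a positive allergy test
    sample = 390                     # children randomly selected for testing
    measured = 0.08 * sample         # actually tested positive
    print(f"parent-reported positives: ~{parent_reported:.0f} of {tested}")
    print(f"measured positives: ~{measured:.0f} of {sample}")
    # roughly 600 of 1,000 by parent report versus about 31 of 390 when tested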

Allergy-related disorders vary widely within countries and between countries. For example, children in northern Norway are more likely than children in southern Norway to have atopic dermatitis, Smidesang said, probably because the winters are longer in the north than in the south. Another comparison, between Sweden and the UK in 2002-2003, showed that the rate of asthma symptoms in children was roughly 10 per cent in Sweden compared to 21 per cent in the UK. Researchers can make conjectures about what causes these variations, but the bottom line is that they do not yet understand what causes children to develop allergies or what can be done to prevent them.

Smidesang’s study is a part of a larger effort called PACT (Prevention of Allergy among Children in Trondheim), which began in 2000 to try to better understand how allergy-related symptoms develop in children and to investigate the effectiveness of risk-factor intervention, including increasing omega-3 fatty acid intake and reducing parental smoking and indoor dampness. A control group of 14,000 children, from which the current study is drawn, was established to track fluctuations in risk factor levels and to provide comparison data. A second group of roughly 3,000 children was recruited for a proactive intervention effort. The program started during pregnancy and continued until the children reached the age of 2. The 390 children who were randomly selected for skin prick allergy testing will be followed up when they are 6 years old.

Full bibliographic information: Allergy related disorders among 2-yrs olds in a general population. The PACT Study. Ingeborg Smidesang, Marit Saunes, Ola Storrø, Torbjørn Øien, Turid Lingaas Holmen, Roar Johnsen and Anne Hildur Henriksen. Pediatric Allergy and Immunology, published online 9 Dec 2009. DOI: 10.1111/j.1399-3038.2009.00954.x

Image Caption: Allergies in children. Not all children who are born with allergies will continue to have allergies into adulthood. Determining which children will be afflicted as adults remains a continuing challenge. Photo credit: NTNU Info/Jens Søraa

Psychodynamic Psychotherapy Brings Lasting Benefits

Psychodynamic psychotherapy is effective for a wide range of mental health symptoms, including depression, anxiety, panic and stress-related physical ailments, and the benefits of the therapy grow after treatment has ended, according to new research published by the American Psychological Association.

Psychodynamic therapy focuses on the psychological roots of emotional suffering. Its hallmarks are self-reflection and self-examination, and the use of the relationship between therapist and patient as a window into problematic relationship patterns in the patient’s life. Its goal is not only to alleviate the most obvious symptoms but to help people lead healthier lives.

“The American public has been told that only newer, symptom-focused treatments like cognitive behavior therapy or medication have scientific support,” said study author Jonathan Shedler, PhD, of the University of Colorado Denver School of Medicine. “The actual scientific evidence shows that psychodynamic therapy is highly effective. The benefits are at least as large as those of other psychotherapies, and they last.”

To reach these conclusions, Shedler reviewed eight meta-analyses comprising 160 studies of psychodynamic therapy, plus nine meta-analyses of other psychological treatments and antidepressant medications. Shedler focused on effect size, which measures the amount of change produced by each treatment. An effect size of 0.80 is considered a large effect in psychological and medical research. One major meta-analysis of psychodynamic therapy included 1,431 patients with a range of mental health problems and found an effect size of 0.97 for overall symptom improvement (the therapy was typically once per week and lasted less than a year). The effect size increased by 50 percent, to 1.51, when patients were re-evaluated nine or more months after therapy ended. The effect size for the most widely used antidepressant medications is a more modest 0.31. The findings are published in the February issue of American Psychologist, the flagship journal of the American Psychological Association.
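
For readers unfamiliar with the metric, the effect sizes quoted here are of the Cohen’s d type: the difference between two group means divided by their pooled standard deviation. A minimal sketch with made-up numbers, not data from any of the meta-analyses:

    # Cohen's d for two hypothetical groups of symptom scores
    # (lower = fewer symptoms); all numbers are invented for illustration.
    import statistics

    def cohens_d(group_a, group_b):
        n1, n2 = len(group_a), len(group_b)
        var1 = statistics.variance(group_a)
        var2 = statistics.variance(group_b)
        pooled_sd = (((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)) ** 0.5
        return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

    untreated = [30, 28, 35, 31, 29, 33]  # hypothetical symptom scores
    treated = [22, 20, 27, 24, 21, 25]    # hypothetical symptom scores
    print(f"d = {cohens_d(untreated, treated):.2f}")  # ~2.99 with these numbers

By that yardstick, the 0.97 and 1.51 figures above are large effects, while the 0.31 reported for antidepressants is conventionally read as small to moderate.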

The eight meta-analyses, representing the best available scientific evidence on psychodynamic therapy, all showed substantial treatment benefits, according to Shedler. Effect sizes were impressive even for personality disorders, deeply ingrained maladaptive traits that are notoriously difficult to treat, he said. “The consistent trend toward larger effect sizes at follow-up suggests that psychodynamic psychotherapy sets in motion psychological processes that lead to ongoing change, even after therapy has ended,” Shedler said. “In contrast, the benefits of other ’empirically supported’ therapies tend to diminish over time for the most common conditions, like depression and generalized anxiety.”

“Pharmaceutical companies and health insurance companies have a financial incentive to promote the view that mental suffering can be reduced to lists of symptoms, and that treatment means managing those symptoms and little else. For some specific psychiatric conditions, this makes sense,” he added. “But more often, emotional suffering is woven into the fabric of the person’s life and rooted in relationship patterns, inner contradictions and emotional blind spots. This is what psychodynamic therapy is designed to address.”

Shedler acknowledged that there are many more studies of other psychological treatments (other than psychodynamic), and that the developers of other therapies took the lead in recognizing the importance of rigorous scientific evaluation. “Accountability is crucial,” said Shedler. “But now that research is putting psychodynamic therapy to the test, we are not seeing evidence that the newer therapies are more effective.”

Shedler also noted that existing research does not adequately capture the benefits that psychodynamic therapy aims to achieve. “It is easy to measure change in acute symptoms, harder to measure deeper personality changes. But it can be done.”

The research also suggests that when other psychotherapies are effective, it may be because they include unacknowledged psychodynamic elements. “When you look past therapy ‘brand names’ and look at what the effective therapists are actually doing, it turns out they are doing what psychodynamic therapists have always done: facilitating self-exploration, examining emotional blind spots, understanding relationship patterns.” Four studies of therapy for depression used actual recordings of therapy sessions to study what therapists said and did that was effective or ineffective. The more the therapists acted like psychodynamic therapists, the better the outcome, Shedler said. “This was true regardless of the kind of therapy the therapists believed they were providing.”

The Evolution And Spread Of Drug-Resistant Bacteria

An international team of researchers has used high resolution genome sequencing to track a particularly virulent strain of MRSA as it traveled between South America, Europe and Southeast Asia. The findings shed light on how these deadly bacteria are able to spread from patient to patient in a single hospital and, on a larger scale of geography and time, between countries and entire continents.

The researchers included scientists from Rockefeller University, the Wellcome Trust Sanger Institute and the University of Bath in the United Kingdom, Instituto de Tecnologia Química e Biológica (ITQB) in Portugal and a hospital in Thailand.

“MRSA is responsible for over 18,000 fatalities in the United States each year according to CDC estimates, a number virtually identical to the current fatality rate of AIDS in the USA,” says Alexander Tomasz, who is Dr. Plutarch Papamarkou Professor and head of the Laboratory of Microbiology and Infectious Disease at Rockefeller.

Earlier studies by Rockefeller and ITQB scientists demonstrated that the most successful MRSA strains belong to a limited number of families, or clones, that are responsible for the overwhelming majority (more than 80 percent) of all MRSA disease in hospitals worldwide.

In the new research, the scientists focused on one of the most successful MRSA clones, called the Brazilian MRSA, which was first identified at Rockefeller in 1995 and which has the DNA sequence type assignment ST239 (SCCmec III). Isolates of Brazilian MRSA are resistant to virtually all currently available antibacterial agents except vancomycin.

Colleagues at ITQB in Portugal and Susana Gardete, a postdoctoral fellow in the Laboratory of Microbiology and Infectious Disease at Rockefeller, prepared DNA from more than 40 of the Brazilian MRSA isolates recovered between 1982 and 2003 from a variety of sources in Europe, South America and Asia. These preparations were analyzed by colleagues at the Sanger Institute using a new, very high throughput DNA sequencing technology.

The findings reported in Science provide an unparalleled view of the evolutionary history and age of the Brazilian MRSA clone. It was possible to show that the most likely birthplace of Brazilian MRSA was actually Europe, from where it spread to South America and Asia. From there, it continued to evolve and was reintroduced to Europe at a later date.

Applying the same technology to 20 Brazilian MRSA samples recovered from individual patients in a single Thai hospital within the short timeframe of a few weeks, the scientists were able to trace with precision the patient-to-patient spread of the MRSA bacterium.

“The remarkable insights that this study provides into the stages of evolution of a major human pathogen illustrate the power of collaboration between evolutionary biologists, experts in DNA sequencing and bioinformatics, and epidemiologists who can provide carefully selected and characterized strain collections for each study,” says Tomasz.

For more than 20 years, Tomasz and Hermínia de Lencastre, a senior research associate in his laboratory, have collected isolates of MRSA from patients all over the world. These carefully characterized samples are stored in freezers at ITQB and Rockefeller as part of the CEM/NET Initiative, an ongoing international project in molecular epidemiology first organized by de Lencastre and Tomasz in 1995.

“The application of full genome sequencing described in the Science report provides us with a view of how MRSA evolves on two different scales of time and geography,” says de Lencastre. “It not only documents evolution on the timescales of decades and over the geography of entire continents, but also on the shorter timescale of a few weeks within the confines of a single hospital in Thailand.”

“It would be interesting to add to these two stories a third one in which we applied full DNA sequencing on an even shorter scale of time and space,” says Tomasz. “In a recent study published in PNAS in 2007 we were able to track the in vivo evolution of multidrug resistance in a single MRSA lineage recovered from a patient undergoing a three-month course of chemotherapy.”

Reference: Science 327: 469-474 (January 21, 2010). Evolution of MRSA During Hospital Transmission and Intercontinental Spread. Simon R. Harris, Edward J. Feil, Matthew T. G. Holden, Michael A. Quail, Emma K. Nickerson, Narisara Chantratita, Susana Gardete, Ana Tavares, Nick Day, Jodi A. Lindsay, Jonathan D. Edgeworth, Hermínia de Lencastre, Julian Parkhill, Sharon J. Peacock and Stephen D. Bentley

Kids Eat Better If Menus Show Calorie Count

Few areas currently mandate nutritional information on chain restaurant menus

In a new study, the number of calories parents selected for their child’s hypothetical meal at McDonald’s restaurants fell by an average of 102 calories when the menus clearly showed the calories for each item. This is the first study to suggest that labeled menus may lead to significantly reduced calorie intake in fast food restaurant meals purchased for children. The study, led by researcher Pooja S. Tandon, MD, of Seattle Children’s Research Institute, supports nutritional menu labeling and shows that when parents have access to this information they may make smarter meal choices for their children. “Nutrition Menu Labeling May Lead to Lower-Energy Restaurant Meal Choices for Children” was published online January 25 in Pediatrics.

At a pediatric practice in Seattle, 99 parents of 3- to 6-year-olds who sometimes eat in fast food restaurants with their children were surveyed about their fast food dining habits. They were presented with sample McDonald’s restaurant menus which included current prices and pictures of items, and asked what they would select for themselves and also for their children as a typical meal. Half of the parents were given menus that also clearly showed calorie information for each item. Choices included most of the items sold at McDonald’s, including a variety of burgers, sandwiches, salads, dressings, side items, beverages, desserts and children’s “Happy Meals.” Parents who were given the calorie information chose 102 fewer calories on average for their children, compared with the group who did not have access to calorie information on their menus. This reflects a calorie reduction of approximately 20%. Notably, there was no difference in calories between the two groups for items the parents would have chosen for themselves.

“Even modest calorie adjustments on a regular basis can avert weight gain and lead to better health over time,” said Dr. Tandon, research fellow at Seattle Children’s Research Institute and the University of Washington School of Medicine. “Just an extra 100 calories per day may equate to about ten pounds of weight gain per year. Our national childhood obesity epidemic has grown right alongside our fast food consumption. Anything we can do to help families make more positive choices could make a difference. Interestingly, by simply providing parents the caloric information they chose lower calorie items. This is encouraging, and suggests that parents do want to make wise food decisions for their children, but they need help. Now that some areas are requiring nutritional information in chain restaurants, we have opportunities to further study what happens when we put this knowledge in the hands of parents.”
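
Dr. Tandon’s hundred-calories-to-ten-pounds conversion follows from the widely used, and admittedly simplified, rule of thumb that a pound of body fat stores roughly 3,500 kcal. The arithmetic is easy to check:

    # Back-of-envelope check of the calories-to-pounds figure, using the
    # common ~3,500 kcal-per-pound approximation (a simplification, not
    # exact physiology).
    extra_kcal_per_day = 100
    kcal_per_pound = 3500
    pounds_per_year = extra_kcal_per_day * 365 / kcal_per_pound
    print(f"~{pounds_per_year:.1f} lb/year")  # ~10.4 lb/year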

There was no correlation between the families’ typical frequency of fast food dining and calories selected, for either parents or children.

A growing number of jurisdictions across the country have begun mandating that nutritional information be readily available at point-of-ordering in chain restaurants. Currently more than 30 localities or states are considering policies that would require calories and other nutrition information to be clearly visible; four have already implemented such policies. Federal menu labeling standards have also been discussed as part of health care reform legislation.

Pope Urges Priests To Make Good Use Of The Internet

Pope Benedict XVI told priests who have a hard time getting their message out to their parishioners to start making ‘astute’ use of the Internet.

The pope, who has used the Web heavily in recent years, urged priests to use any and all multimedia tools available to preach the word of God and to reach out to people of other religions and cultures.

In a message released by the Vatican, Benedict XVI said that e-mail and the Web are not always enough. He feels that priests should also make use of cutting-edge technologies to express themselves and lead their communities.

“The spread of multimedia communications and its rich ‘menu of options’ might make us think it sufficient simply to be present on the Web,” but priests are “challenged to proclaim the Gospel by employing the latest generation of audiovisual resources,” he said.

Benedict said that young priests should become familiar with new media technologies while still in seminary, though he emphasized the importance of maintaining theological and spiritual principles while using media resources.

The 82-year-old pope has often been wary of new media, warning about the dangers posed by sex and violence in various media outlets. However, he has also praised the new ways of communicating as a “gift to humanity” when used to foster friendship and understanding.

The Vatican is trying hard to keep up with the rapidly changing technological world.

Last year it opened a YouTube channel as well as a portal dedicated to the pope. The Pope2You site gives news on the pope’s trips and speeches. There is even a Facebook app that allows users to send family and friends postcards containing photos of the pope and excerpts from his speeches.

Many priests and top religious figures already interact with the faithful online. One of Benedict’s advisers, Cardinal Crescenzio Sepe, the archbishop of Naples, has his own Facebook profile and so does Cardinal Roger Mahony, archbishop of Los Angeles.

Monsignor Claudio Maria Celli, who heads the Vatican’s social communications office, said that Benedict’s words were meant to encourage reflection in the church on the positive uses of new media, not to impose requirements. It is not mandatory for every priest to open a blog or a website; the message simply holds that “the church and the faithful must engage in this ministry in a digital world,” Celli told reporters.

Celli, 68, said that young priests would have no trouble following the pope’s message, but those of a certain age may find it much harder.

The Roman Catholic Church established World Communications Day, which falls on May 16, in 1966, according to the AFP news agency. This year’s theme is “The Priest and Pastoral Ministry in a Digital World: New Media at the Service of the Word.”

Pregnant Moms Should Not Smoke Marijuana

New studies show that women who smoke marijuana while pregnant may be setting up their babies for future developmental problems.

Children of heavy pot smokers may have problems such as short-term memory loss, trouble concentrating, and clouded judgment.

One study even found that young children whose mothers smoked marijuana during pregnancy had a higher risk of leukemia than those whose mothers did not.

What’s more, there’s no way to know if the pot you’re smoking has been laced with other drugs (such as PCP) or contaminated with pesticides, which would put your baby at an even greater risk.

Although smoking cigarettes while pregnant is known to impair fetal growth, studies on marijuana use have been inconclusive.

The findings of a recent study published in the Journal of the American Academy of Child and Adolescent Psychiatry, conducted by Hannan El Marroun of Erasmus University Medical Center in Rotterdam, show that marijuana use, even when restricted to early pregnancy, may have irreversible effects on fetal growth.

The study surveyed 7,500 pregnant women about their use of alcohol, tobacco and drugs, and used ultrasounds to chart fetal growth during the first, second and third trimesters.

Overall, 214 women said they had used marijuana before and during early pregnancy; 81 percent quit after learning they were pregnant, but 41 women continued to smoke marijuana throughout pregnancy.

The researchers found that, on average, marijuana users gave birth to smaller babies, particularly those who had used throughout pregnancy.

Researchers suggest the only way to prevent this is for women to completely quit smoking pot before becoming pregnant.

According to El Marroun’s team, mothers’ marijuana use could stunt fetal growth for several reasons. Like tobacco smoking, it may deprive the fetus of oxygen. It is also possible that the byproducts of marijuana directly affect the developing nervous and hormonal systems of the fetus.

Finally, the researchers note, pregnant women who use marijuana may have other factors in their lives, such as a less-than-healthy diet or chronic stress, that could contribute to poor fetal growth.

Cartilage Repair Can Improve Life, Ease Burden On Health Services

Osteoarthritis (OA) is one of the ten most disabling diseases in the developed world and is set to become more of a financial burden on health services as average life expectancy increases.

OA is the most common form of arthritis, affecting nearly 27 million Americans, or 12.1% of the adult population of the United States, according to Laurence and colleagues. A 2001 study showed that the disease costs US health services about $89.1 billion, and indirect costs relating to wages, productivity losses and unplanned home care averaged $4,603 per person.

In a review for F1000 Medicine Reports, Yves Henrotin and Jean-Emile Dubuc examine the range of therapies currently on offer for repairing cartilaginous tissue. They also consider how recent technological developments could affect the treatment of OA in elderly populations.

The most promising therapeutic technique is Autologous Chondrocyte Implantation (ACI), which involves non-invasively removing a small sample of cartilage from a healthy site, isolating and culturing cells, then re-implanting them into the damaged area.

A recent enhancement to this method is matrix-assisted ACI (MACI) – where the cultured cells are fixed within a biomaterial before being implanted to promote a smooth integration with the existing tissues. ACI and MACI have previously been reserved for younger patients who are not severely obese (i.e. with a BMI below 35), whose cartilage defect is relatively small and where other therapies have already been tried.

Professor Henrotin said: “The huge financial burden emphasizes the acute need for new and more effective treatments for articular cartilage defects, especially since there are few disease modifying drugs or treatments for OA.”

Given the encouraging results of the trials cited in this review, Henrotin says MACI/ACI therapies could be used to delay or prevent the need for total joint replacement in OA patients. However, it remains to be seen whether these techniques are superior in terms of risk and cost-effectiveness when compared with current alternatives.

While the implantation procedure needs to be simplified, and specific clinical studies on elderly patients are needed, Henrotin is optimistic about improvements currently being researched, and considers that the use of MACI/ACI “constitutes a real opportunity for such patients in the next decade.”

Stress Peptide And Receptor May Have Role In Diabetes

The neuropeptide corticotropin-releasing factor (CRF) makes cameo appearances throughout the body, but its leading role is as the opening act in the stress response, jump-starting the process along the hypothalamus-pituitary-adrenal (HPA) axis. Researchers at the Salk Institute for Biological Studies have found that CRF also plays a part in the pancreas, where it increases insulin secretion and promotes the division of the insulin-producing beta cells.

These findings, which will be published in this week’s edition of the Proceedings of the National Academy of Sciences, may provide new insights into diabetes, particularly type 1, as well as suggest novel targets for drug intervention.

The pancreas is both an exocrine gland, producing enzymes that are secreted into the gut to help digest food, and an endocrine gland, secreting a cocktail of hormones, including insulin, which is manufactured by beta cells that reside in endocrine islets within the “sea” of exocrine tissue.

Plasma glucose increases after a meal, and, in healthy people, insulin is secreted to instruct the body to take up the glucose and store it in the liver or muscles to bring blood glucose levels down. In diabetes, the glucose metabolism is misregulated: In type 1 diabetes, the immune system attacks the beta cells, which then are unable to produce sufficient insulin. In type 2 diabetes, the most prevalent form of the condition, patients have sufficient beta cells, which still secrete insulin, but the body is unable to respond correctly, and plasma glucose remains constantly elevated.

CRF, in concert with its receptor, CRFR1, has long been known as key to the body’s response to various forms of stress, but the pair is also involved in many more processes, including a number with direct ties to metabolism. As early as the 1980s, studies had suggested that pancreas cells can respond to CRF, but the few limited observations did not demonstrate the nature of the response or which cells or receptors were involved.

Prompted by evidence suggesting that CRF has an effect on beta cells, a team led by first author Mark O. Huising, Ph.D., and senior author Wylie Vale, a professor and head of the Salk Institute’s Clayton Foundation Laboratories for Peptide Biology and holder of the Helen McLoraine Chair in Molecular Neurobiology, sought to verify that effect and determine its underlying mechanism. Working with cell lines, pancreatic islets from mice and human donors, as well as mouse models, Vale’s lab, which discovered CRF in the early 1980s, conducted a series of experiments that collectively demonstrated the presence and actions of CRFR1 in the islets.

“We found that beta cells in the pancreas actually express the CRFR1 receptor,” explains Huising, a postdoctoral fellow in the Clayton Foundation Laboratories. “And once we had established the presence of CRFR1 in the islet, we started filling in the blanks, trying to learn as much about pancreatic CRFR1 as we could.”

What they discovered was that beta cells exposed to CRF, one of the peptides that activate the CRFR1 receptor, can respond in at least two ways. First, they increase their secretion of insulin if they simultaneously encounter high levels of glucose. The higher the levels of glucose, the more insulin they release in response to CRF and the more rapidly blood levels of glucose are reduced.

Working in collaboration with a group at the Panum Institute in Copenhagen, the researchers went on to establish that beta cells exposed to CRF also activate the MAPK pathway, which is a key pathway implicated in beta cell division. Mature, differentiated beta cells can divide, albeit slowly, but if they are exposed to a molecule that will activate the CRFR1 receptor, they will start to divide somewhat more rapidly, which is especially relevant in the context of type 1 diabetes.

“The thinking is that type 1 diabetic patients usually have a few beta cells left in their pancreas, so those remaining beta cells, though not enough to control glucose levels, may seed a population of regenerating beta cells,” Huising says.

While a few gut peptides termed incretins, which are currently used to increase insulin secretion in patients, have also been shown to accelerate beta cell division, the Vale group’s findings suggest an incretin-like effect for a peptide normally associated with the stress response.

“Anything we can find out that will drive proliferation or the division of beta cells is very interesting, and being able to stimulate beta cells to divide a little faster may be part of a solution that may ultimately, hopefully, allow management of type 1 diabetes,” Vale says. “But because it is an autoimmune condition, making the cells divide won’t be enough. That is why researchers are working hard to solve the problem of destruction of beta cells.”

These results emphasize the complexity of metabolic disorders and identify novel targets to treat diabetes and obesity. One of the key questions remaining for Vale and his group is under what conditions the pancreatic CRFR1 system is utilized and gets activated.

“We know what it can do, but we don’t fully understand the physiological circumstances under which it does it,” Vale says. “This receptor appears to be important within the pancreas. What we haven’t determined, though, is whether this is a stress-linked phenomenon because we still have questions regarding the source of the hormone that acts on pancreatic CRFR1. We would like to know where it is coming from to determine if it is released in stressful conditions to bring about the effects we observed.”

Study Projects Increased Conflict And Speculation In Tropical Forests Despite Copenhagen Accord

Unclear land rights, corruption threaten to undermine success of promised REDD funds

As environmental and political leaders struggle to determine how to move forward from the UN Climate Change Conference in Copenhagen, a new report by an international coalition of top forest organizations warns that the failure to set legal standards and safeguards for a mechanism to transfer funds to forest-rich nations may trigger a sharp rise in speculation and corruption, placing unprecedented pressures on tropical forest lands and the communities that inhabit them.

The report, released today by the Rights and Resources Initiative (RRI) at an event at Chatham House, concludes that unclear land rights in some countries, coupled with threats from corruption, could block success of the US$3.5 billion pledged for a program to reduce the amount of carbon in the atmosphere by preventing the unfettered destruction of tropical forests.

The authors of The End of the Hinterland: Forests, Conflict and Climate Change cite numerous studies suggesting that in 2010 the potential for enormous profits will lead to increased competition over forest resources between powerful global governments and investors on the one hand, and local actors on the other, resulting in new and resurging violent conflict.

“Throwing heaps of money into a system without agreeing to any framework or standards has the potential to unleash a wave of speculation unlike anything we’ve ever seen in our lifetime,” said Andy White, Coordinator of RRI and one of the lead authors of the report. “The result will be chaos on the carbon markets, as well as chaos in the field. It will be like the Wild, Wild West.”

Figuring prominently in the Copenhagen Accord last December, the initiative known as Reduced Emissions from Deforestation and Degradation, or REDD, was heralded as one of the rare points of consensus going into Copenhagen.

Negotiators hoped REDD might provide low-cost and easy emissions reductions and offsets for developed countries, as well as finance and investment for developing, forest-rich countries. However, their failure to agree on legal standards and safeguards for implementing REDD schemes suggests that there will be no uniformity of carbon governance across these countries. The study says that in this situation, the inevitable diversion of funds, land grabs, and conflict will limit reductions in forest emissions and greatly worsen the plight of forest peoples in the South.

“Forests will remain remote from the centers of power, but they will be carved up, controlled, and used as global political bargaining chips like never before,” said Jeffrey Hatcher, Policy Analyst for RRI and co-author of the report. “Unless governments adopt the necessary tenure and governance reforms that will lead to a reduction in emissions, the world faces a devastating back-slide into a ‘business as usual’ mode of thinking.”

The authors of the new report argue that the era of forests as “hinterlands,” or remote areas largely ignored except as a supply of cheap natural resources, is swiftly coming to an end. Commodities like food, fuel, fiber and carbon are becoming exponentially more valuable, and new global satellite technology provides the tools to monitor, assess, and potentially control forests remotely.

As forest areas boom in value, investors, traders, and northern governments will compete for these lands. Governments still declare ownership of about 65 percent of the world’s forests, while only about 9 percent are legally owned or designated for use by communities and indigenous peoples. And national and local leaders may become the target of bribery aimed at obtaining forest-related agreements that fail to consider the rights of those most affected.

The End of the Hinterland provides examples of conflicts between forest communities and outsiders:

    * In Peru, the “Bagua Massacre,” a violent clash between indigenous protestors and military police along the jungle back roads of the Peruvian Amazon, left nearly 100 dead. Sparked by the government’s allocation of ancestral forest lands for oil and gas exploitation, a coalition of indigenous groups occupied key oil installations and roads for several months in protest against a series of presidential decrees that violated their rights to these lands. After 57 days, President Alan Garcia violently evicted the protestors and upheld the decrees.

    * In India, despite the enactment in 2009 of a forest rights law that was hailed as a landmark for tribal peoples and forest dwellers, reports from the field show little real change. Minimal effort has been made to alert villagers to the law’s provisions, and those who have managed to file claims are granted only a fraction of the area under occupation or cultivation, with no opportunity to appeal. This is taking place amid escalating confrontation between Maoist rebels and the Government, and many believe the real objective of the Government’s “Operation Green Hunt” is to clear the indigenous forest-dweller population, known as adivasi, from mineral-rich lands so that the Government can hand the lands over to corporations.

“A counterbalance to these threats comes from the growing movements and high-level organization of local communities and indigenous peoples who are insisting on respect for their rights,” said Marcus Colchester of the Forests Peoples Programme. “If their rights are not safeguarded in line with international law, REDD will not work.”

Armed with new technologies and tools, such as GPS devices and GIS mapping, indigenous peoples have taken steps to obtain legal recognition of rights to their lands, especially in Latin America, though Africa and Asia remain far behind; it would take 270 years, for example, for the tenure distribution in the Congo Basin to match that of the Amazon Basin.

“The COP15 fiasco demonstrates that the existing world order offers Africa nothing,” said Kyeretwie Opoku of Civic Response-Ghana. “Africa cannot afford a ‘business as usual’ approach. Without restructuring power relations between the global North and South, between corporations and peoples, and between national elites and marginalized communities, no amount of climate or REDD funding will prevent a regional and perhaps global social disaster. No matter how challenging this might be, civil society must mobilize for an entirely new direction for our forests and our people. We really have nothing to lose.”

The report also notes encouraging signs of progress on tenure reform in countries such as China and Brazil. China’s recent forest land reform, commenced in the early 2000s, allowed collective forest owners to reallocate their use rights to households or to keep them as collective. In 2009, a national-level survey showed that these reforms have affected more than 400 million landowners and over 100 million hectares of forests, making this arguably the largest tenure reform in history.

In Brazil, the Supreme Court in March 2009 formally recognized the land rights of the Raposa Serra do Sol indigenous reserve, and a legal study of Brazilian and international law concluded that the Surui tribe can claim legal ownership of the forest-carbon rights associated with their lands in Rondônia, Brazil.

Nevertheless, despite significant advances in 2009 around tenure reform in certain regions of the world, the report concludes that for real emissions reductions around REDD programs to happen, policymakers must invest in strengthening local organizations, governance, and rights, rather than invest in the same business and development models that failed in the past.

One significant risk to the future of forests, according to the report, is relying on global institutions like the World Bank to become a conduit for REDD or other climate funds. The authors warn that as a government-owned institution with limited power to move its member countries to adopt global standards, the World Bank is inherently more likely to support the conventional status quo development agenda, rather than support the progressive change and local initiatives that are urgently needed for REDD to be effective.

“If the new funds are painted as ‘REDD’ but end up going through the old conventional development models, then we are simply engaging in ‘business as usual,'” said White. “Taking action now to ensure that the new era of forest reform will be locally-led and rights based, rather than externally controlled and corrupt, will determine whether forest communities are protected and global emissions targets are achieved.”

Zero Vision, Zero Results?

The zero vision has had its day. Ten years after the Norwegian authorities launched their zero-casualty objective for road safety, statistics have not improved.

So says Trond Åge Langeland, staff engineer at the Norwegian Public Roads Administration and a PhD graduate from the University of Stavanger. He based his thesis on interviews with 30 experts on road safety, and his conclusion is less than encouraging.

Since the mid-1990s, the number of people killed in road accidents has not decreased significantly. In 1970, 560 people were killed in traffic accidents; fifteen years on, the figure was less than 300. The National Transport Plan 2002-2011 was launched in 1999, and the zero vision with it. Since then, the number of fatalities has remained largely unchanged.

“The zero vision has drawn more attention to road safety, but it has not yielded any significant short-term gains so far,” Langeland says.

No decline

There may be a number of reasons why the number of deaths and serious injuries has stalled since the mid-1990s. It could be attributed to a series of preventive measures implemented since the 1970s, which, in spite of the traffic boom, have had a beneficial effect. Compulsory use of safety belts and safer vehicles are among them.

“Still, it is a paradox that the decrease in casualties is smaller than might be expected,” Langeland says.

Until 1990, the casualty figure shrank by roughly 100 in each decade. Continuing that trend would imply approximately 150 road casualties today, not 250, which is the actual figure, he explains.
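
Langeland’s extrapolation is straightforward to reproduce. A small sketch using only the figures quoted in this article:

    # Linear extrapolation of the pre-1990 trend: roughly 100 fewer road
    # deaths per decade, starting from 560 deaths in 1970. The inputs are
    # the article's figures; the projection itself is illustrative.
    def projected_deaths(year, deaths_1970=560, drop_per_decade=100):
        return deaths_1970 - drop_per_decade * (year - 1970) / 10

    print(projected_deaths(2010))  # 160.0, near the ~150 he cites, versus ~250 observed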

Speed kills

The zero vision stemmed from a desire to further reduce the number of fatalities and severe injuries from road accidents. According to Langeland, it is to be regarded more as a vision than an actual target. It deemed the high number of road casualties unjustifiable, and it acknowledged a shared responsibility between traffic planners and road users. Vehicles, road users and infrastructure are interrelated, and this interrelatedness is best exemplified by speed limits being set on the basis of the human body’s endurance at the moment of collision: the limit should not exceed 70 km/h on roads where there is a risk of frontal collisions, 50 km/h where side collisions can occur, and 30 km/h in areas where non-motorists and vehicles may come into conflict.
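
Expressed as a simple lookup, purely to make the design principle concrete (the thresholds are those quoted above):

    # Zero-vision speed caps keyed by the dominant collision risk on a
    # stretch of road; an illustrative sketch, not an official schema.
    SPEED_CAP_KMH = {
        "frontal_collision_risk": 70,
        "side_collision_risk": 50,
        "non_motorist_conflict": 30,
    }
    print(SPEED_CAP_KMH["non_motorist_conflict"])  # 30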

“Breach of speed limits is a strong contributing factor in many road accidents. Implementing preventive measures to ensure lower speeds, such as speed caps in cars, would significantly reduce the annual number of people killed in road accidents,” Langeland says.

This is why there is an increased effort to differentiate speed limits by road type, he adds.

“Politicians could adopt unpopular initiatives such as speed caps, designed to reduce the number of traffic casualties and severely injured. We are restricted by international regulations in some areas, but draconian measures may sometimes give significant gains.”

Restrictions vs freedom

Although Langeland believes the zero vision to be an unobtainable goal, he still thinks it has something going for it.

Even though the vision does not permeate the daily work of traffic authorities and the police, it has raised awareness among the public and the safety sector. It may serve as a guiding light on the way to casualty figures lower than 250, Langeland believes.

“A future figure of 50 annual casualties is realistic. But it will require more forceful measures, which may pose a threat to people’s individual freedom and driving experience. The government and politicians alike must address this dilemma.”

Langeland points out that drunk driving is now socially unacceptable. The same, he argues, should apply to not using a safety belt and to driving above speed limits, the two main causes of road deaths today.

Future measures should concentrate on preventing head-on collisions, run-off-road accidents and accidents involving non-motorists. Physical obstacles are effective at preventing accidents, Langeland asserts. Median barriers and other installations set up to prevent head-on collisions are one such measure. Roadside terrain designed to reduce the impact of accidents is another. All such measures should be considered when new roads are being planned.

The traffic researcher offers a final word of advice: “The time lost by adhering to, and respecting, speed limits is minuscule. It really shouldn’t matter to us if we are 30 seconds late for dinner.”

Image 1: UNPOPULAR INITIATIVES CAN REDUCE THE NUMBER OF CASUALTIES: “Politicians could adopt unpopular initiatives such as speed caps, designed to reduce the number of traffic casualties and severely injured. We are restricted by international regulations in some areas, but draconian measures may sometimes give significant gains.” Photo: Elisabeth Tønnessen

Image 2: THE ZERO VISION: The zero vision stemmed from a desire to further reduce the number of fatalities and severe injuries from road accidents. Pictured are the shoes of a young passenger at Trafikksikkerhetshallen at Forus, near Stavanger and Sandnes. Photo: Elisabeth Tønnessen

Tobacco Plants Protect Themselves By Changing Flowering Times

Messenger molecule in oral secretions of herbivorous insects changes flower opening time of their host plants: Hummingbirds take over role as pollinators from moths

Butterflies and moths are welcome visitors to many plant species. Plants attract insect pollinators with the colors, forms, nectars and scents of their flowers to ensure fertilization and reproduction. However, female moths also pose a threat to the plant: once attracted by the flower’s scent, they lay their eggs on the green leaves, and soon voracious young caterpillars hatch. Scientists from the Max Planck Institute for Chemical Ecology have now discovered how tobacco plants successfully resolve this dilemma. The researchers found that herbivory changed the opening time of the flower buds from dusk to dawn. In addition, the emission of flower scents was dramatically reduced. This change in flower timing was elicited by specific molecules in the oral secretions of the larvae and required the jasmonate signaling cascade, which is known to elicit a host of other defense responses in plants. Instead of night-active moths, these morning-opening flowers attract day-active hummingbirds, which can also transfer pollen but pose no threat to the plant.

Outbreak of tomato hornworms

During field experiments performed in the summer of 2007 by PhD students of the Department of Molecular Ecology, headed by Prof. Ian T. Baldwin, in the Great Basin Desert of Utah (USA), a massive outbreak of tomato hornworms (Manduca quinquemaculata) occurred. Almost every tobacco plant of the native species Nicotiana attenuata on the field site was attacked by these herbivores, which prefer plants of the nightshade family. Danny Kessler studied the infested plants intensively and noticed that many of their flowers opened after sunrise, although tobacco is typically a night-flowering plant and usually opens its flower buds after sunset. This finding led to experiments over the following two years showing that the 12-hour shift in flowering time was directly related to herbivory.

Pollination wanted, but no oviposition

Ecologists had already noticed that female moths attracted for pollination laid their eggs on the plants, and soon leaf-eating larvae hatched to feed on the same plant. The scientists wondered whether plants would really accept this life-threatening disadvantage without reserve, just for the sake of pollination. They intensively studied the remarkable morning-opening flowers (MoF), which were produced only by plants that had been attacked by insect larvae, and compared them to the usual night-opening flowers (NoF). The first experiment already revealed an astounding result: MoF no longer emitted the attractant benzyl acetone (see also Kessler et al., Science 321, 2008), and the sugar concentration in the floral nectar was considerably reduced. Furthermore, it was striking that the petals of MoF opened to only a third of the size of NoF. All in all, the MoF were rendered practically unnoticeable to the moths; however, they could become interesting to different pollinators living near the field station: hummingbirds.

Hummingbirds visit the morning-opening flowers and serve as pollinators

To find out whether moths or birds successfully transferred pollen from flower to flower, the scientists determined the outcrossing rate of plants visited by moths or hummingbirds in field experiments. They removed the anthers from young flower buds to rule out self-pollination. Then an unattacked and an insect-attacked tobacco plant were covered with a mesh-covered wire cage until the morning of the next day to exclude night-active pollinators. A second pair of plants remained uncovered and thereby accessible to night-active pollinators. Before dawn the cages were exchanged, so that the plants that had been uncovered during the night were now covered, and the plants that had been covered at night became accessible to pollinators during the day. In the evening all experimental plants were covered and remained so until seed capsules were produced. Counting the capsules revealed that on plants that had not been attacked by caterpillars, a significant majority of capsules originated from flowers pollinated during the night between 8:00 p.m. and 6:00 a.m., whereas on caterpillar-infested plants most successful pollination had occurred during the day between 6:00 a.m. and 8:00 p.m., and therefore by hummingbirds.

The scientists verified that hummingbirds do indeed visit the MoF and drink their nectar by directly observing more than 1,000 flowering wild tobacco plants. Eighteen hummingbird visits were studied intensively, all of them to larvae-infested plants. In fact, more than 90% of the birds preferred MoF over NoF, even when only a few MoF were present on a plant. “It is likely that the hummingbirds can recognize the special shape of the partially open corollas of the MoF in the morning and associate these characteristics with the reliable quality and quantity of the nectar in these flowers,” says Celia Diezel, co-author of the study.

Experiments using larval oral secretions and transgenic tobacco plants

In further experiments the scientists studied how attacked plants recognize herbivory and subsequently change the developmental program of the flowers to favor hummingbirds. Instead of infesting the plant by putting caterpillars on the leaves, the researchers mechanically wounded a leaf with a pattern wheel and applied oral secretions from hornworm larvae to the wounds. The plant reacted as it would after a direct insect attack: after approximately three days it produced more morning-opening flowers than non-induced plants. “Maybe the fatty acid-amino acid conjugates present in the oral secretions of the larvae elicit this reaction. We already know that they switch on the plant’s defense against herbivory, for instance by producing toxic substances to fend off the attacker,” explains Danny Kessler, a PhD student at the institute. In an additional experiment he used genetically modified tobacco in which the signaling pathway between the messenger molecule in the oral secretion and the defense reaction was interrupted; these plants were unable to produce jasmonate, a plant hormone that initiates plant defense responses. Indeed, the transgenic jasmonate-deficient plants used in the field experiment did not produce MoF after induction with oral secretions, but did so when sprayed with jasmonate, showing that the reprogramming of flower production is tied to the same pathway that switches on defense mechanisms.

Why do plants risk attracting tomato hornworm moths as pollinators, although the insects’ larvae feed on the plants? “We cannot answer this question from the perspective of one single plant, but, if at all, from an evolutionary and ecological background,” says Ian Baldwin. Wild tobacco populations grow on vast areas after fires, comparable to synchronized monocultures with thousands of widespread plants. Hummingbirds may not be the most reliable pollination service the plant species needs for outcrossing and reproduction. Using volatiles, the plants can attract moths from large distances, whereas hummingbirds are only available if their nests happen to be in the vicinity of the tobacco populations. Moreover, given the special mode of hummingbird pollination, it is more likely that flowers of one single plant are pollinated with pollen from the same plant than from flowers of different plants, which can decrease the genetic variability of the seeds produced. Moths may move more frequently among plants, and this behavior may result in greater genetic variability in the seeds produced from their pollination services. [JWK/AO]

Original Publication: Danny Kessler; Celia Diezel; Ian T Baldwin: Changing pollinators as a means of escaping herbivores. Current Biology, Online First, January 21, 2010, DOI 10.1016/j.cub.2009.11.071

Image Caption: Wild tobacco (Nicotiana attenuata), native in North America, is flowering during the nighttime and attracts night-active moths as pollinators by emitting the attractant benzyl acetone. However, as soon as female moths start laying their eggs on the plant and the young caterpillars become a serious danger, the plant postpones the opening time of the flowers by 12 hours to dawn and additionally stops producing benzyl acetone. Moths stay away and hummingbirds take over pollination. Credit: Danny Kessler, Max Planck Institute for Chemical Ecology, Jena, Germany

Blueberry Juice Improves Memory In Older Adults

Scientists are reporting the first evidence from human research that blueberries, one of the richest sources of healthful antioxidants and other so-called phytochemicals, improve memory. They said the study establishes a basis for comprehensive human clinical trials to determine whether blueberries really deserve their growing reputation as a memory enhancer. A report on the study appears in ACS’ Journal of Agricultural and Food Chemistry, a bi-weekly publication.

Robert Krikorian and colleagues point out that previous studies in laboratory animals suggest that eating blueberries may help boost memory in the aged. Until now, however, there had been little scientific work aimed at testing the effect of blueberry supplementation on memory in people.

In the study, one group of volunteers in their 70s with early memory decline drank the equivalent of 2 to 2 1/2 cups of a commercially available blueberry juice every day for two months. A control group drank a beverage without blueberry juice. The blueberry juice group showed significant improvement on learning and memory tests, the scientists say. “These preliminary memory findings are encouraging and suggest that consistent supplementation with blueberries may offer an approach to forestall or mitigate neurodegeneration,” said the report. The research involved scientists from the University of Cincinnati, the U.S. Department of Agriculture, and the Canadian department of agriculture.

Image Caption: A few glasses of blueberry juice a day may help improve memory in older adults. Credit: iStock

On the Net:

Female Hormone May Protect Women From Psychosis

Many American women are prescribed estrogen to combat the negative effects of menopause, such as bone loss and mood swings. Now, new evidence from a Tel Aviv University study suggests that hormone replacement therapy might also protect them, and younger women, from schizophrenia.

Prof. Ina Weiner of Tel Aviv University’s Department of Psychology and her doctoral student Michal Arad have reported findings suggesting that restoring normal levels of estrogen may work as a protective agent in menopausal women vulnerable to schizophrenia. Their work, based on an animal model of menopausal psychosis, was recently reported in the journal Psychopharmacology.

“We’ve known for some time that when the level of estrogen is low, vulnerability to psychotic symptoms increases and anti-psychotic drugs are less likely to work. Now, our pre-clinical findings show why this might be happening,” says Prof. Weiner.

A hormonal treatment to address a behavioral condition

In their study, Weiner and Arad removed the ovaries of female rats to induce menopause-like low levels of estrogen and showed that this led to schizophrenia-like behavior. The researchers then tried to eliminate this abnormal behavior with an estrogen replacement treatment or with the antipsychotic drug haloperidol. Estrogen replacement therapy effectively alleviated schizophrenia-like behavior but haloperidol had no effect on its own. Haloperidol regained its effect in these rats when supplemented by estrogen.

“When the level of estrogen was low, we could see psychotic-like behavior in the animals. Moreover, the sensitivity to psychosis-inducing drugs went up, while the sensitivity to anti-psychotic drugs went down,” Prof. Weiner says. “This is exactly what we observe in women with low estrogen levels. But we also found that estrogen, all by itself, combats psychosis in both male and female rats.” Furthermore, in low amounts estrogen increases the effectiveness of anti-psychotic drugs.

Prof. Weiner points out that the medical community is hotly debating the pros and cons of estrogen replacement as an add-on to conventional treatment in schizophrenia. Detractors point to higher chances of cervical cancer and heart attacks in those who receive estrogen supplements. But her study, which looked at very specific factors possibly related to schizophrenia, suggests that estrogen replacement therapy could have positive behavioral effects, she concludes.

Assessing the possibility for prevention

During the course of a woman’s lifetime, estrogen levels do not remain constant. During her reproductive years, these levels are affected by the menstrual cycle. There are also dramatic changes in estrogen levels just after a woman gives birth, a change that can trigger the “post-partum blues” and in extreme cases lead to clinical depression and psychosis.

As a preventative therapy, estrogen could be given to women at certain points in time when they are most at risk for schizophrenia, Prof. Weiner suggests: in their mid-twenties and later during the menopausal years.

“Antipsychotic drugs are less effective during periods of low estrogen in the body, after birth and in menopause,” says Prof. Weiner. “Our research links schizophrenia and its treatment to estrogen levels. Men seem less likely to develop schizophrenia after their 40s, which also suggests that estrogen is the culprit.”

Prof. Weiner is continuing her research.

On the Net:

Counterfeit Internet Drugs Pose Significant Risks And Discourage Vital Health Checks

Men who buy fake internet drugs for erection problems can face significant risks from potentially hazardous contents, and bypassing healthcare systems could leave associated problems like diabetes and high blood pressure undiagnosed. That’s the warning just published online by IJCP, the International Journal of Clinical Practice.

Medical and pharmaceutical experts from the UK, Sweden and USA carried out a detailed review of the growing problem of counterfeit drugs. Estimates suggest that up to 90 per cent of these illegal preparations are now sold on the internet.

The review, which covers more than fifty studies published between 1995 and 2009, provides a valuable overview of the scale of counterfeit internet drugs, with a specific focus on erectile dysfunction (ED) drugs.

These have played a key role in driving the growth of counterfeit drugs, with studies suggesting that as many as 2.3 million ED drugs are sold a month, mostly without prescription, and that 44 per cent of the Viagra offered on the internet is counterfeit.

“The presence of unknown pharmaceutically active ingredients and/or impurities may lead to undesirable and serious adverse events, even death,” warns lead author and journal editor Graham Jackson, a London-based cardiologist.

“We discovered that 150 patients had been admitted to hospitals in Singapore after taking counterfeit tadalafil and herbal preparations that claimed to cure ED. Seven were comatose, as the preparations contained a powerful drug used to treat diabetes, and four subsequently died.”

But it’s not just erectile dysfunction drugs that pose a risk, as Dr Jackson points out: “In Argentina, two pregnant women died after being given injections of a counterfeit iron preparation for anaemia and 51 children died in Bangladesh of kidney failure after taking paracetamol syrup contaminated with diethylene glycol, which is widely used as car antifreeze.”

Other examples include fake contraceptive and antimalaria pills, counterfeit antibiotics and a vaccine for life-threatening meningitis that only contained water.

The US-based Center for Medicine in the Public Interest estimates that the global sale of counterfeit drugs will reach $75 billion in 2010, a 92 per cent increase in five years.

Counterfeit seizures in the European Union (EU) quadrupled between 2005 and 2007 and the number of drug fraud investigations carried out by the US Food and Drug Administration rose 800 per cent between 2000 and 2006.

ED drugs are the most commonly counterfeited product seized in the EU due to their high cost and the embarrassment associated with the underlying condition. Some estimates suggest that as many as 2.5 million men in the EU are using counterfeit Viagra.

Analysis of counterfeit ED drugs has shown that some contain active ingredients, while others contain potentially hazardous contaminants.

Pfizer, which manufactures Viagra, analysed 2,383 suspected counterfeit samples forwarded to the company by law enforcement agencies between 2005 and 2009. It found that a Hungarian sample contained amphetamine, a UK sample contained caffeine and bulk lactose, and that printer ink had been used to color some samples blue. Other samples contained metronidazole, which can have significant adverse effects when combined with alcohol.

And a study of 370 seized “Viagra” samples carried out by the Dutch National Institute for Public Health found that only 10 were genuine, with a range of other drugs present in the samples.

“As well as the risk posed by unknown ingredients, internet drugs circumvent traditional healthcare, and this poses its own risks, as underlying health conditions could go undiagnosed if people don’t seek medical advice,” says Dr Jackson.

The World Health Organization states that counterfeit medicines are a threat to communities and must be stopped, and there is a general consensus that steps need to be taken to tackle the problem.

“However, obstacles to effective action include the lack of a clear worldwide consensus on what constitutes a counterfeit drug and the fact that activities that are illegal in one country may be legal in another,” says Dr Jackson.

“In some cases producing counterfeit medicine can be ten times as profitable per kilogram as heroin, yet in the UK someone can face greater legal sanctions if they produce a counterfeit T-shirt.

“What is clear is that we need much greater public awareness of the risks of buying counterfeit drugs, as lives are at risk.

“It is essential that healthcare clinicians get that message across.”

On the Net:

Fertility Drugs Contribute To Multiple Births

March of Dimes concerned about drugs’ role in growing crisis of prematurity

The widespread use of so-called fertility drugs, not just high-tech laboratory procedures, likely plays a larger role than previously realized in the growing problem of premature births in the United States, because these drugs cause a high percentage of multiple births, the March of Dimes said today.

The organization’s comments came in response to a study published in the American Journal of Epidemiology by authors from the Centers for Disease Control and Prevention (CDC) and the March of Dimes that found controlled ovarian hyperstimulation (COH) drugs — used to stimulate a woman’s ovaries to speed the maturation and multiply the production of eggs — account for four times more live births than assisted reproductive technologies (ARTs) such as in vitro fertilization.

“Many people have focused on the role of ARTs in multiples and have not fully appreciated that fertility drugs alone are responsible for one out of every five multiple births,” said Alan R. Fleischman, M.D., medical director of the March of Dimes. “COH drugs are widely prescribed, and some health care professionals — and their patients — are not aware of the serious risks of fertility drugs to women and their babies. There is a very high possibility of multi-fetal pregnancy resulting from use of these drugs, and that brings a high risk of prematurity and lifelong health problems for the babies as a consequence.”

“The March of Dimes urges more research and leadership from professional societies to develop specific guidelines and encourage acceptance of best practices for the proper use and dosage of fertility drugs, as well as the careful counseling and monitoring of women treated with these drugs. Women who are taking fertility drugs should always ask their doctor what they can do to prevent having a multi-fetal pregnancy,” Dr. Fleischman said.

Dr. Fleischman noted that approximately 88,000 babies are born preterm annually as a result of the recent increase of twins, triplets, and other multiple births. About 60 percent of twins, more than 90 percent of triplets, and virtually all quadruplets and higher-order multiples are born prematurely, he noted. In addition to the increased risks associated with multiple birth, studies have also suggested that even infants born singly, but conceived through ovulation stimulation, are at greater risk of preterm delivery than naturally conceived single births, the study authors pointed out.

Dr. Fleischman said it is critical for the American Society for Reproductive Medicine, the American College of Obstetricians and Gynecologists, and other clinical societies to develop clear guidelines on the use of fertility drugs to help prevent many premature births.

The study found that 4.6 percent of live births in 2005 resulted from fertility drug use, a figure four times higher than the 1.2 percent of births resulting from ARTs. A total of 22.8 percent of babies born as multiples were conceived using fertility drugs alone.

The study authors conclude that more than 190,000 infants per year are conceived with fertility drug use, but also say this figure is an underestimate because there is no system for population-based surveillance of births resulting from fertility drug treatment.

“The estimates from this analysis, together with separate published estimates from the National ART surveillance system, indicate that in all, approximately 6 percent of US infants are now exposed to ovulation stimulation treatments,” stated Laura Schieve, epidemiologist at the CDC’s National Center on Birth Defects and Developmental Disabilities. “Thus, we must continue to study both the short- and long-term health outcomes among the many women treated and the many children annually conceived with these infertility treatments.”

More than 540,000 babies are born too soon each year in the U.S. Preterm birth costs the nation more than $26 billion annually, according to the Institute of Medicine. It is the leading cause of newborn death, and babies who survive an early birth face the risk of lifelong health problems such as breathing problems, mental retardation, cerebral palsy, developmental delays, vision and hearing loss. Even babies born just a few weeks too soon (34-36 weeks gestation, also known as late preterm birth) have higher rates of death and disability than full-term babies.

On the Net:

Germ Killer Disinfects Surgical Instruments

A new fast-acting disinfectant that is effective against bacteria, viruses, fungi and prions could help to reduce the spread of deadly infections in hospitals, according to research published in the February issue of Journal of General Virology.

Researchers from the Robert Koch Institute in Berlin, Germany have optimized a rapid-acting, practical formula for disinfecting surgical instruments. The treatment works against a wide range of pathogens, including those that tolerate ordinary disinfectants, such as the bacterium Mycobacterium avium that causes a tuberculosis-type illness in immunocompromised individuals and enteroviruses that may cause polio.

In previous studies the team had identified a simple alkaline detergent formulation that was effective at eradicating prions from the surfaces of surgical instruments. Prions are misfolded proteins that cause BSE in cattle and CJD in humans. They are a particular problem to eliminate because they are very resistant to inactivation and can even become ‘fixed’ on surfaces by some conventional disinfectants.

In their new study, the researchers mixed the original alkaline detergent formulation with varying amounts of alcohol and tested its ability to rid surgical instruments of bacteria, viruses and fungi in addition to prions. They found that the original mixture made in 20% propanol was optimal for disinfecting instruments without fixing proteins to their surfaces.

Disinfectants are the first line of defense against the spread of hospital-acquired infections, and effective treatment of surgical instruments is vital. Prion expert Dr. Michael Beekes, who led the research together with Prof. Martin Mielke from the hygiene department of the Robert Koch Institute, explained the difficulties of finding a suitable disinfectant: “Eliminating a broad range of pathogens with one formula is not easy. Some micro-organisms such as mycobacteria, poliovirus, fungal spores and not least prions are particularly resistant to inactivation. Prions are also known for their ability to stick to rough surfaces. In addition it’s a real challenge to disinfect complex instruments used in neurosurgery, for example, because they are heat-sensitive.”

Dr. Beekes believes the new formulation could have a huge impact on hospital safety protocols. “Standard formulations that eliminate prions are very corrosive. The solution we’ve come up with is not only safer and more material-friendly but easy to prepare, cheap and highly effective against a wide variety of infectious agents.”

On the Net:

Synthetic, Dissolving Plates Ease Repairs Of Nasal Septum Defects

Attaching cartilage to plates made of the resorbable material polydioxanone appears to facilitate corrective surgery on the nasal septum, the thin cartilage separating the two airways, according to a report in the January/February issue of Archives of Facial Plastic Surgery, one of the JAMA/Archives journals.

Conventional septoplasty, or surgery to straighten a deviated septum, is not always possible because the surgical manipulations involved weaken the septal cartilage, according to background information in the article. “For decades, the inner nose remained untouched in nasal septal surgery because of these problems, and plastic surgeons attempted to correct only the nasal septum regions visible from the outside,” the authors write. “Only in the past 40 years have surgical innovations allowed correction of cosmetic and functional deformities in a single session.”

However, these procedures are sometimes ineffective in complex cases; it is difficult to strengthen the cartilage enough to support the structure of the nose without compromising cosmetic or functional concerns. To resolve these issues, Miriam Boenisch, M.D., Ph.D., then of District Hospital Steyr, Steyr, Austria, and now of Medicent Linz, Linz, Austria, and Gilbert J. Nolst Trenité, M.D., Ph.D., of the Academic Medical Center of the University of Amsterdam, report on the use of resorbable polydioxanone plates during septoplasty. Cartilage is removed from the septum and sutured to a polydioxanone plate cut to the same size. This combined graft is then re-implanted.

The procedure was performed on 396 patients beginning in 1996. Results were evaluated by patient report, photographs and measurements of nasal function at follow-up examinations periodically after surgery (patients were followed for an average of 12 months and a maximum of 10 years). No immediate complications such as bleeding, inflammatory reactions or tissue death occurred, nor were there long-term complications such as perforation or thickening of the septum or rejection of the implant.

A straight nasal septum was achieved in 369 patients (93.2 percent), and the same number reported improvement of the nasal airway. Two months after surgery, nasal function testing showed improved nasal flow in 324 patients (81.8 percent).

Eighteen patients (4.5 percent) required revision surgery to correct redeviation or other slight deformities. In eight patients (2 percent), the septum deviated again but either caused no functional problems or the patients declined revision surgery.

“Surgical correction of a deviated nasal septum is one of the most frequently performed surgical procedures,” the authors write. Septoplasty is performed in an average of 1.2 per 1,000 North American and European individuals; about 90 percent of these procedures are routine, while the other 10 percent require complex correction. “The use of resorbable polydioxanone plate facilitates this surgical technique.”

“To date, we have encountered no short- or long-term complications as a consequence of the use of polydioxanone plate,” they conclude. “The use of polydioxanone plate during septal surgery facilitates external septoplasty, corrects several combined nasal deformities such as post-traumatic and iatrogenic [medically induced] irregularities and avoids postoperative saddle nose deformity, without risk to the patient.”

On the Net:

Appendicitis Linked To Viral Infections

Can you catch appendicitis? And if you do, is it necessarily an emergency that demands immediate surgery?

Yes and no, according to a new study by UT Southwestern Medical Center surgeons and physicians.

The researchers evaluated data over a 36-year period from the National Hospital Discharge Survey and concluded in a paper appearing in the January issue of Archives of Surgery that appendicitis may be caused by undetermined viral infection or infections, said Dr. Edward Livingston, chief of GI/endocrine surgery at UT Southwestern and senior author of the report.

The review of hospital discharge data runs counter to traditional thought, suggesting that appendicitis doesn’t necessarily lead to a burst appendix if the organ is not removed quickly, Dr. Livingston said.

“Just as the traditional appendix scar across the abdomen is fast becoming history, thanks to new single-incision surgery techniques that hide a tiny scar in the bellybutton, so too may the conventional wisdom that patients with appendicitis need to be operated on as soon as they enter the hospital,” said Dr. Livingston. “Patients still need to be seen quickly by a physician, but emergency surgery is now in question.”

Appendicitis is the most common reason for emergency general surgery, leading to some 280,000 appendectomies being performed annually.

Appendicitis was first identified in 1886. Since then, doctors have presumed quick removal of the appendix was a necessity to avoid a subsequent bursting, which can be an emergency. Because removing the appendix solves the problem and is generally safe, removal became the standard medical practice in the early 20th century.

But this latest research studying appendicitis trends from 1970 to 2006 suggests immediate removal may not be necessary. Evidence from sailors at sea without access to immediate surgery and from some children’s hospitals, whose practice did not call for emergency surgery, hinted that non-perforated appendicitis may resolve without surgery, said Dr. Livingston.

In undertaking the study, the researchers screened the diagnosis codes for admissions for appendicitis, influenza, rotavirus and enteric infections. They found that seasonal variations and clustering of appendicitis cases support the theory that appendicitis may be a viral disease, like the flu, Dr. Livingston said.

Statistical data revealed peaks, which may be outbreaks of appendicitis, in the years 1977, 1981, 1984, 1987, 1994 and 1998. In addition, researchers uncovered some seasonal trends for appendicitis, documenting a slight increase in appendicitis cases during the summer.

“The peaks and valleys of appendicitis cases generally matched up over time, suggesting it is possible that these disorders share common etiologic determinants, pathogenetic mechanisms or environmental factors that similarly affect their incidence,” Dr. Livingston said.

Researchers have been able to rule out flu and several other common infections as a direct cause. They also were able to rule out several types of intestinal viruses.

Appendicitis afflicts about one in 10 people during their lifetime. The condition occurs when the appendix becomes obstructed, but doctors are unsure why. Dr. Livingston and other UT Southwestern researchers in 1995 identified an unexpected rise in appendicitis cases, reversing a downward trend throughout the previous 25 years.

“Though appendicitis is fairly common, it still remains a frustrating medical mystery,” Dr. Livingston said. “While we know surgical removal is an effective treatment, we still don’t know the purpose of the appendix, nor what causes it to become obstructed.”

Other UT Southwestern researchers involved in the Archives of Surgery paper were Dr. Robert W. Haley, chief of epidemiology, and Dr. Adam Alder, a resident and lead author. The team also collaborated with economists at Southern Methodist University on novel statistical methodologies to uncover the associations.

On the Net:

Obesity Screening, Treatment Should Start Early

According to one influential advisory panel, school-aged children should be screened for obesity and be sent to intensive behavioral programs if they need to lose weight.

The potential plan could transform how doctors deal with overweight kids.

The panel of doctors issued proposed new guidelines on Monday stating that treating obese children can help them lose weight, but only if the treatment involves rigorous diet, activity and behavioral counseling.

The U.S. Preventive Services Task Force, which makes medical-care recommendations based on research, concluded after reviewing more than a dozen studies that obese children who participated in moderate- to high-intensity weight-management programs for 25 or more hours over a six-month period often achieved measurable weight loss.

Currently there are not enough programs for parents to enroll their children in, and those that are available are not covered under most health insurance plans. However, now that there is evidence these programs can be effective, things may change, said Ned Calonge, chairman of the task force and chief medical officer of the Colorado Department of Public Health and Environment.

Nearly 32% of children and teens are obese or overweight, according to the latest statistics. Almost 20% of kids between ages 6 and 11 are obese, as are 18% of those ages 12 to 19. Obesity puts children at a much higher risk for health problems such as high cholesterol, high blood pressure and diabetes.

Kids are considered overweight if they fall between the 85th and 94th percentiles on the body-mass index growth charts, which are based on weight and height and adjusted for age and sex. Those at or above the 95th percentile are considered obese.
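To make those cut-offs concrete, here is a minimal sketch in Python; it assumes a BMI-for-age percentile has already been looked up on the growth charts, which is the non-trivial step in practice:

def classify_weight_status(bmi_percentile: float) -> str:
    """Classify a child's weight status from a BMI-for-age percentile.

    Assumes the percentile (0-100) was already computed from sex- and
    age-specific growth charts; this sketch only encodes the cut-offs
    quoted in the article.
    """
    if bmi_percentile >= 95:
        return "obese"
    elif bmi_percentile >= 85:
        return "overweight"
    return "not overweight"

# Example: a child at the 92nd percentile is classified as overweight.
print(classify_weight_status(92))  # -> "overweight"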

Although current studies show that children benefit from weight-management programs more clearly than the studies available five years ago did, pediatricians are not equipped to offer that kind of treatment, and the programs that do exist are hard to find and may be very expensive, according to Calonge.

The recommendations merely highlight evidence showing what types of treatments work, rather than “whether or not those services are currently available,” he added.

The advice from the U.S. Preventive Services Task Force coincides with that of the American Academy of Pediatrics. Many pediatricians have the tools to measure body-mass index and usually perform yearly checkups on their patients already. But more advanced treatments are currently not available in most communities.

After studying more than 20 cases since 2005, the task force has made many recommendations on how to treat obesity. Treatments not endorsed by the panel include two diet drugs, Xenical and Meridia, recently approved for use in older children; the panel cited potential side effects, including elevated heart rate, and the lack of evidence that the drugs produce long-term weight loss. Another treatment option, obesity surgery, is recommended only in severe cases.

The most effective treatment usually involves counseling parents and their kids, group therapies, and other programs that are not covered by some insurance companies. Most of these treatments can be too costly for parents, one of the main reasons the programs are so scarce. Cost is not the only barrier, though: many parents and their children aren’t willing to make the lifestyle changes needed for treatment to be effective.

Keith Ayoob, associate professor of pediatrics at the Albert Einstein College of Medicine in New York, says parents have to be involved with their children in any weight-management program. “Part of the problem is that where there are obese children, there are often obese parents.” To make lasting changes for the benefit of their children, “parents often have to take a hard look at their own eating styles and how they may have morphed into less-than-healthy role models.”

The recommendations made by the U.S. Preventive Services Task Force are geared toward children ages 6 to 18. Evidence on the effectiveness of treatments and programs for children under the age of 6 is lacking, according to Calonge.

On the Net:

Robot Uses ‘Chaos Control’

Göttingen scientists develop an autonomous walking robot that flexibly switches between many different gaits by using “chaos control”

Even simple insects can generate quite different movement patterns with their six legs. The animal uses various gaits depending on whether it crawls uphill or downhill, slowly or fast. Scientists from Göttingen have now developed a walking robot, which – depending on the situation – can flexibly and autonomously switch between different gaits. The success of their solution lies in its simplicity: a small and simple network with just a few connections can create very diverse movement patterns. To this end, the robot uses a mechanism for “chaos control”. This interdisciplinary work was carried out by a team of scientists at the Bernstein Center for Computational Neuroscience Göttingen, the Physics Department of the Georg-August-University of Göttingen and the Max Planck Institute for Dynamics and Self-Organization. (Nature Physics, January 17th, 2010, advanced online publication)

In humans and animals, periodically recurring movements like walking or breathing are controlled by small neural circuits called “central pattern generators” (CPG). Scientists have been using this principle in the development of walking machines. To date, a separate CPG has typically been needed for each gait. The robot receives information about its environment via several sensors – about whether there is an obstacle in front of it or whether it is climbing a slope. Based on this information, it selects the CPG controlling the gait that is appropriate for the respective situation.

One single pattern generator for many gaits

The robot developed by the Göttingen scientists now manages the same task with only one CPG that generates entirely different gaits and can switch between them in a flexible manner. This CPG is a tiny network consisting of just two circuit elements. The secret of its functioning lies in so-called “chaos control”: left uncontrolled, the CPG produces a chaotic activity pattern, but this activity can easily be steered by the sensor inputs into periodic patterns that determine the gait. Depending on the sensory input signal, different patterns – and thus different gaits – are generated.
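The two-neuron circuit in the paper is more elaborate, but the core idea (a chaotic system nudged onto a periodic orbit by a small, sensor-gated feedback) can be illustrated with a toy model in Python. The sketch below is a hypothetical illustration, not the Göttingen controller: it applies a small corrective kick to the chaotic logistic map only when the state wanders near an unstable periodic point, so the output is chaotic while the “sensor” is off and locks into a steady rhythm once it switches on.

# Toy illustration of chaos control, not the paper's two-neuron CPG:
# the logistic map x -> 4x(1-x) is chaotic, but a small feedback kick,
# applied only near the map's unstable fixed point x* = 0.75 and only
# when a "sensor" signal enables it, locks the output onto that point.

X_STAR = 0.75  # unstable periodic point of the uncontrolled map

def step(x, control_on):
    u = 0.0
    if control_on and abs(x - X_STAR) < 0.1:
        u = 2.0 * (x - X_STAR)  # gain chosen to cancel the local instability
    return 4.0 * x * (1.0 - x) + u

x, trace = 0.4, []
for n in range(200):
    x = step(x, control_on=(n >= 100))  # the "sensor" switches on halfway
    trace.append(x)

print("uncontrolled:", [round(v, 3) for v in trace[95:100]])   # wanders chaotically
print("controlled:  ", [round(v, 3) for v in trace[195:200]])  # settles at 0.75

Gating the kick or changing its target with different sensor values would, in the same spirit, select different periodic patterns, which is roughly how a single circuit can serve many gaits.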

The connection between sensory properties and CPG can either be preprogrammed or learned by the robot from experience. The scientists use a key example to show how this works: the robot can autonomously learn to walk up a slope with as little energy input as possible. As soon as the robot reaches a slope, a sensor shows that the energy consumption is too high. Thereupon, the connection between the sensor and the control input of the CPG is varied until a gait is found that allows the robot to consume less energy. Once the right connections have been established, the robot has learned the relation between slope and gait. When it tries to climb the hill a second time, it will immediately adopt the appropriate gait.
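The slope-learning procedure just described is essentially trial and error with a memory. As a rough sketch of that loop (with entirely invented gait names and energy costs; the real robot varies a sensor-to-CPG connection rather than picking from a list), one might write:

import random

# Hypothetical energy cost per step of each gait on each terrain.
# These numbers are invented purely for illustration.
ENERGY = {
    ("flat", "fast wave"): 1.0, ("flat", "slow wave"): 1.4, ("flat", "tripod"): 1.2,
    ("slope", "fast wave"): 3.0, ("slope", "slow wave"): 1.6, ("slope", "tripod"): 2.2,
}
GAITS = ["fast wave", "slow wave", "tripod"]

def learn_gait(terrain, budget, trials=20):
    """Vary the gait at random until the energy sensor is satisfied."""
    gait = random.choice(GAITS)
    for _ in range(trials):
        if ENERGY[(terrain, gait)] <= budget:  # energy consumption acceptable
            return gait
        gait = random.choice(GAITS)            # try another control input
    return gait  # best effort if no gait met the budget within the trials

memory = {}  # learned terrain -> gait associations
for terrain, budget in [("flat", 1.1), ("slope", 1.8)]:
    memory[terrain] = learn_gait(terrain, budget)

# On a second encounter with the slope, the stored gait is reused at once.
print(memory)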

In the future, the robot will also be equipped with a memory device which will enable it to complete movements even after the sensory input ceases. In order to walk over an obstacle, for instance, the robot has to take a large step with each of its six legs. “Currently, the robot would not be able to handle this task – as soon as the obstacle is out of sight, it no longer knows which gait to use,” says Marc Timme, scientist at the Max Planck Institute for Dynamics and Self-Organization. “Once the robot is equipped with a motor memory, it will be capable of using foresight and planning its movements in advance.”

Reference: Silke Steingrube, Marc Timme, Florentin Wörgötter and Poramate Manoonpong. Self-organized adaptation of a simple neural circuit enables complex robot behavior. Nature Physics, January 17th, 2010 (DOI: 10.1038/NPHYS1508)

Image Caption: Following the principle of chaos control, the robot produces regular leg movements when walking normally. In addition, it can use the uncontrolled chaotic movement pattern to free itself when its leg is trapped in a hole. Image: Network Dynamics Group, Max Planck Institute for Dynamics and Self-Organization

On the Net:

Eye Exam Could Spot Alzheimer’s Early

British scientists are currently developing a test that can detect Alzheimer’s up to 20 years before any symptoms show, the Daily Mail UK reported.

Experts say that in as little as three years, the simple and inexpensive eye test could be part of routine examinations by opticians, allowing those in middle age to be screened.

The procedure has the power to revolutionize the treatment of Alzheimer’s by making it possible for drugs to be given in the earliest stages, dementia experts said.

The University College London researchers that are developing the technique say it could also speed up the development of medication capable of stopping the disease in its tracks, preventing people from ever showing symptoms.

“These findings have the potential to transform the way we diagnose Alzheimer’s, greatly enhancing efforts to develop new treatments,” said Rebecca Wood of the Alzheimer’s Trust.

There is no cure for Alzheimer’s, and existing drugs do not work for everyone. Current diagnosis is based on memory tests, and expensive brain scans are also sometimes used.

Decisive proof of the disease usually comes from an examination of the patient’s brain after death.

However, the eye test would provide a quick, easy, cheap and highly accurate diagnosis.

The procedure exploits the fact that the light-sensitive cells in the retina at the back of the eye are a direct extension of the brain. UCL researchers showed for the first time in a living eye that the amount of damage to cells in the retina directly corresponds with brain cell death.

The researchers have also revealed the pattern of retinal cell death characteristic of Alzheimer’s, and so far the diagnosis has been correct every time.

Studies in the past have shown that cells start to die ten to 20 years before the symptoms of Alzheimer’s become evident, which could allow people to be screened in middle age for signs of the disease.

But many people may not want to know their fate so far in advance and there is also the fear that insurance companies could increase premiums for those who test positive while still young.

So far, the experiments have only been performed on lab animals, but the team is ready to begin the first human trials.

“The equipment used for this research is essentially the same as is used in clinics and hospitals worldwide. It is also inexpensive and non-invasive, which makes us fairly confident that we can progress quickly to its use in patients,” said researcher Francesca Cordeiro.

The study, reported in the journal Cell Death & Disease, contends that it is entirely possible that in the future a visit to an optician to check on eyesight will also be a check on the state of the brain.

Cordeiro believes the technique could also improve the diagnosis of other conditions, including glaucoma and Parkinson’s disease.

An early diagnosis would give patients and their families much more time to prepare for the future, while also allowing new drugs that stop the disease in its tracks to reach their full potential.

“If you give the treatment early enough, you can stop the disease progressing, full stop,” Cordeiro said.

“This research is very exciting. If we can delay the onset of dementia by five years, we can halve the number of people who will die from the disease,” said Dr. Susanne Sorensen of the Alzheimer’s Society.

However, she cautioned that the test is still experimental, though it holds promise.

On the Net:

Revealing Earth’s Ultraviolet Fingerprint

On November 13, the European Space Agency’s comet orbiter spacecraft, Rosetta, swooped by Earth for its third and final gravity assist on the way to humankind’s first rendezvous with a comet, which it will orbit and study in more detail than has ever been attempted.

One of the instruments aboard Rosetta is the NASA-funded ultraviolet spectrometer, Alice, which is designed to probe the composition of the comet’s atmosphere and surface — the first ultraviolet spectrometer ever to study a comet up close. During Rosetta’s recent Earth flyby, researchers successfully tested Alice’s performance by viewing the Earth’s ultraviolet appearance.

“It’s been over five years since Rosetta was launched on its 10-year journey to comet Churyumov-Gerasimenko, and Alice is working well,” says instrument Principal Investigator Dr. Alan Stern, associate vice president of the Space Science and Engineering Division at Southwest Research Institute. “As one can see from the spectra we obtained during this flyby of the Earth, the instrument is in focus and shows the main ultraviolet spectral emission of our home planet. These data give a nice indication of the scientifically rich value of ultraviolet spectroscopy for studying the atmospheres of objects in space, and we’re looking forward to reaching the comet and exploring its mysteries.”

Dr. Paul Feldman, professor of Physics and Astronomy at the Johns Hopkins University, and an Alice co-investigator, has studied the Earth’s upper atmosphere from the early days of space studies. “Although the Earth’s ultraviolet emission spectrum was one of the first discoveries of the space age and has been studied by many orbiting spacecraft, the Rosetta flyby provides a unique view from which to test current models of the Sun’s interaction with our atmosphere.”

SwRI also developed and will operate the NASA-funded Ion and Electron Sensor aboard Rosetta. IES will simultaneously measure the flux of electrons and ions surrounding the comet over an energy range extending from the lower limits of detectability near 1 electron volt, up to 22,000 electron volts.

Thanks to the Earth gravity-assist swing-by in November, Rosetta is now on course to meet its cometary target in mid-2014. Before Rosetta reaches its main target, it will explore a large asteroid called Lutetia in July 2010. The Alice UV spectrometer will be one of the instruments mapping this ancient asteroid.

NASA’s Jet Propulsion Laboratory, Pasadena, Calif., manages the U.S. Rosetta project for NASA’s Science Mission Directorate.

Image Caption: During the spacecraft’s approach, the Earth appeared as a crescent. The drawing (generated by the SwRI-developed Geometry Visualization tool) shows the appearance of the Earth as seen from the spacecraft. The red outline shows the orientation of the long slit of the Alice spectrograph. The image of the Earth was taken around the same time by the OSIRIS camera on Rosetta. The plot shows one of the spectra the Alice instrument obtained during this approach to Earth. Some of the emission lines are identified. Image 1 Credit: ESA ©2009 MPS for OSIRIS Team MPS/UPD/LAM/IAA/RSSD/INTA/UPM/DASP/IDA. Image 2 Credit: SwRI. Image 3 Credit: NASA/JPL/SwRI

On the Net:

NSF: Ecology Of Infectious Diseases

Infectious Diseases Spreading

West Nile virus.  Hantavirus.  Lyme disease.  All are infectious diseases spreading in animals, and in humans.  Is our interaction with the environment somehow responsible for the increase in incidence of these diseases?

A joint National Science Foundation (NSF) and National Institutes of Health program — ecology of infectious diseases (EID) — supports efforts to understand the underlying ecological and biological mechanisms behind human-induced environmental changes and the emergence and transmission of infectious diseases.  Projects funded through the EID program and other NSF programs allow scientists to study how large-scale environmental events, such as habitat destruction, invasions of non-native species and pollution, alter the risks of emergence of viral, parasitic and bacterial diseases in humans and animals.

Researchers supported in the EID program are advancing basic theory related to infectious diseases and applying that knowledge to improve our understanding of how pathogens spread through populations at a time of increasing global change.

The benefits of research on the ecology of infectious diseases include the development of theories of how diseases are transmitted; improved understanding of unintended health effects of development projects; increased capacity to forecast disease outbreaks; and knowledge of how infectious diseases emerge and reemerge.

“Virtually all the world’s terrestrial and aquatic communities have undergone dramatic changes in biodiversity due primarily to habitat transformations such as deforestation and agricultural intensification, invasions of exotic species, chemical contamination and climate-change events,” said Sam Scheiner, ecology of infectious diseases (EID) program director at NSF. “The coincidence of broad-scale environmental changes with the emergence of infectious diseases may point to underlying and predictable ecological relationships.”

Examples of studies funded by the EID program include research on the origin and spread of the aspergillus-gorgonian coral disease and how climate and environment may have worked as facilitators of the disease; effects of human-induced change on the ecology of human pathogens in North Carolina’s Neuse River estuary, which is polluted by excess nutrients from human activities; the microbial community ecology of tick-borne human pathogens; plague as a model for disease dynamics; ecological reasons for rodent-borne disease outbreaks; and how social organization influences an infectious disease outbreak.

Further information about EID program support is available in the latest program solicitation.

Medical Mystery Solved

When young, otherwise healthy people in the remote Four Corners area of Arizona and New Mexico began dying of a mysterious, acute respiratory disease in the spring of 1993, scientists wondered at the cause.

Tests of the victims’ blood yielded surprising results: the people had become infected with a previously undetected kind of hantavirus.  Named for the Hantaan River in Korea, hantaviruses were known to spread from rodents to humans in Asia and Europe, but until the Four Corners outbreak, the microbes had only been seen outside of the United States.

For answers as to how the virus spread in the Four Corners, the U.S. Centers for Disease Control turned to scientists Robert Parmenter of the Sevilleta (SEV) Long-Term Ecological Research (LTER) site in New Mexico, and Terry Yates of the University of New Mexico.  Their research at the LTER site revealed that the hantavirus outbreak could be blamed on El Niño, a periodic pattern of change in the global circulation of oceans and atmosphere.  Massive rains associated with the 1991-1992 El Niño had substantially boosted plant productivity after several years of drought. A banner year for plants was followed by a banner year for rodents. More mice meant that more humans stood a greater chance of exposure to infected rodents as people moved among barns and did spring cleaning of cabins and trailers.

The deadly hantavirus wasn’t new to New Mexico.  The virus had been in the rodents all along.  It was the change in climate conditions that triggered the fatal outbreak in humans.  Such knowledge likely saved lives in 1998, when another active El Niño prompted health authorities to warn residents in the American Southwest to use caution when entering areas favored by mice.

Further information about the SEV LTER is available at http://www.lternet.edu/sites/sev/

General information about the LTER Network, including links to each of the LTER sites, is available at http://www.lternet.edu

Frogs Vs. Trout

Ecology of infectious diseases data gathered over seven years has played a key role in convincing the National Park Service and the California Department of Fish and Game to remove trout from high-altitude lakes in California’s Sierra Nevada.  The trout are causing the disappearance of the mountain yellow-legged frog.

Funded through the EID program, biologist Vance Vredenburg of the University of California at Berkeley showed that introduced trout have devastated native frog populations over the past 50 years in formerly fish-free, high-Sierra lakes, but that removing the fish can allow the frogs to flourish once more.

“The mountain yellow-legged frog used to be the most common inhabitant of the high Sierra, but frog populations have declined dramatically enough to put it on the endangered species list,” said Vredenburg.

“The worldwide decline in frog and salamander populations is a harbinger of more serious threats posed by the current rapid environmental changes our planet is undergoing,” said Sam Scheiner, EID program director at NSF. “Possible culprits include the spread of disease, increased UV radiation and predation by introduced species.  This study helps to tease apart those complex causes and shows that, in these frogs, the decline is due to increased predation. For these populations, removing the trout will save the frogs. Such studies provide hope that we can reverse the large environmental changes we’re causing.”

As part of the research, Vredenburg removed trout from five lakes and documented a rebound in the frog population in all of them. Three years after trout removal, the frog populations in all five lakes were indistinguishable from populations at lakes that had never seen a trout.

“The response was incredibly dramatic and rapid,” Vredenburg said. “Every time you plant hundreds of thousands of fish, you’re hammering a nail in the frogs’ coffins.”

Vredenburg has also teamed up with other researchers to determine the effect of a chytrid fungus, Batrachochytrium dendrobatidis, on the mountain yellow-legged frog. The fungus, which is threatening frog populations around the world, attacks tadpoles as well as adults, and can kill adult frogs. It was discovered in the Sierra Nevada in 2001.

Loss of wetland habitat has also reduced populations of frogs and toads and endangered several species of amphibians with restricted ranges. Alarming new events have added to these trends. For example, frog and toad populations have declined dramatically in the past several years, many in high-altitude places in the United States, Puerto Rico, Costa Rica, Panama, Colombia and Australia. Studies suggest that these population declines may be caused by infections, perhaps promoted by environmental stressors.

For more information on studies of the chytrid fungus, please see Outbreak: Rapid Appearance of Fungus Devastates Frogs, Salamanders in Panama.

Deer Susceptible to Disease

Researchers funded through the EID program recently found that chronic wasting disease (CWD) can be transmitted through environments contaminated by whole carcasses or by excrement of animals infected with the pathogen that causes CWD.  CWD is rampant in Western states like Colorado.

“Diseases like CWD are poorly understood and of rising concern,” said Sam Scheiner, EID program director at NSF. “This new knowledge will substantially alter how we manage the disease in wild and domestic animals.”

CWD is a fatal neurological ailment of elk, white-tailed deer and mule deer.  Researchers believe the disease is caused by an aberrant prion protein that misfolds in the brain, destroying brain tissue as it progresses.  The disease is always fatal and there is no known cure or treatment.

Although live deer and elk still seem the most likely way for CWD to spread geographically, environmental sources could contribute to maintaining and prolonging local epidemics, even when all infected animals are eliminated, said biologist Tom Hobbs of Colorado State University. “Through the EID program, we hope to develop models that will predict the behavior of the disease, shedding light on how potentially complex these epidemics may be in natural populations.”

Further information about CWD is available on the Chronic Wasting Disease Alliance Web site.

Lyme Disease on the Rise

Lyme disease incidence is rising in the United States and is in fact far more common than West Nile virus and other insect-borne diseases.  Forest fragmentation could explain the increase.

Areas of patchy woods, which are very common in cities and in suburban and rural areas, may have higher populations of Lyme-disease-carrying ticks than continuous forest. Forest fragments generally support fewer species than continuous habitat, but some of the species that remain thrive in smaller places.

White-footed mice, for example, are more abundant in forest fragments in some parts of the country, likely because fewer predators and competitors remain there. These mice are particularly abundant in patches smaller than about five acres, which could spell trouble for people living nearby: the mice are the main carriers of Lyme disease-causing bacteria. In the Eastern and Central United States, Lyme disease is contracted via blacklegged ticks that feed on infected mice, and then transmit the bacteria when the ticks bite people. As a result, says biologist Felicia Keesing of Bard College in Annandale, New York, Lyme disease is concentrated in areas where people live near forests with blacklegged ticks.

Keesing and other scientists found that smaller forest fragments had more infected ticks, which could translate to more Lyme disease. Forest patches that were smaller than three acres had an average of three times as many ticks as did larger fragments, and seven times more infected ticks. As many as 80 percent of the ticks in the smallest patches were infected, the highest rate the scientists have seen.

“Our results suggest that efforts to reduce the risk of Lyme disease should be directed toward decreasing fragmentation of deciduous forests of the Northeastern United States, particularly in areas with a high incidence of Lyme disease,” said Keesing. “The creation of forest fragments smaller than five acres should especially be avoided.”

For more information, visit the Centers for Disease Control and Prevention’s Web page on Lyme disease: http://www.cdc.gov/ncidod/dvbid/lyme/

By Cheryl Dybas

Image 1:  Disease transmission is a complex process that involves the disease organism, disease vectors, disease hosts, and the predators of those hosts. It links relatively pristine areas with human habitations and human-dominated areas. Credit: Nicolle Rager, National Science Foundation

Image 2: Massive rainfall associated with El Niño boosts plant productivity. Feasting on the more abundant plant matter, the rodent population grows. Increased contact with rodents and their waste puts more humans at risk for exposure to hantavirus. Credit: Zina Deretsky, National Science Foundation

Image 3:  Mating season (June 2004) in Sixty Lake Basin. The large lake in the foreground is a frog population source and the three lakes in the background are trout removal lakes now colonized by large frog populations. Credit: Rob Bingham, University of California, Berkeley

On the Net:

First Satellite Map Of Devastated Haiti

A major magnitude-7.0 earthquake struck the Haitian capital of Port-au-Prince on January 12, causing heavy casualties and damage. The quake was followed by several aftershocks with magnitudes over 5.0.

Such a powerful earthquake can make current maps suddenly out of date, causing additional challenges to rescue workers on the ground. Earth observation satellite images can help rescue efforts by providing updated views of how the landscape and the infrastructure have been affected.

Following the event, the French Civil Protection authorities, the Public Safety of Canada, the American Earthquake Hazards Program of USGS and the UN Stabilization Mission in Haiti requested satellite data of the area from the International Charter on ‘Space and Major Disasters’. The initiative, referred to as ‘The Charter’, is aimed at providing satellite data free of charge to those affected by disasters anywhere in the world.

To meet the requirements of the rescue teams in Haiti, Very High Resolution imagery is needed from both optical and radar sensors. Through the Charter, the international space community is acquiring satellite imagery as quickly as possible. Currently, data are being collected by various satellites including Japan’s ALOS, CNES’s Spot-5, the U.S.’s WorldView and QuickBird, Canada’s RADARSAT-2 and ESA’s ERS-2 and Envisat.

Satellite imagery acquired immediately after the event is used to generate emergency maps that provide rescue services with an overview of the current state of the area. These can be compared with situation maps generated from archived satellite data to identify major changes on the ground caused by the disaster.

Comparing maps from before and after the event allows the hardest-hit areas to be distinguished and passable routes for relief and rescue workers to be identified. Additionally, the maps can help to identify areas suitable for setting up aid camps where medical support and shelter can be provided to people.
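Under the hood, damage mapping of this kind boils down to comparing co-registered before-and-after images pixel by pixel. The following simplified sketch assumes the two scenes are already geometrically aligned and radiometrically comparable (which is most of the real work in practice); the arrays here are random stand-ins for actual scenes:

import numpy as np

# before/after: co-registered grayscale scenes with values in [0, 1].
# Random stand-ins here; real pipelines also correct for illumination,
# sensor differences, cloud cover and registration error.
rng = np.random.default_rng(0)
before = rng.random((100, 100))
after = before.copy()
after[40:60, 40:60] = rng.random((20, 20))  # simulate a damaged block

diff = np.abs(after - before)
changed = diff > 0.3  # threshold tuned per sensor and scene
print("flagged pixels:", int(changed.sum()))

# A rescue-oriented product would then overlay `changed` on road and
# building layers to highlight blocked routes and damaged areas.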

Radar satellites are able to peer through clouds, which is an asset when weather conditions prevent the use of optical satellite instruments. Radar imagery can be used to identify hazards such as landslides that may be triggered by earthquakes. In the long term, radar data can also be processed to map surface deformations caused by earthquakes, helping scientists better understand seismic events.

The Global Monitoring for Environment and Security’s SAFER project is collaborating with the Charter to provide a specialized capacity to produce damage maps over the area. SAFER’s value-adding providers SERTIT from Strasbourg and the German Aerospace Centre’s (DLR) center for satellite-based crisis information (ZKI) from Munich are currently working on this.

In the framework of SAFER, other user organizations, including the German Federal Office of Civil Protection and Disaster Assistance and the UN World Food Program, have requested damage-mapping services. Based on the collaboration between the Charter and SAFER, the first space-maps derived from crisis data acquired on 13 January were produced by SERTIT within 24 hours as rapid situation maps to help locate damaged areas with up-to-date cartographic material.

Founded in 2000, the Charter currently has 10 members. Together with ESA and CNES, these are the Canadian Space Agency (CSA), the Indian Space Research Organization (ISRO), the US National Oceanic and Atmospheric Administration (NOAA), the Argentine Space Agency (CONAE), the Japan Aerospace Exploration Agency (JAXA), the British National Space Centre/Disaster Monitoring Constellation (BNSC/DMC), the U.S. Geological Survey (USGS) and the China National Space Administration (CNSA).

Via the Charter mechanism, all of these agencies have committed to provide free and unrestricted access to their space assets to support relief efforts in the immediate aftermath of a major disaster.

The Charter also collaborates with other satellite damage-mapping initiatives within the UN, such as the UNITAR/UNOSAT team, which is receiving support from the U.S. government to analyze satellite imagery to be provided to the Haitian government, UN sister agencies and NGOs.

Image Caption: Satellite map over the Port-au-Prince area of Haiti acquired on 13 January 2010, following a 7.0 magnitude earthquake and several aftershocks that hit the Caribbean nation on 12 January. Processed by SERTIT. Credits: SERTIT – CNES – International Charter

On the Net:

Sudden Death In Cocaine Abusers

Forensic pathologists have shown that over three per cent of all sudden deaths in south-west Spain are related to the use of cocaine. They believe their findings can be extrapolated to much of the rest of Europe, indicating that cocaine use is a growing public health problem in Europe and that there is no such thing as “safe” recreational use of small amounts of the drug.

The study, published today (Wednesday 13 January) in Europe’s leading cardiology journal, the European Heart Journal, carefully investigated all the circumstances surrounding a consecutive series of sudden deaths between 2003 and 2006. During post-mortems the pathologists tested blood and urine for traces of toxic substances and studied the organs, focusing on the cardiovascular system; they also gathered information on substance abuse prior to death, the circumstances of the death and death scene investigations.

Out of 668 sudden deaths during the study period, 21 (3.1%) were related to cocaine use; of these, all occurred in men aged between 21 and 45, and most of the cocaine-related deaths were due to problems with the heart and its related systems.

Dr Joaquín Lucena, MD PhD, Head of the Forensic Pathology Service at the Institute of Legal Medicine (Seville, Spain) who led the study, said: “Our findings show that cocaine use causes adverse changes to the heart and arteries that then lead to sudden death.”

Dr Lucena and his colleagues found that median levels of cocaine in blood or urine were 0.1 and 1.15 mg/L respectively, with a range that varied widely but which depended on a number of factors related to the drug itself (how it was taken, how people’s bodies processed it and what other substances were taken at the same time), and to the people themselves (body mass index, acute or chronic use of the drug, other underlying health issues, age and sex). They wrote: “Any amount of the drug can be considered to have the potential for toxicity due to the fact that some patients have poor outcomes with relatively low blood concentrations, whereas others tolerate large quantities without consequences.”

The researchers also found that 81% of the men who died after cocaine use also smoked, and 76% had drunk alcohol. Ethanol, the intoxicating ingredient in alcoholic drinks, enhances the “high” obtained from cocaine while minimising the subsequent “low”. However, both smoking and alcohol are associated with heart disease and Dr Lucena said: “The combination of cocaine with either or both of these habits can be considered as a lethal cocktail that promotes the development of premature heart disease.”

The study is the first to investigate the prevalence of cocaine-related sudden deaths in such a detailed and methodical way. The authors highlight the importance of this method of studying sudden deaths.

“For the correct diagnosis of the sudden death, especially in young adults, it is important to use a uniform autopsy protocol, including a toxicology investigation of the blood and urine for illicit drugs,” said Dr Lucena. “Cocaine abuse is a growing public health issue in Europe and we can only monitor its prevalence by performing these detailed autopsies whenever someone dies suddenly.”

In their study, the authors wrote: “The estimated number of COC [cocaine] consumers is about 12 million Europeans with an overall prevalence of 3.7% of the total adult population (15-64 years). Ever in lifetime experience of COC is reported by more than 5% of the total adult European population in three countries: UK (7.7%), Spain (7.0%) and Italy (6.6%). The prevalence of use of COC is higher among young adults (15-34 years), with around 7.5 million young Europeans (5.4% on average) estimated as having used it at least once in their lifetime. In the year 2007, an estimated 3.5 million (2.4%) European young adults have used COC, with the highest prevalence levels, of over 3%, being found in Spain, Italy and the UK.”

Dr Lucena said: “As the estimated number of European young adults cocaine consumers is similar in Spain, UK and Italy, there is no reason to consider that the cocaine-related sudden death in UK and Italy would be different to what we have found in our research in south-west Spain.”

To put the rates of sudden deaths in context, he added: “According to our experience in the Forensic Pathology Service at the Institute of Legal Medicine, the rate of cocaine-related deaths per year in Seville, is roughly half the number of people who die suddenly from haemorrhagic stroke.” [3]

Professor David Hillis and Professor Richard Lange, chairman and executive vice chairman respectively of the Department of Medicine at the University of Texas Health Science Center (San Antonio, USA), who were unconnected with the work, wrote an editorial to accompany Dr Lucena’s paper. They reported that the prevalence of cocaine use varied in Europe from 0.7% in Romania and Lithuania to 12.7% in the UK, but this was likely to be an under-estimate.

They agreed that uniform protocols were required for post-mortems on victims of sudden death, including toxicological examination of the blood and urine for illicit drugs. “Until these are accomplished, the prevalence of cocaine and other illicit drug use will be underestimated, and cocaine-related complications will not be recognized,” they wrote. “Physicians should consider the possibility of cocaine abuse in a young individual with cardiovascular disease or sudden death, especially in those without traditional risk factors for atherosclerosis. Finally, the notion that recreational cocaine use is ‘safe’ should be dispelled, since even small amounts may have catastrophic consequences, including sudden death.”

Cheaper Version Of Tamiflu On The Horizon

Scientists have developed an alternative method for producing the active ingredient in Tamiflu®, the mainstay for fighting H1N1 and other forms of influenza. The new process could expand availability of the drug by reducing its cost; the drug currently retails for about $8 per dose. Their study is in ACS’ Organic Letters, a bi-weekly journal.

Anqi Chen, Christina Chai and colleagues note that the global H1N1 pandemic has caused millions of infections worldwide and nearly 10,000 deaths to date. Tamiflu®, also known as oseltamivir phosphate, remains the most widely used antiviral drug for the prevention and treatment of H1N1 infections as well as bird flu and seasonal influenzas. But growing demand for the drug has put pressure on the supply of shikimic acid, the raw material now used in making the drug. “As a result, chemists worldwide, including ourselves, have explored the possibility of using other alternative raw materials for the synthesis of the drug,” said Chen and Chai, who led the research.

The scientists describe a new process for making the drug that does not use shikimic acid. They found that D-ribose, a naturally occurring sugar produced at large scale by fermentation, potentially provides an inexpensive and abundant source of starting material. D-ribose costs only about one-sixth as much as shikimic acid. In lab studies, the scientists demonstrated the potential of D-ribose as an alternative starting point for the synthesis of Tamiflu®.

Image Caption: A new way of producing the active ingredient in Tamiflu, above, promises to reduce the cost of the widely used anti-flu medication. Credit: Vantey, Wikimedia Commons

No Safe Way To Use Cocaine

Researchers warned that there is no ‘safe’ amount of cocaine to use, after a study found that up to 3 percent of all sudden deaths are linked to the drug.

Even taking the smallest amount could lead to death from sudden heart problems.

Although the data comes from south-west Spain, experts say the findings could be extrapolated to Britain and other European countries, where cocaine has overtaken heroin as the most popular class A drug.

Out of 668 sudden deaths reported during the three-year study period, 21 (3.1 percent) were found to be related to cocaine use; all occurred in men aged between 21 and 45. In 17 of the cases, death was related to problems with the heart and its related systems.

“The notion that recreational cocaine use is ‘safe’ should be dispelled, since even small amounts may have catastrophic consequences, including sudden death,” wrote the authors of an editorial accompanying the study, which was led by Joaquin Lucena of the Institute of Legal Medicine in Seville.

The researchers found that 81 percent of the men who died after cocaine use also smoked, and 76 percent had drunk alcohol.

Ethanol, the intoxicating ingredient in alcoholic drinks, enhances the “high” obtained from cocaine while minimizing the subsequent “low”. However, both smoking and alcohol are associated with heart disease and Dr Lucena said: “The combination of cocaine with either or both of these habits can be considered as a lethal cocktail that promotes the development of premature heart disease.” 

Fotini Rozakeas, Senior Cardiac Nurse at the British Heart Foundation, said that the research should dispel the myth that cocaine is a “safe party drug.”

“The reality is that there are risks every time you use it. Cocaine can have devastating effects on the user, including heart attacks, life-threatening heart rhythms, strokes and even sudden death. The potentially deadly consequences of cocaine use can happen to anyone who takes it, even young, previously healthy people with no history of heart disease.”

Urine Clogs Space Station Water Recycling System

The International Space Station’s $250 million water recycling system has hit a snag: the astronauts’ urine is clogging the machinery that turns it into clean, drinkable water, according to NASA scientists.

The engineers trouble-shooting the problem think the clog is a result of high concentrations of calcium in the astronauts’ urine, Reuters reported.

Scientists are now trying to determine if the high concentration of calcium is because of bone-loss from living without gravity, or some other factor.

“We’ve learned a lot more about urine than we ever needed or wanted to know, some of us anyway,” said station flight director David Korth.

The $100 billion space station project backed by 16 nations has been a work in progress for over ten years.

The urine recycling system was implemented in November 2008 after being fully tested by NASA.

“Folks had good knowledge of the content of the urine going in, but the chemistry changes as it works through the processor are not always understood,” said program scientist Julie Robinson.

“There are a lot of parameters including urine calcium and pH (acidity) that everyone is looking at.”

The hope is that engineers at the Marshall Space Flight Center in Huntsville, Alabama, will be able to find a solution before replacement parts are shipped out on the shuttle Endeavour, set to launch February 7 on a construction mission.

Experts Divided On Implications Of Brutal Cold Spell

This year’s fierce winter in much of the Northern Hemisphere is only the beginning of a global trend towards cooler weather that is likely to last decades, say some of the world’s most renowned climate scientists.   However, other experts say the cold spell does not contradict an overall trend of global warming.

A report on Sunday by the British newspaper The Mail cited forecasts by eminent climate scientists that are a direct challenge to some of the most deeply held beliefs among those who say the world is experiencing global warming, including claims that the North Pole will be ice-free by the summer of 2013.

The climate scientists questioning such predictions of global warming based their predictions of a “mini ice-age” on analysis of natural water temperature cycles in the Pacific and Atlantic oceans.

Indeed, according to the U.S. National Snow and Ice Data Center in Colorado, summer Arctic sea ice has increased by 409,000 square miles, roughly 26 percent, since 2007, a figure that even the most ardent global warming believers do not dispute.

The scientists’ predictions also challenge standard climate computer models, which contend that the Earth’s warming since the year 1900 is due solely to man-made greenhouse gas emissions, and will continue until CO2 levels taper off.

But the climate scientists say their research shows instead that much of the warming during the last century was caused by ‘warm mode’ oceanic cycles, as opposed to the present ‘cold mode’.

This challenge to the theory of man-made global warming carries weight, given that it comes from prominent climate scientists who cannot be dismissed simply as global warming deniers.

Both of Britain’s major political parties maintain that the world is facing imminent disaster without dramatic CO2 reductions. And many say the science of global warming is ‘settled’.

Professor Mojib Latif, who leads a research team at the Leibniz Institute at Germany’s Kiel University, is a leading member of the UN’s Intergovernmental Panel on Climate Change (IPCC). Since its inception 22 years ago, the IPCC has been working to get the issue of man-made global warming on to the international political agenda.

Professor Latif has developed new techniques for measuring ocean temperatures far beneath the surface, where cooling and warming cycles begin.   In a paper published last year, he and his colleagues predicted the new cooling trend, and even warned of it again at a conference last September.

“A significant share of the warming we saw from 1980 to 2000 and at earlier periods in the 20th Century was due to these cycles, perhaps as much as 50 percent,” he said during an interview with The Mail on Sunday.

“They have now gone into reverse, so winters like this one will become much more likely. Summers will also probably be cooler, and all this may well last two decades or longer,” he said.

“The extreme retreats that we have seen in glaciers and sea ice will come to a halt. For the time being, global warming has paused, and there may well be some cooling.”

But amid bitter cold temperatures that froze much of Europe, Asia and North America last week, many insisted this was merely a ‘blip’ of no significance.

Britain’s BBC assured its viewers that the dramatic cold spell was merely short-term ‘weather’ unrelated to the ‘climate’, which was still warming.

But Professor Latif’s work and that of other scientists refutes that view. 

Although the current freezing temperatures are indeed a result of the ‘Arctic oscillation’, an anomaly consisting of a vast high-pressure system over Greenland that drives polar winds far to the south, meteorologists say it is the strongest for at least six decades. This has caused the jetstream that typically runs over the English Channel to run instead over the Strait of Gibraltar.

Professor Latif says this, in turn, results in much longer-term shifts known as the Pacific and Atlantic ‘multi-decadal oscillations’ (MDOs).

These effects are not confined to the Northern Hemisphere, according to Professor Anastasios Tsonis, who leads the University of Wisconsin Atmospheric Sciences Group. 

Professor Tsonis has recently shown that these MDOs move together in a synchronized fashion throughout the world, causing abrupt changes in the world’s climate from a ‘warm mode’ to a ‘cold mode’ and back again in 20- to 30-year cycles.

“They amount to massive rearrangements in the dominant patterns of the weather,” he told The Mail yesterday.

“And their shifts explain all the major changes in world temperatures during the 20th and 21st Centuries.”

“We have such a change now and can therefore expect 20 or 30 years of cooler temperatures.”

A strong warm mode occurred during the period from 1915 to 1940, reflected in rising temperatures, Professor Tsonis added. However, the world cooled from 1940 until the late Seventies, the last MDO cold-mode era, despite rising levels of atmospheric CO2.

Many of the consequences of the recent warm mode were also observed 90 years ago, The Mail reported, citing a 1922 Washington Post report that described Greenland’s disappearing glaciers and Arctic seals that found ‘the water too hot’. Indeed, warm Gulf Stream water was still detectable just a few hundred miles from the North Pole at the time.

In contrast, last week 56 percent of the surface of the United States was covered by snow, Professor Tsonis said.

“That hasn’t happened for several decades.”

“It just isn’t true to say this is a blip. We can expect colder winters for quite a while,” he said, adding that towards the end of the last cold mode the world’s media were consumed by fears of freezing.

The Mail cited a 1974 Time magazine cover entitled “Another Ice Age”.

“Man may be somewhat responsible, as a result of farming and fuel burning [which is] blocking more and more sunlight from reaching and heating the Earth,” the story read.

“Perhaps we will see talk of an ice age again by the early 2030s, just as the MDOs shift once more and temperatures begin to rise,” Tsonis told The Mail.

However, he is not a climate change ‘denier’, and attributes a small amount of ‘background’ warming to human activity and greenhouse gases. But he questions the dire predictions others have put forth.

“I do not believe in catastrophe theories. Man-made warming is balanced by the natural cycles, and I do not trust the computer models which state that if CO2 reaches a particular level then temperatures and sea levels will rise by a given amount.”

“These models cannot be trusted to predict the weather for a week, yet they are running them to give readings for 100 years.”

Professor Tsonis said he was flooded with ‘hate emails’ after publishing his work in the journal Geophysical Research Letters.

“People were accusing me of wanting to destroy the climate, yet all I’m interested in is the truth,” he said.

He also received complaints from climate change skeptics, many of whom said he had not gone far enough in debunking the theory of man-made global warming.

The work of Professors Latif and Tsonis raised a critical issue:  How much of the late 20th Century warming was caused not by carbon dioxide, but by MDOs?

While Tsonis did not give a figure, Latif suggested it could be somewhere between 10 and 50 percent.

Meanwhile, other critics of man-made global warming attribute an even greater role played by MDOs.

William Gray, emeritus Professor of Atmospheric Sciences at Colorado State University, said that while he believed greenhouse gases were responsible for some background rise in temperatures, the computer models used by global warming advocates had vastly exaggerated their effect.

These models, he said, distort the way the atmosphere works.

“Most of the rise in temperature from the Seventies to the Nineties was natural,” Professor Gray told The Mail.

“Very little was down to CO2; in my view, as little as five to ten percent,” he said.

Nevertheless, many passionate advocates of man-made global warming dismiss the idea that MDOs have any impact on the world’s climate.

In March 2000, Dr. David Viner, at the time a member of the University of East Anglia Climatic Research Unit, said that snowfall in Britain would become a very rare event within just a few years.

The University of East Anglia Climatic Research Unit is currently under investigation over the so-called ‘climategate’ leaked emails.

Dr. Viner, who now heads a British Council program that raises awareness of global warming among young people abroad, said last week he still stands by his prediction.

“We’ve had three weeks of relatively cold weather, and that doesn’t change anything. This winter is just a little cooler than average, and I still think that snow will become an increasingly rare event.”

Other scientists agree with Dr. Viner, saying the frigid weather that has engulfed enormous swathes of the northern hemisphere is unusual, but does not contradict an overall global trend of warming.

They, too, say the recent brutal snowstorms and freezing temperatures in North America, Northern Europe and parts of Asia are attributable to Arctic Oscillation, also known as Northern Hemisphere Annular Mode or the North Atlantic Oscillation.

“It’s a relatively abnormal pattern but it’s not unprecedented at all, it’s something that happens every 10 years or so,” said Barry Gromett of Britain’s Met Office in an interview with the AFP news agency.

“It’s like a great big boulder in the stream. It cuts off Europe’s supply of mild, moist Atlantic air. Instead, we get Arctic winds that feed in clockwise, which means we get the cold stuff off Scandinavia and the Arctic regions,” Gromett said.

These bitter cold air streams are also deflected around the “boulder” into North America, he said, and strengthen the grip of the Siberian high-pressure system, which intensifies cold weather in parts of Asia.

Gromett noted that while some parts of the world are experiencing extreme low temperatures, others are having unusual highs as a result of warmer winds directed to different areas.

Indeed, parts of Canada and Alaska have seen temperatures nine to 18 degrees Fahrenheit above normal, while parts of North Africa and the Mediterranean basin have also seen unusually warm temperatures.

“In fact, in the first week of January, Crete recorded a temperature of more than 30 C (86 F),” Gromett said.

Michel Daloz of the French national weather service Meteo France said this year’s cold spell in the northern hemisphere was relatively mild by historical comparison.

“The natural variability of the climate means that there are troughs of cold from time to time,” he told the AFP.

“There were temperatures of between -25 and -15 C (-13 F to 5 F) across France” in 1956, 1963 and 1985, he said.

Nor did it challenge data indicating persistent warming, he said.

“In fact, in early December, our main focus was on the clement weather.”

Indeed, the Met Office said that 2009 was provisionally the fifth warmest year on record, with 2010 potentially the warmest ever, due to man-made greenhouse gas emissions and a return to El Nino, a natural warming phenomenon triggered by unusually warm waters in the tropical Pacific Ocean.

El Nino reappeared in June 2009, and according to the UN’s World Meteorological Organization (WMO) will likely persist through early 2010.

WMO expert Omar Baddour said the present Arctic Oscillation was likely the most severe in 30 to 50 years.

“Generally it lasts a few weeks or a month, a month and a half. It started in December, so we are nearing the end of the episode,” Baddour said in Geneva.

FDA Warnings Associated With Reduced Atypical Antipsychotic Use Among Older Adults With Dementia

The use of atypical antipsychotics to treat elderly patients with dementia appears to have decreased following a 2005 Food and Drug Administration (FDA) advisory regarding the risks of these medications in this population, according to a report in the January 11 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.

Clozapine, the first second-generation or “atypical” antipsychotic medication, was introduced in the United States in 1989, according to background information in the article. Several additional drugs, including risperidone, olanzapine and paliperidone, followed. Although they are less likely to cause neurological adverse effects associated with conventional or “typical” antipsychotics, some reports have linked atypical antipsychotics to strokes, diabetes and other severe adverse events. In April 2005, the FDA issued a public health advisory that asked manufacturers to include a boxed warning regarding the increased risk of death associated with using atypical antipsychotics to treat behavioral symptoms in older patients with dementia (an off-label use of the drugs).

E. Ray Dorsey, M.D., M.B.A., of the University of Rochester Medical Center, New York, and colleagues analyzed nationally representative survey data to assess rates of atypical antipsychotic drug use between January 2003 and December 2008. Physicians participating in the survey recorded diagnoses, therapies and patient characteristics for all clinical encounters over a two-day period. The researchers calculated the number of patient-physician interactions during which an antipsychotic was mentioned as a therapy, and compared the periods before and after the FDA warning was issued to quantify its effect.
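The before-and-after comparison described above can be illustrated with a minimal sketch. The monthly counts below are invented placeholders, not the study's data, and the study's actual statistical model is not reproduced here; the sketch only shows the basic idea of comparing average monthly mention rates across the advisory date.

```python
# Illustrative sketch only: hypothetical monthly counts of antipsychotic
# "mentions", split at the April 2005 FDA advisory. Not the study's data.
before = [70, 72, 75, 74, 78, 80]   # mentions per month, pre-advisory
after = [66, 63, 60, 58, 55, 52]    # mentions per month, post-advisory

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)
change = (mean_after - mean_before) / mean_before

print(f"mean monthly mentions before: {mean_before:.1f}")
print(f"mean monthly mentions after:  {mean_after:.1f}")
print(f"relative change: {change:+.1%}")
```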

From January 2003 to March 2005, the rate of atypical drug mentions increased 34 percent per year, including a 16 percent increase among patients with dementia. In the year before the FDA advisory, approximately 13.6 million atypical antipsychotic mentions occurred, 0.8 million of which involved patients with dementia.

An overall decline in the use of atypical medications began within one month of the FDA advisory. Mentions of atypical antipsychotics decreased 2 percent overall, and 19 percent among those with dementia, in the year following the warning; by 2008, monthly use among elderly patients with dementia had decreased by more than 50 percent.

The use of these medications for both FDA-approved and off-label indications continued to decline across all populations through the end of the study period (December 2008). However, despite the uncertain benefits and the decrease in use among elderly patients with dementia, atypical antipsychotics still accounted for 9 percent of prescription drug uses for dementia among older adults at the end of 2008.

“The residual use in the population at risk and the decrease in the use of atypical antipsychotics in the general population, who were not targeted by the warning, raise the question as to whether the effect and specificity of FDA regulatory actions could be enhanced,” the authors conclude. “Targeting specific segments of patients and physicians (e.g., high prescribers) and further customizing and evaluating the impact of regulatory actions may improve their impact at minimizing the risks associated with select prescription medications.”

Researchers Trace HIV Mutations That Lead To Drug Resistance

Chemists at UC San Diego and statisticians at Harvard University have developed a novel way to trace mutations in HIV that lead to drug resistance. Their findings, once expanded to the full range of drugs available to treat the infection, would allow doctors to tailor drug cocktails to the particular strains of the virus found in individual patients.

“We want to crack the code of resistance,” said Wei Wang, associate professor of chemistry and biochemistry at UC San Diego, who led the collaboration along with Jun Liu of Harvard. The team reports its work in this week’s early online edition of the Proceedings of the National Academy of Sciences.

HIV replicates quickly, but the copies are imprecise. The constant mutation has made HIV infection difficult to treat, much less cure, because drugs designed to interrupt the cycle of infection fail when their targets change.

To better understand which mutations matter for drug resistance, the researchers compared sequences of HIV taken from patients treated with specific drugs to those from untreated patients. Using a novel statistical method, they identified clusters of mutations that seemed to be working together to help the virus escape treatment.
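The paper's statistical machinery isn't spelled out here, but the core idea of screening for mutations that co-occur more often in treated patients can be sketched in a few lines. Everything below, including the toy sequence sets and the use of Fisher's exact test as the association score, is an illustrative stand-in rather than the authors' method.

```python
from itertools import combinations
from scipy.stats import fisher_exact

# Toy data: each patient is represented by the set of protease positions
# that are mutated. Invented for illustration; not the study's data.
treated = [{10, 46, 54, 82}, {10, 46, 82}, {46, 54, 82}, {10, 54, 82}]
untreated = [{10}, set(), {54}, {10, 63}]

def cooccurrence_test(pair, with_drug, without_drug):
    """Fisher's exact test on how often a mutation pair appears together
    in treated versus untreated sequences."""
    a = sum(pair <= s for s in with_drug)       # pair present, treated
    b = len(with_drug) - a
    c = sum(pair <= s for s in without_drug)    # pair present, untreated
    d = len(without_drug) - c
    return fisher_exact([[a, b], [c, d]])       # (odds ratio, p-value)

positions = sorted(set().union(*treated, *untreated))
results = []
for pair in combinations(positions, 2):
    odds, p = cooccurrence_test(set(pair), treated, untreated)
    results.append((p, pair, odds))

# Report the pairs most enriched in treated patients (smallest p first).
for p, pair, odds in sorted(results)[:3]:
    print(f"positions {pair}: odds ratio {odds}, p = {p:.3f}")
```

With real data, pairs that survive such a screen would be the candidates handed to the structural modeling step described below.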

One drug, indinavir, targets a protein called protease, which the virus needs to assemble the capsule it uses to invade new cells. Substitutions in ten different places on protease occurred in patients who were taking the drug, but what combination of mutations would hinder the action of the drug wasn’t clear before this analysis.

Chemists can determine how a drug fits a particular protein using computer modeling, but those computations take considerable time. Evaluating all possible combinations of those 10 substitutions, each of which may be present or absent, would mean checking 2^10 = 1,024 variants, which is impractical. The statistical screen narrowed down the possibilities.

“People never looked at this, because they didn’t know which mutation or which combination of mutations to study,” Wang said. “That’s the advantage of using the statistical method first to find the patterns. After the statisticians discovered the connections between mutations, then we focused on those combinations. We built structural models to understand the molecular basis of drug resistance.”

Using the computing resources of the Center for Theoretical Biological Physics at UC San Diego where Wang is a senior scientist, they worked out how the substitutions would change the shape of protease and its affinity for the drug. One set of changes, for example, would tend to dislodge the drug from the pocket where it normally fits.

The researchers also determined that the mutations must happen in a particular order for the virus to survive treatment with indinavir, offering a window into how drug resistance develops.

Looking back into the database at samples taken from individual patients at several different times during the course of their treatment, the team found that mutations accumulated in the orders that they predicted would be possible during drug treatment. Sequential mutations that their models predicted would leave the virus vulnerable to drug treatment were not observed.

The team reports its results for two additional drugs, zidovudine and nevirapine, which target a different viral enzyme, in this paper and is extending its work to all nine drugs currently approved by the FDA to treat HIV.

Quantum Computer Calculates Exact Energy Of Molecular Hydrogen

Groundbreaking approach could impact fields from cryptography to materials science

In an important first for a promising new technology, scientists have used a quantum computer to calculate the precise energy of molecular hydrogen. This groundbreaking approach to molecular simulations could have profound implications not just for quantum chemistry, but also for a range of fields from cryptography to materials science.

“One of the most important problems for many theoretical chemists is how to execute exact simulations of chemical systems,” says author Alán Aspuru-Guzik, assistant professor of chemistry and chemical biology at Harvard University. “This is the first time that a quantum computer has been built to provide these precise calculations.”

The work, described this week in Nature Chemistry, comes from a partnership between Aspuru-Guzik’s team of theoretical chemists at Harvard and a group of experimental physicists led by Andrew White at the University of Queensland in Brisbane, Australia. Aspuru-Guzik’s team coordinated experimental design and performed key calculations, while his partners in Australia assembled the physical “computer” and ran the experiments.

“We were the software guys,” says Aspuru-Guzik, “and they were the hardware guys.”

While modern supercomputers can perform approximate simulations of simple molecular systems, increasing the size of the system results in an exponential increase in computation time. Quantum computing has been heralded for its potential to solve certain types of problems that are impossible for conventional computers to crack.

Rather than using binary bits labeled as “zero” and “one” to encode data, as in a conventional computer, quantum computing stores information in qubits, which can represent both “zero” and “one” simultaneously. When a quantum computer is put to work on a problem, it considers all possible answers by simultaneously arranging its qubits into every combination of “zeroes” and “ones.”

Since one sequence of qubits can represent many different numbers, a quantum computer would make far fewer computations than a conventional one in solving some problems. After the computer’s work is done, a measurement of its qubits provides the answer.
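The scaling behind this claim is easy to see in a toy simulation. The sketch below is a classical simulation written purely for illustration, not how a physical quantum computer or the photonic experiment works: it builds an n-qubit state vector, showing that n qubits require 2^n amplitudes, and that a uniform superposition assigns weight to every bit string at once.

```python
import numpy as np

# One qubit: amplitudes for |0> and |1>. A Hadamard gate puts it into an
# equal superposition of both basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])

# n qubits: the joint state is a tensor (Kronecker) product, so the state
# vector holds 2**n amplitudes -- the exponential blow-up that makes
# classical simulation of large quantum systems intractable.
n = 3
state = zero
for _ in range(n - 1):
    state = np.kron(state, zero)
for q in range(n):
    # Apply H to qubit q by Kronecker-expanding it to the full register.
    op = np.array([[1.0]])
    for i in range(n):
        op = np.kron(op, H if i == q else np.eye(2))
    state = op @ state

print(f"{n} qubits -> {state.size} amplitudes")

# Measuring samples one bit string with probability |amplitude|^2.
probs = np.abs(state) ** 2
outcome = int(np.random.choice(state.size, p=probs))
print(f"measured bit string: {outcome:0{n}b}")
```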

“Because classical computers don’t scale efficiently, if you simulate anything larger than four or five atoms — for example, a chemical reaction, or even a moderately complex molecule — it becomes an intractable problem very quickly,” says author James Whitfield, research assistant in chemistry and chemical biology at Harvard. “Approximate computations of such systems are usually the best chemists can do.”

Aspuru-Guzik and his colleagues confronted this problem with a conceptually elegant idea.

“If it is computationally too complex to simulate a quantum system using a classical computer,” he says, “why not simulate quantum systems with another quantum system?”

Such an approach could, in theory, result in highly precise calculations while using a fraction of the resources of conventional computing.

While a number of other physical systems could serve as a computer framework, Aspuru-Guzik’s colleagues in Australia used the information encoded in two entangled photons to conduct their hydrogen molecule simulations. Each calculated energy level was the result of 20 such quantum measurements, yielding a highly precise value for each geometric state of molecular hydrogen.

“This approach to computation represents an entirely new way of providing exact solutions to a range of problems for which the conventional wisdom is that approximation is the only possibility,” says Aspuru-Guzik.

Ultimately, the same quantum computer that could transform Internet cryptography could also calculate the lowest energy conformations of molecules as complex as cholesterol.

Aspuru-Guzik and Whitfield’s Harvard co-authors on the Nature Chemistry paper are Ivan Kassal, Jacob D. Biamonte, and Masoud Mohseni. Financial support was provided by the US Army Research Office and the Australian Research Council Federation Fellow and Centre of Excellence programs. Aspuru-Guzik recently received support from the DARPA Young Investigator Program, the Alfred P. Sloan Foundation, and the Camille and Henry Dreyfus Foundation to pursue research towards practical quantum simulators.

Detecting Toxins In Drinking Water

A strip of paper infused with carbon nanotubes can quickly and inexpensively detect a toxin produced by algae in drinking water.

Engineers at the University of Michigan led the development of the new biosensor.

The paper strips perform 28 times faster than the complicated method most commonly used today to detect microcystin-LR, a chemical compound produced by cyanobacteria, or blue-green algae, which are commonly found in nutrient-rich waters.

Microcystin-LR (MC-LR), even in very small quantities, is suspected to cause liver damage and possibly liver cancer. The substance and others like it are among the leading causes of biological water pollution. It is believed to be a culprit of mass poisonings going back to early human history, said Nicholas Kotov, a professor in the departments of Chemical Engineering, Biomedical Engineering and Materials Science and Engineering who led the project.

Water treatment plants, even in developed countries, can’t always remove MC-LR completely, nor can they test for it often enough, Kotov said. The biosensor he and his colleagues developed provides a quick, cheap, portable and sensitive test that could allow water treatment plants and individuals to verify the safety of water on a more regular basis.

“The safety of drinking water is a vital issue in many developing countries and in many parts of the United States,” Kotov said. “We’ve developed a simple and inexpensive technology to detect multiple toxins.”

The technology could easily be adapted to detect a variety of harmful chemicals or toxins in water or food.

A paper about the technique is published online in Nano Letters. It will soon be available in the journal’s print edition.

The sensor works by measuring the electrical conductivity of the nanotubes in the paper. Before the nanotubes are impregnated into the paper, they are mixed with antibodies for MC-LR. When the paper strips come into contact with water contaminated with MC-LR, those antibodies squeeze in between the nanotubes to bond with the MC-LR, and this spreading apart of the nanotubes changes their electrical conductivity.

An external monitor measures the electrical conductivity. The whole device is about the size of a home pregnancy test, Kotov said. Results appear in less than 12 minutes.
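In practice, a reader like the external monitor would map a measured conductivity change onto an estimated toxin concentration via a calibration curve. The sketch below is a hypothetical illustration of that step: the calibration points are invented for the example rather than taken from the paper, with the estimate checked against the World Health Organization's provisional 1 microgram-per-liter drinking-water guideline for microcystin-LR.

```python
import numpy as np

# Hypothetical calibration curve: relative drop in conductivity measured
# for known MC-LR concentrations (ug/L). Invented values, illustration only.
calib_conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])       # ug/L
calib_drop = np.array([0.00, 0.04, 0.09, 0.16, 0.30])  # fractional drop

def estimate_concentration(measured_drop):
    """Interpolate a measured conductivity drop back to a concentration."""
    return np.interp(measured_drop, calib_drop, calib_conc)

measured = 0.12  # fractional conductivity drop from the strip reader
conc = estimate_concentration(measured)
print(f"estimated MC-LR: {conc:.2f} ug/L")

# WHO's provisional drinking-water guideline for microcystin-LR is 1 ug/L.
print("exceeds guideline" if conc > 1.0 else "within guideline")
```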

To adapt the biosensor for other toxins, Kotov said, scientists could simply replace the antibodies that bond to the toxin.

The paper is called “Simple, Rapid, Sensitive and Versatile SWNT-Paper Sensor for Environmental Toxin Detection Competitive with ELISA.” It is available online at http://pubs.acs.org/doi/abs/10.1021/nl902368r.

This research was done in collaboration with the laboratory of professor Chuanlai Xu at Wuxi University in China. It is funded by the National Science Foundation, the Air Force Office of Scientific Research, and the National Institutes of Health, as well as the National Science Foundation of China and the 11th Five Years Key Programs for Science and Technology Development of China.

The university is pursuing patent protection for the intellectual property, and is seeking commercialization partners to help bring the technology to market.
