2 Treatment Strategies For Severe Sepsis Show Similar Survival Rates

A comparison of two strategies for treating severe sepsis or septic shock finds that a protocol guided by lactate levels measured in blood samples produced a short-term survival rate similar to that of a regimen guided by central venous oxygen saturation measured with a specialized catheter, according to a study in the February 24 issue of JAMA.

In the United States, the rate of severe sepsis hospitalizations has doubled during the last decade, with estimates indicating that at least 750,000 persons are affected annually. Approximately 500,000 patients with severe sepsis in the United States are initially treated in emergency departments every year. Among the suggested treatment strategies, the method of determining tissue oxygen delivery remains a point of controversy, according to background information in the article.

“Citing a single-center study, the Surviving Sepsis Campaign guidelines recommend the use of central venous oxygen saturation (ScvO2) or mixed venous oxygen saturation to assess the balance of tissue oxygen delivery and consumption; however, since its publication in 2001 a substantial amount of controversy about this single-center study has been generated in the scientific community. Additionally, recently published practice surveys have indicated that the time, expertise, and specialized equipment required to measure ScvO2 collectively pose a major barrier to the implementation of protocol-driven quantitative resuscitation programs. In contrast, lactate clearance, derived from calculating the change in lactate concentration from 2 blood specimens drawn at different times, potentially represents a more accessible method to assess tissue oxygen delivery,” the authors write.

Alan E. Jones, M.D., of the Carolinas Medical Center, Charlotte, N.C., and colleagues compared outcomes between early resuscitation for patients with severe sepsis or septic shock targeting lactate clearance as the marker of adequate oxygen delivery vs. targeting ScvO2 measured using a central venous catheter connected to a computerized system. The primary measured outcome was death while in the hospital. The randomized trial included 300 patients with severe sepsis and evidence of hypoperfusion (decreased blood flow to the body tissues) or septic shock who were admitted to the emergency department at 1 of 3 hospitals between January 2007 and January 2009. The patients were randomly assigned to one of the two resuscitation protocols.

The researchers found that 34 patients (23 percent) in the ScvO2 group died while in the hospital, compared with 25 (17 percent) in the lactate clearance group; the observed difference did not reach the predefined threshold difference of 10 percent. There were no differences in treatment-related adverse events between the groups.
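
As an illustrative aside (not the trial's formal statistical analysis, which relied on confidence intervals around the difference in proportions), the arithmetic behind that comparison can be sketched from the reported percentages:

```python
# Illustrative check of the reported in-hospital mortality comparison.
# Figures come from the article; the 10-percentage-point margin is the
# predefined threshold mentioned above. This is a sketch only.

scvo2_mortality = 0.23      # 34 deaths in the ScvO2 group
lactate_mortality = 0.17    # 25 deaths in the lactate clearance group
threshold = 0.10            # predefined threshold difference

difference = scvo2_mortality - lactate_mortality
print(f"Observed difference: {difference:.0%}")                  # 6%
print(f"Within the 10% threshold: {abs(difference) < threshold}")  # True
```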

“These data support the substitution of lactate measurements in peripheral venous blood as a safe and efficacious alternative to a computerized spectrophotometric catheter in the resuscitation of sepsis,” the authors write.

(JAMA. 2010;303[8]:739-746)

Editorial: Disassembling Goal-Directed Therapy for Sepsis – A First Step

In an accompanying editorial, Roger J. Lewis, M.D., Ph.D., of the Harbor-UCLA Medical Center, Los Angeles, comments on the findings of this study.

“…the study by Jones et al is an important first step to identifying less burdensome approaches to the initial management of critically ill patients with severe sepsis and septic shock. Substantial further progress most likely will depend on appropriately designed, rigorously conducted clinical trials (requiring novel strategies, such as adaptive design) that can efficiently and practically address the complicated questions inherent in identifying the optimal and least burdensome combination of resuscitation targets.”

(JAMA. 2010;303[8]:777-779)

Infection Deaths From U.S. Hospitals

A new study has shown that nearly 50,000 U.S. medical patients die every year of blood poisoning or pneumonia picked up in hospitals, AFP reported.

The study, led by researchers from the Center for Disease Dynamics, Economics and Policy at Washington-based Resources for the Future, showed that hospital-acquired sepsis and pneumonia in 2006 claimed 48,000 lives, led to 2.3 million extra patient-days in hospital and cost 8.1 billion dollars.

The study, published in the Archives of Internal Medicine on Monday, said the two hospital-acquired infections — also called nosocomial infections — accounted for about one-third of the 1.7 million infections U.S. patients pick up every year while in the hospital.

Nearly half of the 99,000 deaths a year from hospital-acquired infections reported by the Centers for Disease Control and Prevention (CDC) are also likely caused by the two infections.

According to the study, patients who underwent invasive surgery during their initial hospitalization were more likely to pick up a secondary infection while in the hospital, and elective surgery patients were at even higher risk of nosocomial infection.

The researchers estimated that 290,000 patients in U.S. hospitals picked up sepsis, or blood poisoning, during their hospitalization in 2006, and 200,000 developed pneumonia.

The study used the largest database of hospital records in the United States, which covered hospital discharges in 40 states.

It was discovered that hospital-acquired pneumonia extended a patient’s stay in the hospital by 14 days and added some 46,400 dollars to the final price tag, while sepsis extended the time spent in hospital by nearly 11 days and added 32,900 dollars on average to the final bill.

Ramanan Laxminarayan, one of the lead authors of the study, said improving hygiene in clinical settings could prevent the two infections and others picked up in hospitals.

“The magnitude of harm from these infections is deplorable and it is unconscionable that patients continue to experience harm from their interactions with the health system,” said two critical care doctors, in a commentary piece also published in the Archives of Internal Medicine.

David Murphy and Peter Pronovost of Johns Hopkins University’s department of medicine wrote in the commentary: “What is glaringly obvious is that preventable harm remains a substantial problem and that investments in research to reduce these harms are woefully inadequate given the magnitude of the problem.”

Sperm Whales Team Up To Corral Squid

A new study suggests that sperm whales may team up and work cooperatively to hunt down and corral their food.

Scientists from the U.S. used high-tech GPS tags to study the marine mammals’ astonishing hunting behaviors. The tracking equipment showed how the animals traveled together in groups, but when it came time to hunt for food, individual whales took on different roles within the group.

The study, led by Professor Bruce Mate from the Hatfield Marine Science Center in Oregon, used new equipment to tag and follow the giant sea creatures. “We have [a tag with] GPS precision for the whales’ movements and a time and depth record of their dives,” Mate told BBC News. “And, for the first time, we have tagged several animals within the same group.”

Evidence showed that the whales stayed close together over several months in and around the Gulf of Mexico. But when the animals made their dives to hunt for food, their behavior varied with each dive.

Pointing to the evidence, Professor Mate said: “We can see that they’re actually changing their role over time.” The team speculated that when they dive, often as deep as 3,300 feet, they are hunting and “herding a ball of squid.”

Mate said that some whales seemed to be guarding the bottom of the “bait ball”, keeping the prey from escaping downward, while other animals in the group concentrated on the center of the ball itself. It seemed that the whales took turns diving to the physiologically demanding depths, he added.

Professor Hal Whitehead, a researcher from Dalhousie University in Nova Scotia, told BBC News he was impressed by the data, but disagreed with the suggestion that they were herding squid, saying it seemed a little “far-fetched”.

However, Dr. Mate pointed to evidence from previous research showing that dolphins engage in similar behavior. In that study, scientists captured footage of dolphins herding a ball of fish and appearing to take turns diving through the ball, each taking in a mouthful of fish.

With the whales, it was difficult to capture their behavior, because they dive to far greater depths than dolphins, explained Mate.

“Our next step will be to image the squid at the same time as tracking the whales,” he said. The team also plans to tag more members of the same group to gain an even better understanding of how the social creatures collaborate.

The findings of the study were announced by Professor Mate and his colleagues at the Ocean Sciences meeting in Portland, Oregon.

Wingless Mosquitoes May Help Control Dengue Fever

Genetic approach could safely reduce or eliminate spread of disease affecting millions

A new strain of mosquitoes in which females cannot fly may help curb the transmission of dengue fever, according to UC Irvine and British scientists.

Dengue fever causes severe flulike symptoms and is among the world’s most pressing public health issues. There are 50 million to 100 million cases per year, and nearly 40 percent of the global population is at risk. The dengue virus is spread through the bite of infected female Aedes aegypti mosquitoes, and there is no vaccine or treatment.

UCI researchers and colleagues from Oxitec Ltd. and the University of Oxford created the new breed. Flightless females are expected to die quickly in the wild, curtailing the number of mosquitoes and reducing, or even eliminating, dengue transmission. Males of the strain can fly but do not bite or convey disease.

When genetically altered male mosquitoes mate with wild females and pass on their genes, females of the next generation are unable to fly. Scientists estimate that if released, the new breed could sustainably suppress the native mosquito population in six to nine months. The approach offers a safe, efficient alternative to harmful insecticides.
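
The toy simulation below is purely illustrative and is not the researchers' model; it shows, under arbitrary assumptions about release ratio, growth, and mating, how a dominant female-flightless trait carried by released males could shrink the flying female population over successive generations. The time scale depends entirely on the assumed parameters.

```python
# Toy, illustrative model of population suppression by releasing males that
# carry a dominant female-flightless trait. All parameters (release ratio,
# growth rate, generation count) are arbitrary assumptions; this is not the
# model used by the UCI/Oxitec researchers.

def simulate(generations=9, wild_females=10_000, release_ratio=5.0, growth=1.5):
    """Track the number of wild (flying) females per generation."""
    for g in range(1, generations + 1):
        wild_males = wild_females                   # assume a 1:1 wild sex ratio
        released_males = release_ratio * wild_females  # releases scaled to current wild numbers
        # Probability a female mates with a wild (non-transgenic) male.
        p_wild_mating = wild_males / (wild_males + released_males)
        # Only daughters fathered by wild males can fly and reproduce.
        wild_females = wild_females * growth * p_wild_mating
        print(f"generation {g}: ~{wild_females:,.0f} flying females")

simulate()
```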

Study results appear in the early online edition of the Proceedings of the National Academy of Sciences for the week of Feb. 22. The research is receiving funding support from the Foundation for the National Institutes of Health through the Grand Challenges in Global Health initiative, which was launched to support breakthrough advances for health challenges in the developing world.

“Current dengue control methods are not sufficiently effective, and new ones are urgently needed,” said Anthony James, Distinguished Professor of microbiology & molecular genetics and molecular biology & biochemistry at UCI and an internationally recognized vector biologist. “Controlling the mosquito that transmits this virus could significantly reduce human morbidity and mortality.”

Using concepts developed by Oxitec’s Luke Alphey, the study’s senior author, researchers made a genetic alteration in the mosquitoes that disrupts wing muscle development in female offspring, rendering them incapable of flight. Males’ ability to fly is unaffected, and they show no ill effects from carrying the gene.

“The technology is completely species-specific, as the released males will mate only with females of the same species,” Alphey said. “It’s far more targeted and environmentally friendly than approaches dependent upon the use of chemical spray insecticides, which leave toxic residue.”

“Another attractive feature of this method is that it’s egalitarian: All people in the treated areas are equally protected, regardless of their wealth, power or education,” he added.

James and Alphey have pioneered the creation of genetically altered mosquitoes to limit transmission of vector-borne illnesses. While their current work is focused on the dengue fever vector, they noted that this approach could be adapted to other mosquito species that spread such diseases as malaria and West Nile fever.

Image Courtesy CDC

Typhoid Fever Bacteria Gather in Communities on Gallstones

COLUMBUS, Ohio – A new study suggests that the bacteria that cause typhoid fever collect in tiny but persistent communities on gallstones, making the infection particularly hard to fight in so-called “carriers” – people who have the disease but show no symptoms.

Humans who harbor these bacterial communities in their gallbladders, even without symptoms, are able to infect others with active typhoid fever, especially in developing areas of the world with poor sanitation. The disease is transmitted through fecal-oral contact, such as through poor hand-washing by people who prepare food.

Typhoid fever is rare in the United States, but it affects an estimated 22 million people worldwide, causing symptoms that include a high fever, headache, weakness and fatigue, and abdominal pain. It leads to hundreds of thousands of deaths each year.

Scientists and physicians have known for decades that these bacteria, Salmonella enterica serovar Typhi, accumulate in the gallbladder. In fact, the most widely accepted treatment of chronic typhoid infection is removal of the gallbladder.

“We’re trying to get to the heart of why this is. Why does Salmonella sit in a pool of highly concentrated detergent, which is what bile is, but not die?” said John Gunn, professor of molecular virology, immunology and medical genetics at Ohio State University and senior author of the study. “It’s got to survive in some way, and a good way to survive is by forming a biofilm.”

Biofilms – in this case, the collection of bacteria on gallstones – typically do not respond well to antibiotics or the human immune response. But now that the biofilms themselves have been discovered in association with asymptomatic typhoid infection, they present a potential treatment alternative to expensive and invasive gallbladder removal, Gunn said.

Specifically, targeting a sugar polymer on the bacterial surface that promotes development of the biofilm might be a strategy to prevent biofilm formation in the first place, he said.

The research appears this week in the online early edition of the Proceedings of the National Academy of Sciences.

Gunn and colleagues observed this biofilm formation in mice infected with a strain of Salmonella bacteria similar to the strain that causes typhoid fever in humans. The scientists also detected these biofilms on gallstones in about 5 percent of humans in a Mexican hospital who had their gallbladders removed because of complications from gallstones. Typhoid fever is widespread in Mexico.

“The mouse data coupled with the human data suggest strongly that biofilms lay a foundation that allows for establishment and maintenance of chronic typhoid infection,” said Gunn, also a vice director of Ohio State’s Center for Microbial Interface Biology.

And the researchers suspect biofilms are at play in the gallbladder’s association with typhoid fever because in most cases, the only way to treat a biofilm-related infection is to remove whatever the biofilm has attached to from the body. For example, infections that form on catheters, implanted joints or artificial heart valves typically result from biofilms, and the only way to clear the infection is to remove those devices.

“Information in our lab and in the literature that gallstones were associated with how people became carriers of typhoid bacteria, that organisms were confined to one site, and that antibiotics are ineffective so one has to remove the gallbladder for successful therapy – it all fit with biofilm-related disease,” Gunn said.

In the study, the researchers fed mice either normal food or a high-cholesterol diet for eight weeks, intending to induce gallstones in the animals on the fatty diet. The scientists then gave these mice a type of Salmonella bacteria designed to mimic a chronic human typhoid infection without causing actual illness in the mice. A control group of mice received no bacteria.

The number of bacteria harbored in the gallbladders of mice with gallstones increased over time, becoming abundant within 21 days, and was significantly higher than bacteria in mice that did not have any stones. No bacteria were detected in mice that weren’t given the infection, even if they had gallstones.

In the infected mice, the Salmonella bacteria also could be seen in the gallbladder lining and in bile as well as on the surface of the gallstones. The gallstones were the focus of this study because Gunn’s lab has determined in previous experiments that Salmonellae are attracted to cholesterol-coated surfaces.

There are two common types of gallstones: cholesterol stones, and brown or black stones composed primarily of calcium bilirubinate, which can be found in bile. Gunn’s test-tube research to date had suggested that Salmonella Typhi bacteria bind particularly well to cholesterol gallstones to form biofilms, and this current study supported that.

Three weeks after infection, biofilms covered about 50 percent of the surfaces of the gallstones removed from the infected mice.

“What we think is that having gallstones makes you more susceptible to becoming a carrier because it provides that environment for Salmonella to bind to the surface, form a biofilm and establish infection,” Gunn said. “Whether that happens 100 percent of the time, nobody knows.”

In a second component of the mouse study, the researchers tested fresh fecal pellets from infected mice to test the association between gallstone biofilms and transmission of a typhoid-like infection via feces, a phenomenon called “shedding.” The mice with gallstones shed three times more bacteria than did infected mice without gallstones.

“The mice that had gallstones and were infected with bacteria had a much higher rate of shedding, meaning those bacteria were released, probably because they had more bacteria in the gallbladder itself,” Gunn said.

The mouse data not only supported Gunn’s hypothesis that gallstones present at least one surface on which Salmonella biofilms form and maintain the carrier state of typhoid fever; the researchers also realized they had developed a new mouse model for further study of asymptomatic typhoid carriage.

Gunn and colleagues also obtained data from humans at a hospital in Mexico whose gallbladders were removed as a treatment for gallstone complications. Though none of the patients had ever shown symptoms for typhoid fever, 5 percent of them ended up being carriers of Salmonella Typhi bacteria biofilms on their gallstones. In the single patient determined to be a typhoid carrier who didn’t have biofilm on his gallstones, the stones were dark in color, suggesting they were likely composed of something other than cholesterol, Gunn said.

This single patient’s ability to harbor latent bacteria leads Gunn and colleagues to suspect that biofilms can also form elsewhere in the gallbladder, perhaps in its lining or persisting within specific cells of the gallbladder wall. Gunn’s lab is exploring those possibilities.

This work is supported by the National Institutes of Health and a graduate education fellowship from Ohio State’s Public Health Preparedness for Infectious Diseases initiative.

Co-authors of the study are Robert Crawford of the Center for Microbial Interface Biology and Department of Molecular Virology, Immunology and Medical Genetics at Ohio State; Roberto Rosales-Reyes and María de la Luz Ramírez-Aguilar of the Universidad Nacional Autonoma de Mexico; Oscar Chapa-Azuela of Hospital General de Mexico; and Celia Alpuche-Aranda of the Instituto Nacional de Referencia Epidemiologica in Mexico.

On the Net:

Ohio State University

Marijuana Use On The Rise With Baby Boomers

A new study reveals a growing trend among the baby boomer generation: use of the country’s most popular illicit drug.

The Substance Abuse and Mental Health Services Administration said the proportion of people born in the 1960s and 1970s who reported using marijuana went up from 1.9 percent to 2.9 percent in just six years.

The most dramatic rise was among 55- to 59-year-olds, whose use of the drug tripled from 1.6 percent in 2002 to 5.1 percent in 2008.

Researchers believe that as the 78 million boomers age, there will be further increases.

Some started using it for recreation or as a way to cope with the aches and pains of getting older, while others never stopped using it.

Political advocates for legalizing marijuana say the number of elderly users may represent an important push to change the laws on the substance.

“For the longest time, our political opponents were older Americans who were not familiar with marijuana and had lived through the ‘Reefer Madness’ mentality and they considered marijuana a very dangerous drug,” Keith Stroup, the founder and lawyer of NORML, a marijuana advocacy group, told the Associated Press.

“Now, whether they resume the habit of smoking or whether they simply understand that it’s no big deal and that it shouldn’t be a crime, in large numbers they’re on our side of the issue.”

Stroup says that every night he sits down to the evening news, pours himself a glass of wine and rolls a joint.  The 66-year-old has used the drug since he was a freshman at Georgetown, but many older adults are revisiting the drug after years of not partaking in it.

“The kids are grown, they’re out of school, you’ve got time on your hands and frankly it’s a time when you can really enjoy marijuana,” Stroup said. “Food tastes better, music sounds better, sex is more enjoyable.”

Marijuana is credited with helping aches and pains, glaucoma, macular degeneration, and other age-related problems. So far, 14 states have passed laws to allow the use of marijuana for medical purposes, but those living in other states must obtain the drug illegally.

However, there are risks of health problems from aging that can be exacerbated by the regular use of marijuana.

Dr. William Dale, chief of geriatrics and palliative medicine at the University of Chicago Medical Center, told AP that older users might be at risk for falls if they become dizzy.  He also said smoking it increases the risk of heart disease and it can cause cognitive impairment.

He said he’d caution using it even if a patient cites benefits.

“There are other better ways to achieve the same effects,” he told the news agency.

The director of applied studies at the Substance Abuse and Mental Health Services Administration, Pete Delany, said boomers’ drug use defied stereotypes but was important to address.

“When you think about people who are 50 and older you don’t generally think of them as using illicit drugs – the occasional Hunter Thompson or the kind of hippie dippie guy that gets a lot of press maybe,” he told AP. “As a nation, it’s important to us to say, ‘It’s not just young people using drugs, it’s older people using drugs.'”

Older marijuana smokers say that they prefer to smoke in private. They also say the quality and price of the drug have increased substantially since their youth.

Thesis: Malaria Research Must Be Based In Africa

Gunilla Priebe has studied the international research alliance the Multilateral Initiative on Malaria (MIM), which advocates for malaria research in general and the strengthening of research environments in Africa. Malaria research has historically been controlled by interests located in areas outside Africa. This has led to a huge gap in knowledge in relation to the malaria problems that dominate everyday life in those areas where people are most affected by the disease.

Better opportunities for researchers on site

According to MIM, researchers based in Africa have a more comprehensive understanding of malaria and its effects on the population. In addition, a locally based researcher is often more motivated to solve problems that are of considerable significance for the majority of malaria patients.

“Proximity to the environment where the social, political, economic as well as the biological dynamics related to malaria are evident provides the researcher with better opportunities to formulate relevant research questions. If the research is based in Africa it increases the chances of the results being of some practical use,” says Gunilla Priebe.

Africanization brings fresh approach

Gunilla Priebe’s analysis is framed by the concept of “Africanization”, which in relation to scientific knowledge production entails two integrated themes: the importance of time and location for researchers’ ability to represent a study object correctly and in a relevant manner, and the impact of remnants of colonialism on the production of scientific knowledge.

According to Gunilla Priebe, the study of MIM shows that Africanization of malaria research means investments in infrastructure, education and improved forms for research cooperation. In such a case, Africanization will also lead to innovative approaches when it comes to research methods and arguments, as well as enhanced influence from both patient and researcher groups that have in the past been marginalized within malaria research.

Definition of researcher’s role

Based on the conclusions drawn from her theory-of-science study of MIM, Gunilla Priebe’s recommendation to research organizations, such as philanthropic foundations and national aid and research bodies, is to evaluate the role of research for development in the same way as other “foreign”-initiated development cooperation.

“Naturally more money for research into malaria is welcomed, but if the organizations that support research in and about Africa don’t intend to reproduce colonial methods, then these cannot work on the basis of utopian ideals of scientific autonomy. They must also take such issues as the right to co-ownership at each stage of knowledge production into consideration. Neither can they focus solely on financial support or the content and organization of the research; they must also engage with the political and social effects of research and research support,” says Gunilla Priebe.

Special Issue Of NeuroRehabilitation Focuses On Hypoxic-Ischemic Brain Injuries

IOS Press announces publication of a special issue of NeuroRehabilitation: An International Journal (NRE) devoted specifically to hypoxic-ischemic brain injury (HI-BI), a significant disruption of brain function due to a deficient supply of oxygen to the brain. This is the first publication to present a consolidated overview of HI-BI. It provides a thorough review of neuropathophysiology, neuroimaging assessment, and evaluation and management of the neurological and neurobehavioral sequelae of these injuries in adults and children.

“This special issue of NeuroRehabilitation on hypoxic-ischemic brain injury will serve as an excellent resource for clinicians assessing and treating this unique patient group given the absence of a comprehensive source of clinical information of this scope and detail,” comments NRE Co-Editor Nathan Zasler, MD, FAAPM&R, FACRM, CBIST, CEO and Medical Director of Tree of Life Services, Inc and Concussion Care Centre of Virginia, Ltd., as well as Clinical Professor of Physical Medicine and Rehabilitation at Virginia Commonwealth University, Richmond, VA.

Guest Editor of this special issue David B. Arciniegas, Director of the Neurobehavioral Disorders Program and Associate Professor of Psychiatry and Neurology, University of Colorado School of Medicine; and Medical Director, Brain Injury Rehabilitation Unit, HealthONE Spalding Rehabilitation Hospital, talks about the challenges of treating patients with HI-BI. He states, “As with the approach to HI-BI adopted in the TBI (Traumatic Brain Injury) Act of 2008, applying a certain measure of care-by-analogy is understandable and unavoidable; doing so allows those of us working with persons with HI-BI and their families to organize and deliver care that supports their neurological and functional recovery, assists with adaptation to disability, and, to the greatest extent possible, facilitates re-entry into the community and workforce. … We hope that our readers and others interested in this subject will find this issue of NeuroRehabilitation informative and useful.”

The issue includes contributions by globally recognized experts:

* Hypoxic-ischemic brain injury: Addressing the disconnect between pathophysiology and public policy. David B. Arciniegas, HealthONE Spalding Rehabilitation Hospital and University of Colorado Denver, introduces the issue and defines the set of clinical conditions within the spectrum of HI-BI.

* Hypoxic-ischemic brain injury – pathophysiology, neuropathology and mechanisms. Katharina M. Busl, Massachusetts General Hospital, and David M. Greer, Harvard Medical School, contribute a critical overview of the pathophysiology of HI-BI.

* Neuroimaging of hypoxic-ischemic brain injury. Deborah M. Little, Marilyn F. Kraus, Catherine Jiam, Michael Moynihan, Michelle Siroko, Evan Schulze and Elizabeth K. Geary, University of Illinois College of Medicine, discuss neuroimaging techniques and their current and potential applications to the clinical evaluation of persons with HI-BI.

* Neurocognitive outcomes following neonatal encephalopathy. Jennifer Armstrong-Wells, Timothy J. Bernard, Richard Boada and Marilyn Manco-Johnson, The Children’s Hospital and the University of Colorado Denver, focus on perinatal HI-BI, or neonatal encephalopathy.

* Neurological sequelae of hypoxic-ischemic brain injury. Christine Lu-Emerson and Sandeep Khot, University of Washington, explore the neurological aftermath of HI-BI injuries.

* Cognitive sequelae of hypoxic-ischemic brain injury: A review. C. Alan Anderson, Denver Veterans Affairs Medical Center and University of Colorado Denver, and David B. Arciniegas, HealthONE Spalding Rehabilitation Hospital and University of Colorado Denver, offer a review of the broad spectrum of post-hypoxic cognitive impairments and their treatments.

* The syndrome of delayed post-hypoxic leukoencephalopathy. David Shprecher, University of Utah, and Lahar Mehta, Evergreen Neuroscience Institute, address the under-recognized problem of delayed post-hypoxic leukoencephalopathy.

* Hypobaric hypoxic cerebral insults: The neurological consequences of going higher. Edward H. Maa, Denver Health and Hospitals and University of Colorado Denver, offers insights on hypobaric (high-altitude) HI-BI.

* Neurological and neurobehavioral sequelae of obstructive sleep apnea. Jean C.G. Tsai, University of Colorado Denver, provides a thorough review of the HI-BI consequences of obstructive sleep apnea (OSA).

Dolphins Enlisted In Diabetes Research

Apart from humans, dolphins are the only animals to develop a natural form of type 2 diabetes, researchers recently discovered.

An American study found that bottlenose dolphins develop insulin resistance similar to that seen in humans. Unlike humans, however, dolphins are able to turn the condition on and off when appropriate, so it does not harm the animal.

Research leader and veterinarian Stephanie Venn-Watson of the US National Marine Mammal Foundation said that these findings could have profound implications for the disease that is linked to one in twenty deaths.

The dolphin could be an invaluable model for researchers seeking to uncover the mysteries of type 2 diabetes. If researchers can figure out how the marine mammals switch off the insulin resistance before it becomes harmful, that insight might point toward a cure.

Bottlenose dolphins are “an important, natural and long-lived model for insulin resistance and diabetes,” Venn-Watson told the San Diego conference of the American Association for the Advancement of Science. She hopes the discovery will lead scientists to find a way to prevent, treat and possibly cure diabetes.

She stressed that the research does not intend to use dolphins as laboratory animals, but rather to study their genetic code and physiology through blood and urine samples, which could provide important clues about the biological makeup of diabetes.

Researchers made the surprising discovery while studying more than 1,000 blood samples collected from 52 dolphins. They found that the dolphins’ blood sugar remained elevated when they fasted overnight, and their blood chemistry changed in ways similar to that of human diabetics. When the dolphins were fed, however, their blood reverted to normal, unlike in humans.

Dr. Venn-Watson said that there may be some beneficial reason why dolphins control their own diabetes. Their fish diet is high in protein and low in sugar, and they often go long periods without eating, yet they have very high energy demands and a large brain. By having a resistance to insulin while fasting, they may be able to supply enough sugar to the brain. Once they have eaten, the resistance stops to prevent damage.

“We propose that, while some people may eat high-protein diets to help control diabetes, dolphins appear to have developed a diabetes-like state to support a high-protein diet,” she said.

Dr. Venn-Watson and her team are hoping to find a genetic fasting switch that dolphins may have to turn diabetes on and off. “Finding and controlling such a switch could lead to the control of insulin resistance and possibly the cure to type 2 diabetes in humans,” she added.

High iron levels, which are associated with insulin resistance in humans, have also been found in dolphins. The team found that dolphins with excessive iron levels also have high insulin levels, which could point to a more damaging form of diabetes. The discovery is significant because no other animal known, besides humans, has such an advanced form of type 2 diabetes.

There is some doubt as to how useful dolphins may be for diabetes research. Mark Simmonds, international head of science at the Whale and Dolphin Conservation Society, argued that ethical issues would arise from dolphins being used to study human diseases, and that they are too distantly related to us to be of much use.

“It is a grave concern that dolphins might be used in biomedical research. Dolphins are intelligent and sophisticated animals, vulnerable to stress and suffering when confined and removed from their natural environment,” he said.

Google Attacks Came From Chinese Schools: Report

Recent Internet attacks against Google’s email service have been pinpointed to two prominent schools in China, according to a story published by the New York Times late Thursday.

Security investigators followed the digital fingerprints which led them to computers at Shanghai Jiaotong University and Lanxiang Vocational School in China. The newspaper attributed the information to an unnamed source involved in the investigation.

Google revealed on January 12 that hackers stole computer code and tried to break into the accounts of human rights activists opposed to China’s policies. The attacks were also launched against more than 30 other online companies, security experts said. A flaw in Microsoft’s Internet Explorer Web browser is believed to have allowed the cybercriminals to infiltrate the system.

Due to the seriousness of the online assault, Google confronted the Chinese government about censorship rules that remove political and cultural material from search results that the country feels may be too sensitive. Google is threatening to shut down its China-based search engine and all of its offices in the country unless China loosens its restrictions on free speech.

As of Thursday, Google and the Chinese government are still discussing a possible compromise.

Speculation arose over the Chinese government’s involvement in the debacle when Google threatened to pull its online tool from China. Google has never accused the government of any involvement, only saying that they believed the attack originated from within China. The government has denied having any involvement in the attacks.

Google has been working with US intelligence agencies to investigate the attacks, which were described as bearing the marks of high-level espionage. Jill Hazelbaker, Google’s director of corporate communications, said that the company’s investigation is ongoing but otherwise declined to comment. According to the report, investigators believe there is evidence suggesting a link to a computer science class at the vocational school taught by a Ukrainian professor.

Both Chinese schools have well-regarded computer science programs. Jiaotong University’s computer science program is said to be the best in the country, and Lanxiang Vocational School has been known to train computer scientists for the Chinese military, the New York Times said.

SDO Destroys A Sundog

Last week, on Feb. 11th, the Solar Dynamics Observatory (SDO) lifted off from Cape Canaveral on a five-year mission to study the sun. Researchers have called the advanced spacecraft the “crown jewel” of NASA’s heliophysics fleet. SDO will beam back IMAX-quality images of solar explosions and peer beneath the stellar surface to see the sun’s magnetic dynamo in action.

SDO is designed to amaze, and it got off to a good start.

“The observatory did something amazing before it even left the atmosphere,” says SDO project scientist Dean Pesnell of the Goddard Space Flight Center.

Moments after launch, SDO’s Atlas V rocket flew past a sundog hanging suspended in the blue Florida sky and, with a rippling flurry of shock waves, destroyed it. Thirteen-year-old Anna Herbst recorded the video at NASA’s Banana River viewing site – and don’t forget to turn up the volume to hear the reaction of the crowd.

“I couldn’t believe my eyes,” says Anna. “The shock waves were so cool.” Anna traveled with classmate Amelia Phillips three thousand miles from Bishop, California, to witness the launch. “I’m so glad we came,” says Amelia. “I’ve never seen anything like it!”

Sundogs are formed by plate-shaped ice crystals in high, cold cirrus clouds. As the crystals drift down from the sky like leaves fluttering from trees, aerodynamic forces tend to align their broad faces parallel to the ground. When sunlight hits a patch of well-aligned crystals at just the right distance from the sun, voila! A sundog.
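
The “right distance” is set by refraction through the crystals’ 60-degree prism faces; the short calculation below, a standard optics estimate not taken from the article, reproduces the roughly 22-degree separation between a sundog and the sun.

```python
# Minimum deviation of sunlight through the 60-degree prism faces of a
# plate-shaped ice crystal: the standard optics estimate of how far from
# the sun a sundog appears (about 22 degrees). Illustrative, not from the article.

import math

n = 1.31                 # approximate refractive index of ice for visible light
A = math.radians(60.0)   # effective prism angle between alternate crystal side faces

# Minimum deviation for a prism of apex angle A and refractive index n.
D_min = 2 * math.asin(n * math.sin(A / 2)) - A
print(f"Sundog angle ≈ {math.degrees(D_min):.1f} degrees from the sun")  # ~21.8
```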

“When the Atlas V rocket penetrated the cirrus, shock waves rippled through the cloud and destroyed the alignment of the crystals,” explains atmospheric optics expert Les Cowley. “This extinguished the sundog.”

Videos by other photographers at Banana River show the shock waves particularly well. Here’s one from Romeo Durscher of Stanford, California, and another from Barbara Tomlinson of Beachton, Georgia.

In the past, says Cowley, there have been anecdotal reports of atmospheric disturbances destroying sundogs; for instance, “gunfire and meteor shock waves have been invoked to explain their disruption. But this is the first video I know of that shows the effect in action.”

The effect on the crowd was electric.

“When the sundog disappeared, we started screaming and jumping up and down,” says Pesnell. “SDO hit a home run: Perfect launch, rippling waves, and a disappearing sundog. You couldn’t ask for a better start for a mission.”

SDO is now in orbit. “The observatory is doing great as the post-launch checkout continues,” he reports. “We’ll spend much of the first month moving into our final orbit and then we’ll turn on the instruments. The first jaw-dropping images should be available sometime in April.”

Believe it or not, Pesnell says, the best is yet to come.

Author: Dr. Tony Phillips – Science @ NASA

Image 1: SDO has a close encounter with a sundog. Credit: Anna Herbst of Bishop, California.

Image 2: Sundogs are formed by the refracting action of plate-shaped ice crystals. Image credit: Les Cowley/Atmospheric Optics

Big Plant Seeds Don’t Always Beat Out Small Seeds

College of Biological Sciences researcher Helene Muller-Landau has developed a new theory explaining why some plant species produce a small number of large seeds while others produce a large number of small seeds.

Using mathematical modeling, Muller-Landau demonstrated that plant species with different seed sizes can coexist when regeneration sites vary in stressfulness. Species that produce large seeds (e.g., coconuts) have the advantage under stressful conditions — such as drought or shade — while species that produce large numbers of small seeds (e.g., figs) have the advantage in areas with adequate water and light.

The research was published in the Early Online edition of Proceedings of the National Academy of Sciences during this week of Feb. 15.

“The standard explanation has been that big seeds beat out small seeds under all conditions, but that’s not necessarily true,” Muller-Landau says.  “Big seeds have the advantage in stressful conditions and small seeds have the advantage when sun and water are abundant. It’s a trade-off between tolerance and fecundity.”

Muller-Landau’s “tolerance-fecundity model” explains why different plant species have different size seeds and may also provide insight into the variation of the number and size of offspring among animal species, she says. It also helps to explain why there’s so much diversity among species, a key finding that advances understanding of evolutionary biology.
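As a loose, hypothetical illustration of that trade-off (not Muller-Landau's published model), the sketch below assigns each species a stress tolerance and a seed output and lets them compete for randomly generated sites: large-seeded species capture the stressful sites only they can tolerate, while small-seeded species win most benign sites by sheer numbers, so both persist.

```python
# Loose illustration of a tolerance-fecundity trade-off (not the published model).
# Assumptions: a seed can establish only in sites whose stress is below its
# species' tolerance; within a site, recruitment is a lottery weighted by the
# number of viable seeds arriving. All numbers are arbitrary.

import random
random.seed(0)

SITES = 10_000
species = {
    # name: (stress tolerance, seeds dispersed per site on average)
    "large-seeded": (0.9, 1),
    "small-seeded": (0.3, 100),
}

wins = {name: 0 for name in species}
for _ in range(SITES):
    stress = random.random()                      # site stressfulness, 0..1
    # Species whose seeds can germinate at this site, with their seed counts.
    viable = {n: seeds for n, (tol, seeds) in species.items() if stress < tol}
    if not viable:
        continue                                  # site too harsh for everyone
    winner = random.choices(list(viable), weights=list(viable.values()))[0]
    wins[winner] += 1

print(wins)   # both species capture sites: large-seeded dominates the stressful ones
```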

As a staff scientist at the Smithsonian Tropical Research Institute and head of an international effort to quantify carbon in forests worldwide, Muller-Landau has visited forests in China, Malaysia, Ecuador and Panama, among other exotic destinations. Her experience has enabled her to observe a broad spectrum of plant species and the conditions under which they grow. This led her to question the prevailing theory about seed size.

Research in tropical biology has long focused on natural history and basic biology, as the bewildering diversity and complexity of these ecosystems has made them seem beyond the reach of quantitative ecological theory. In recent years, however, as larger datasets have accumulated and some general patterns have begun to emerge, mathematical models have increasingly been applied — and have provided important insights.

“This simple, elegant theory, so well grounded in sound natural history, is a considerable advance in our understanding of plant species and how they coexist,” said Egbert Leigh, of the Smithsonian Tropical Research Institute.

Financial support was provided by the HSBC Climate Partnership, a Packard Fellowship in Science and Engineering, the University of Minnesota and the U.S. National Science Foundation.

Orange Peels, Newspapers May Lead To Better Ethanol Fuel

Scientists may have just made the breakthrough of a lifetime, turning discarded fruit peels and other throwaways into cheap, clean fuel to power the world’s vehicles.

University of Central Florida professor Henry Daniell has developed a groundbreaking way to produce ethanol from waste products such as orange peels and newspapers. His approach is greener and less expensive than the current methods available to run vehicles on cleaner fuel, and his goal is to relegate gasoline to a secondary fuel.

Daniell’s breakthrough can be applied to several non-food products throughout the United States, including sugarcane, switchgrass and straw.

“This could be a turning point where vehicles could use this fuel as the norm for protecting our air and environment for future generations,” he said.

Daniell’s technique, developed with U.S. Department of Agriculture funding, uses plant-derived enzyme cocktails to break down orange peels and other waste materials into sugar, which is then fermented into ethanol.

Corn starch is currently fermented and converted into ethanol. But ethanol derived from corn produces more greenhouse gas emissions than gasoline does. Ethanol created using Daniell’s approach produces much lower greenhouse gas emissions than gasoline or electricity.

There’s also an abundance of waste products that could be used without reducing the world’s food supply or driving up food prices. In Florida alone, discarded orange peels could create about 200 million gallons of ethanol each year, Daniell said.

More research is needed before Daniell’s findings, published this month in the highly regarded Plant Biotechnology Journal, can move from his laboratory to the market. But other scientists conducting research in biofuels describe the early results as promising.

“Dr. Henry Daniell’s team’s success in producing a combination of several cell wall degrading enzymes in plants using chloroplast transgenesis is a great achievement,” said Mariam Sticklen, a professor of crop and soil sciences at Michigan State University. In 2008, she received international media attention for her research looking at an enzyme in a cow’s stomach that could help turn corn plants into fuel.

Daniell said no company in the world can produce cellulosic ethanol – ethanol that comes from wood or the non-edible parts of plants.

Depending on the waste product used, a specific combination or “cocktail” of more than 10 enzymes is needed to change the biomass into sugar and eventually ethanol. Orange peels need more of the pectinase enzyme, while wood waste requires more of the xylanase enzyme. All of the enzymes Daniell’s team uses are found in nature, created by a range of microbial species, including bacteria and fungi.

Daniell’s team cloned genes from wood-rotting fungi or bacteria and produced enzymes in tobacco plants. Producing these enzymes in tobacco instead of manufacturing synthetic versions could reduce the cost of production by a thousand times, which should significantly reduce the cost of making ethanol, Daniell said.

Tobacco was chosen as an ideal system for enzyme production for several reasons. It is not a food crop, it produces large amounts of energy per acre and an alternate use could potentially decrease its use for smoking.

Daniell’s team includes Dheeraj Verma, Anderson Kanagaraj, Shuangxia Jin, Nameirakpam Singh and Pappachan E. Kolattukudy in the Burnett School of Biomedical Sciences at UCF’s College of Medicine. Genes for the pectinase enzyme were cloned in Kolattukudy’s laboratory.

Daniell joined UCF’s Burnett School of Biomedical Sciences in 1998. His research led to the formation of the university’s first biotechnology company. Daniell became only the 14th American in the last 222 years to be elected to the Italian National Academy of Sciences, and he is a fellow of the American Association for the Advancement of Sciences.

Guidelines for Sedating Near-Death Patients At Home

Los Angeles, London, New Delhi, Singapore and Washington DC – Can patients near death safely receive sedation at home, fully respecting their own and their families’ wishes? This practice, which is on the rise, is coming under increasing scrutiny and debate by palliative care researchers and practitioners. Now palliative care specialists from a team based in Spain have documented their experiences and data, and developed a standard checklist to help other clinicians. Their research appears in the journal Palliative Medicine, published by SAGE.

Physicians use specific sedatives to relieve intolerable suffering as patients near death, a practice known as palliative sedation (PS). According to the literature, the rate of PS use in terminally ill patients varies widely, from 3% to 52% – a wide range given that the practice is considered ethically and legally acceptable for those with irreversible and advanced disease. This raises questions over whether the definition of PS, or its setting, could be behind these differences.

Despite a trend for PS in patients’ homes increasing in recent years, academics know very little about what kinds of sedation are administered, or who is receiving it, at home. Some fear that PS, particularly at home, may come to replace thorough assessment and treatment of patients’ physical symptoms, or of their psychological or spiritual distress. A set of standard guidelines offers one solution.

Alberto Alonso-Babarro from Hospital Universitario La Paz, Madrid led the study into home PS, which was conducted in Madrid by a palliative home care team (PHCT) composed of two physicians, two nurses, a nurse assistant, a part-time social worker, and an administrative clerk. The PHCT regularly follows up patients with progressive, incurable diseases with many symptoms who are referred by acute care hospitals, medical oncologists or family physicians.

Alonso-Babarro and his team retrospectively reviewed medical records from 370 patients, all of whom had been followed by a palliative home care team. They developed a decision-making and treatment checklist, which they used to assess how frequently PS was used for cancer patients dying at home, and how effective it was. A total of 245 patients (66%) died at home, and 125 patients (34%) died at a hospital or hospice.

Twenty-nine of the 245 patients who died at home (12%) received PS. Those who received it had a younger mean age (58) than those who did not (69), but no other differences were detected between these two patient groups. The most common reasons for using PS were delirium (62% of cases) and dyspnea (laboured breathing; 14%). The vast majority of patients were given the sedative drug midazolam for PS, with less than a tenth receiving levomepromazine, an anti-psychotic sedative used in Europe and Canada but not currently registered in the US.

On average, patients died 2.6 days after PS, and in almost half of cases the decision to use PS was taken with both the patient and his or her family. In other cases the family made the decision. Importantly, the authors concluded that using PS does not hasten death.

Other interesting findings were that at home, PS was used at a lower rate than in hospital (where 20-50% of palliative patients have PS). Hospitalised patients often have a greater symptom burden, or may be more agitated and so prone to delirium than in a home setting, the authors suggest.

There is also controversy in the palliative care literature around psycho-existential suffering, where cultural context appears to play a role. In particular, a multi-centre study found that patients in Spain had a higher rate of PS for this reason than in other countries. Alonso-Babarro suggests that lack of agreement on treatment between the patients and their families in Spain could be a significant factor in this distress. “Incorporating the patient’s wishes regarding PS in advanced directives or discussing these issues with patients prior to the final days of their lives may help avoid unnecessary patient and caregiver stress and burden,” he suggests.

“We concluded that palliative sedation may be used safely and efficaciously to treat dying cancer patients with refractory symptoms at home,” said Alonso-Babarro, who added: “To our knowledge, this is one of first studies addressing PS in the home setting to demonstrate the safety and efficacy of at-home PS administered by a PHCT.”

The checklist his team developed recommends beginning PS with midazolam followed by levomepromazine if midazolam proves ineffective. If both midazolam and levomepromazine fail, phenobarbital is the next option to consider. The team also recommends these medications should be injected.
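
Purely as an illustration of the ordering described above, and emphatically not as clinical guidance, the escalation sequence could be written out as follows; the drug list is taken from the article, while the function and its use are hypothetical.

```python
# Illustrative only: encodes the escalation order described in the article
# (midazolam, then levomepromazine, then phenobarbital). Not clinical guidance;
# deciding whether sedation is adequate is outside the scope of this sketch.

ESCALATION_ORDER = ["midazolam", "levomepromazine", "phenobarbital"]

def next_step(current_drug=None):
    """Return the next medication in the checklist's published sequence."""
    if current_drug is None:
        return ESCALATION_ORDER[0]
    idx = ESCALATION_ORDER.index(current_drug)
    return ESCALATION_ORDER[idx + 1] if idx + 1 < len(ESCALATION_ORDER) else None

print(next_step())               # midazolam
print(next_step("midazolam"))    # levomepromazine
```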

The team hope that their checklist will provide other researchers and clinicians with an easy-to-use decision aid and treatment tool to facilitate the PS process. Researchers will need to carry out further multi-centre prospective home-based studies to replicate their findings.

In some cases, PS may be the only way to achieve a peaceful death at home, thus ensuring that the wishes of the patients and their caregivers are respected.

On the Net:

SAGE Publications UK

Extreme Jets Take New Shape

Jets of particles streaming from black holes in far-away galaxies operate differently than previously thought, according to a study published today in Nature. The new study reveals that most of the jet’s light – gamma rays, the universe’s most energetic form of light – is created much farther from the black hole than expected and suggests a more complex shape for the jet.

The research was led by scientists at the Kavli Institute for Particle Astrophysics and Cosmology, jointly located at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University, with participation from scientists from around the world. The study included data from more than 20 telescopes including the Fermi Gamma-ray Space Telescope and KANATA telescope.

High above the flat Milky Way galaxy, bright galaxies called blazars dominate the gamma-ray sky, discrete spots on the dark backdrop of the universe. As nearby matter falls into the black hole at the center of a blazar, “feeding” the black hole, it sprays some of this energy back out into the universe as a jet of particles.

“As the universe’s biggest accelerators, blazar jets are important to understand,” said KIPAC Research Fellow Masaaki Hayashida, who serves as corresponding author on the paper with KIPAC Astrophysicist Greg Madejski. “But how they are produced and how they are structured is not well understood. We’re still looking to understand the basics.”

Researchers had previously theorized that such jets are held together by strong magnetic field tendrils, while the jet’s light is created by particles revolving around these wisp-thin magnetic field “lines.”

Yet, until now, the details have been relatively poorly understood. The recent study upsets the prevailing understanding of the jet’s structure, revealing new insight into these mysterious yet mighty beasts.

“This work is a significant step toward understanding the physics of these jets,” said KIPAC Director Roger Blandford. “It’s this type of observation that is going to make it possible for us to figure out their anatomy.”

Locating the Gamma Rays

Over a full year of observations, the researchers focused on one particular blazar jet, located in the constellation Virgo, monitoring it in many different wavelengths of light: gamma-ray, X-ray, optical, infrared and radio. Blazars continuously flicker, and researchers expected continual changes in all types of light. Midway through the year, however, researchers observed a spectacular change in the jet’s optical and gamma-ray emission: a 20-day-long flare in gamma rays was accompanied by a dramatic change in the jet’s optical light.

Although most optical light is unpolarized – consisting of light rays with an equal mix of all polarizations or directionality – the extreme bending of energetic particles around a magnetic field line can polarize light. During the 20-day gamma-ray flare, optical light streaming from the jet changed its polarization. This temporal connection between changes in the gamma-ray light and changes in the optical light suggests that both types of light are created in the same geographical region of the jet; during those 20 days, something in the local environment altered to cause both the optical and gamma-ray light to vary.

“We have a fairly good idea of where in the jet optical light is created; now that we know the gamma rays and optical light are created in the same place, we can for the first time determine where the gamma rays come from,” said Hayashida.

This knowledge has far-reaching implications about how energy escapes a black hole. The great majority of energy released in a jet escapes in the form of gamma rays, and researchers previously thought that all of this energy must be released near the black hole, close to where the matter flowing into the black hole gives up its energy in the first place. Yet the new results suggest that, like optical light, the gamma rays are emitted relatively far from the black hole. This, Hayashida and Madejski said, in turn suggests that the magnetic field lines must somehow help the energy travel far from the black hole before it is released in the form of gamma rays.

“What we found was very different from what we were expecting,” said Madejski. “The data suggest that gamma rays are produced not one or two light days from the black hole [as was expected] but closer to one light year. That’s surprising.”
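
For a sense of scale, the quick conversion below (an illustrative aside, not part of the study) compares the light-day and light-year distances quoted above using the standard speed of light.

```python
# Scale comparison for the distances quoted above: a light day versus one
# light year. Uses the standard speed of light; purely illustrative.

C = 299_792_458            # speed of light, m/s
DAY = 86_400               # seconds in a day
YEAR = 365.25 * DAY        # seconds in a (Julian) year

light_day = C * DAY        # ~2.6e13 m
light_year = C * YEAR      # ~9.5e15 m

print(f"1 light day  ≈ {light_day:.2e} m")
print(f"1 light year ≈ {light_year:.2e} m ({light_year / light_day:.0f} light days)")
```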

Rethinking Jet Structure

In addition to revealing where in the jet light is produced, the gradual change of the optical light’s polarization also reveals something unexpected about the overall shape of the jet: the jet appears to curve as it travels away from the black hole.

“At one point during a gamma-ray flare, the polarization rotated about 180 degrees as the intensity of the light changed,” said Hayashida. “This suggests that the whole jet curves.”

This new understanding of the inner workings and construction of a blazar jet requires a new working model of the jet’s structure, one in which the jet curves dramatically and the most energetic light originates far from the black hole. This, Madejski said, is where theorists come in. “Our study poses a very important challenge to theorists: how would you construct a jet that could potentially be carrying energy so far from the black hole? And how could we then detect that? Taking the magnetic field lines into account is not simple. Related calculations are difficult to do analytically, and must be solved with extremely complex numerical schemes.”

Theorist Jonathan McKinney, a Stanford University Einstein Fellow and expert on the formation of magnetized jets, agrees that the results pose as many questions as they answer. “There’s been a long-time controversy about these jets – about exactly where the gamma-ray emission is coming from. This work constrains the types of jet models that are possible,” said McKinney, who was not involved in the recent study. “From a theoretician’s point of view, I’m excited because it means we need to rethink our models.”

As theorists consider how the new observations fit models of how jets work, Hayashida, Madejski and other members of the research team will continue to gather more data. “There’s a clear need to conduct such observations across all types of light to understand this better,” said Madejski. “It takes a massive amount of coordination to accomplish this type of study, which included more than 250 scientists and data from about 20 telescopes. But it’s worth it.”

With this and future multi-wavelength studies, theorists will have new insight with which to craft models of how the universe’s biggest accelerators work.

The gamma-ray observations used in this study were made by the Large Area Telescope on board the Fermi Gamma-ray Space Telescope, an astrophysics and particle physics partnership developed by NASA in collaboration with the U.S. Department of Energy Office of Science, along with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden, and the United States. LAT collaboration members were key participants in the development of this research. SLAC National Accelerator Laboratory managed construction of the LAT and now plays the central role in science operations, data processing and making scientific data available to collaborators for analysis.

The optical polarization data that played a crucial role in this study was taken by the KANATA collaboration, using the KANATA telescope located in Higashihiroshima, Japan. The KANATA telescope is operated by Hiroshima University.

The GASP-WEBT observatories participating in this work are Abastumani, Calar Alto, Campo Imperatore, Crimean, Kitt Peak (MDM), L’Ampolla, Lowell (Perkins-PRISM), Lulin, Roque de los Muchachos (KVA and Liverpool), San Pedro Mártir, St Petersburg for the optical–NIR bands, and Mauna Kea (SMA), Medicina, Metsahovi, Noto and UMRAO for the millimeter radio band.

The campaign also included data from NASA satellites Swift and the ROSSI X-ray Timing Explorer, and the Japanese satellite Suzaku.

Image Caption: Recent observations of blazar jets require researchers to look deeper into whether current theories about jet formation and motion require refinement. This simulation, courtesy of Jonathan McKinney (KIPAC), shows a black hole pulling in nearby matter (yellow) and spraying energy back out into the universe in a jet (blue and red) that is held together by magnetic field lines (green).

On the Net:

Scientists Create the Hottest Temperature in the Universe Here on Earth

Two University of Colorado at Boulder physicists are part of a collaborative team working with the U.S. Department of Energy’s Brookhaven National Laboratory in New York that has created the hottest matter ever measured in the universe, at 7.2 trillion degrees Fahrenheit.

The team used Brookhaven’s giant atom smasher, the Relativistic Heavy Ion Collider, or RHIC, to ram charged gold particles into each other billions of times, creating a “quark-gluon plasma” with a temperature hotter than anything known in the universe, even supernova explosions. The experiment is recreating the conditions of the universe a few microseconds after the Big Bang.

CU-Boulder physics department Professors Jamie Nagle and Edward Kinney are collaborators on the Pioneering High Energy Nuclear Interaction eXperiment, or PHENIX, one of four large detectors that helps physicists analyze the particle collisions using RHIC. PHENIX, which weighs 4,000 tons and has a dozen detector subsystems, sports three large steel magnets that produce high magnetic fields to bend charged particles along curved paths.

RHIC is the only machine in the world capable of colliding so-called “heavy ions,” atoms that have had their outer cloud of electrons stripped away. The research team used gold, one of the heaviest elements, for the experiment. The gold atoms were sent flying in opposite directions in RHIC, a 2.4-mile underground loop located in Upton, New York. The collisions melted protons and neutrons and liberated subatomic particles known as quarks and gluons.

“It is very exciting that scientists at the University of Colorado are world leaders in laboratory studies of both the coldest atomic matter and now the hottest nuclear matter in the universe,” said Nagle, deputy spokesperson for the 500-person PHENIX team.

In 1995, CU-Boulder Distinguished Professor Carl Wieman and Adjoint Professor Eric Cornell of the physics department led a team of physicists that created the world’s first Bose-Einstein condensate, a new form of matter. Both Wieman and Cornell are fellows of JILA, a joint institute of CU-Boulder and the National Institute of Standards and Technology, where Cornell is also a fellow. The physicists, who shared the Nobel Prize in physics for their work in 2001, achieved the lowest temperature ever recorded at the time by cooling rubidium atoms to less than 170 billionths of a degree above absolute zero, causing individual atoms to form a “superatom” that behaved as a single entity.

The new experiments with RHIC produced a temperature 250,000 times hotter than the sun’s interior. The collisions created minuscule bubbles heated to temperatures 40 times hotter than the interior of a supernova. By studying the “soup” of subatomic particles created by RHIC, researchers hope to gain insight into what occurred in the first microseconds after the Big Bang some 13.7 billion years ago, said Kinney.
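
As a rough back-of-the-envelope check of that comparison, the Python sketch below converts the quoted temperature to kelvin and divides by a commonly cited textbook value of about 15 million kelvin for the solar core; the solar-core figure is an assumption, not a number from the study.

# Back-of-the-envelope check of the "250,000 times hotter than the sun's interior" comparison.
quark_gluon_plasma_f = 7.2e12                                        # degrees Fahrenheit, quoted above
quark_gluon_plasma_k = (quark_gluon_plasma_f - 32) * 5 / 9 + 273.15  # convert to kelvin

solar_core_k = 1.5e7                                                 # assumed ~15 million kelvin

print(f"RHIC plasma / solar core ~ {quark_gluon_plasma_k / solar_core_k:,.0f}x")
# -> on the order of 270,000x, in line with the rough figure quoted above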

Later this year, physicists, including a team from CU-Boulder, hope to use the Large Hadron Collider in Switzerland to ram ions together, creating even hotter temperatures and replicating even earlier conditions following the Big Bang.

On the Net:

University of Colorado Boulder

Menstrual Cramps And Acupuncture

Researchers say an extensive review of past studies has found that acupuncture may be helpful in alleviating menstrual cramps, which affect up to half of all young women, Reuters reported.

A team from the Oriental Hospital at Kyung Hee University Medical Center in South Korea conducted a review of 27 studies that involved nearly 3,000 women and found that acupuncture may be more effective than drugs or herbal medicines.

The researchers said there is convincing evidence for the effectiveness of acupuncture in treating pain, as it stimulates the production of endorphins and serotonin in the central nervous system.

Endorphins are compounds produced naturally by the human body during exercise and excitement and they result in a feeling of well-being. Serotonin is a brain chemical.

The study, published in the latest issue of the Journal of Obstetrics and Gynecology, said acupuncture was associated with a significant reduction in pain, compared with pharmacological treatment or herbal medicine.

Acupuncture is also cited as a possibly effective way of dealing with menstrual cramps, according to the U.S. National Institutes of Health.

It is unknown exactly what causes menstrual cramps, and for some women, the pain — accompanied by bloating, nausea, vomiting, diarrhea, dizziness and headache — can become more severe or may last longer as they grow older.

Around 10 percent of younger women have menstrual pain severe enough that they cannot go to work, resulting in billions of dollars in lost wages and productivity annually.

Exercise, painkillers and applying heat to the lower abdomen are a few common treatments, but acupuncture has also become the subject of discussion and investigation.

The researchers did, however, note several flaws in the methodology of some studies and called for more clinical trials to be done.

The Chinese have used acupuncture as a form of anesthesia for at least 2,600 years and experts believe it can clear blockages in circulation.

Many doctors trained in western medicine are turning to acupuncture for their patients as a complementary treatment to help relieve pain.

Traditional acupuncturists insert needles in acupuncture points located along what they describe as “energy meridians” – a concept for which many scientists say there is no evidence.

On the Net:

Green Tea May Help Fight Glaucoma, Other Eye Diseases

Scientists have confirmed that the healthful substances found in green tea, renowned for their powerful antioxidant and disease-fighting properties, do penetrate into tissues of the eye. Their new report, the first documenting how the lens, retina, and other eye tissues absorb these substances, raises the possibility that green tea may protect against glaucoma and other common eye diseases. It appears in ACS’s bi-weekly Journal of Agricultural and Food Chemistry.

Chi Pui Pang and colleagues point out that so-called green tea “catechins” have been among a number of antioxidants thought capable of protecting the eye. Those include vitamin C, vitamin E, lutein, and zeaxanthin. Until now, however, nobody knew if the catechins in green tea actually passed from the stomach and gastrointestinal tract into the tissues of the eye.

Pang and his colleagues resolved that uncertainty in experiments with laboratory rats that drank green tea. Analysis of eye tissues showed beyond a doubt that eye structures absorbed significant amounts of individual catechins. The retina, for example, absorbed the highest levels of gallocatechin, while the aqueous humor tended to absorb epigallocatechin. The effects of green tea catechins in reducing harmful oxidative stress in the eye lasted for up to 20 hours. “Our results indicate that green tea consumption could benefit the eye against oxidative stress,” the report concludes.

Image Caption: Green tea contains healthful substances that can penetrate eye tissues, raising the possibility that the tea may protect against glaucoma and other eye diseases. Credit: iStock

On the Net:

Heart Failure Worse When Right Ventricle Goes Bad

New research from the University of Alabama at Birmingham (UAB) suggests that the ability of the right side of the heart to pump blood may indicate the risk of death for heart-failure patients whose condition is caused by low function of the left side of the heart.

The fraction of blood that each of the heart’s two pumping chambers, the left and right ventricles, ejects with each beat is described as the ejection fraction. Healthy individuals typically have ejection fractions between 50 and 65 percent in both chambers.
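
For readers unfamiliar with the term, ejection fraction is computed from two ventricular volumes with the standard formula EF = (end-diastolic volume - end-systolic volume) / end-diastolic volume; the minimal Python sketch below uses made-up volumes for illustration, not patient data from the study.

# Minimal sketch of an ejection-fraction calculation (standard formula,
# illustrative volumes only).
def ejection_fraction(end_diastolic_ml: float, end_systolic_ml: float) -> float:
    """Return the ejection fraction as a percentage."""
    stroke_volume = end_diastolic_ml - end_systolic_ml
    return 100.0 * stroke_volume / end_diastolic_ml

# A ventricle that fills to 120 mL and empties to 50 mL ejects about 58 percent
# of its blood per beat, within the healthy 50-to-65-percent range cited above.
print(f"EF = {ejection_fraction(120, 50):.0f}%")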

In findings reported in January in Circulation, a journal of the American Heart Association, researchers at UAB say that low right-ventricular ejection fraction (RVEF) increased the risk of death in patients with systolic heart failure – heart failure associated with low left-ventricular ejection fraction.

“The role of the right ventricle in chronic systolic heart failure has been overlooked for many years, in part because it was considered to be merely a passive chamber,” said Ali Ahmed, M.D., M.P.H., associate professor of medicine in the Division of Cardiovascular Disease and the senior author of the study. “Studies of the effect of RVEF on outcomes in heart failure have been limited by small sample size and short follow-up.”

Using data from 2,008 patients with advanced chronic systolic heart failure in the Beta-Blocker Evaluation of Survival Trial sponsored by the National Heart, Lung and Blood Institute, one of the National Institutes of Health, the UAB team discovered that death rates went up as RVEF went down.

Patients with an RVEF greater than 40 percent had a death rate of 27 percent during the two-year study. But when the RVEF dropped to less than 20 percent, the death rate increased to nearly half of the patients, or 47 percent.

“Our study suggests that RVEF is a marker of poor prognosis in patients with heart failure and should be routinely measured to better identify these at-risk patients and provide appropriate therapy for them,” said Ahmed. “Future studies need to determine the risk factors for RVEF impairment and to develop and test interventions that may improve outcomes in heart failure patients with low RVEF.”

The study was conducted by Philippe Meyer, M.D., of the University Hospital of Geneva, Switzerland, while he was a research fellow at the Montreal Heart Institute under the mentorship of UAB’s Ahmed and of Michel White, M.D., a cardiologist at the Montreal Heart Institute. The rest of the UAB team was Mustafa I. Ahmed, M.D., Ami E. Iskandrian, M.D., Vera Bittner, M.D., M.S.P.H., Gilbert J. Perry, M.D., Inmaculada B. Aban, Ph.D., Marjan Mujib, M.B.B.S., M.P.H., and Louis J. Dell’Italia, M.D., along with Gerasimos S. Filippatos, M.D., Ph.D., of the University of Athens, Greece.

On the Net:

Making A Better Medical Safety Checklist

Interest in checklists grows, but they’re no magic wand

In the wake of Johns Hopkins’ success in virtually eliminating intensive-care unit bloodstream infections via a simple five-step checklist, the safety scientist who developed and popularized the tool warns medical colleagues that checklists are no panacea.

“Checklists are useful, but they’re not Harry Potter’s wand,” says Peter Pronovost, M.D., Ph.D., a professor of anesthesiology and critical care medicine at Johns Hopkins University School of Medicine and a patient safety expert. “The science needed to best develop focused, unambiguous and succinct checklists for medicine’s thousands of diagnoses and procedures is in its infancy, and there can be unintended consequences of reliance on simple tools.”

In a review by Pronovost and other Johns Hopkins researchers recently published in the journal Critical Care, the authors say it’s clear that use of aviation-like safety checklists based on scientific evidence can work, and that more hospitals should use them to help prevent errors and reduce costs associated with medical mistakes.

But says Pronovost, whose eponymous checklist is credited with preventing thousands of central-line infections at Hopkins, throughout the state of Michigan and elsewhere, they need to be accompanied by a “change in the culture of arrogance still widespread in medical care.”

Culture change, he says, means insisting, for example, that nurses be empowered to question doctors who don’t follow the steps properly and that every single member of the health care team toss out long-held beliefs that infections are an inevitable cost of being in the hospital.

“Just having a checklist on a piece of paper isn’t going to be enough,” he says.

In the Critical Care review, Pronovost and his colleagues took a step back and applied a rigorous scientific analysis of checklists, looking especially for which ones have the potential to work best in varying situations.

For example, some checklists are like grocery lists, a basic catalog of what needs to be accomplished by just one person in order for a process or procedure to be completed properly. In an operating room, the anesthesiologist has a checklist that helps her confirm that every step has been followed and the anesthesia machine is working properly before a patient is put under.

“But that sort of checklist doesn’t work in all cases,” Pronovost says. “Central-line infection checklists work best, for example, when there is what we call a challenge and response, in which one person reads a series of items and a second person verifies that each item has been completed. With the check and balance of another person, the list is more likely to be completed properly.”

Pronovost also warns of checklist overload. “Creating too many checklists – especially those that are not proven to improve patient safety – or using checklists where they are not truly needed can be distracting and time-consuming,” he says, “and over-reliance on them can lead to a false sense of safety.”

“Each step in the diagnosis, treatment and monitoring process poses risks for error that we need to defend against,” the Johns Hopkins researcher says. “We do not know how many checklists are too many, when they are most useful, when we have overloaded the checklist users or how strictly the benefits are being measured.”

In fact, the Johns Hopkins team says, the underuse of checklists that do work is a problem in part caused by the paucity of scholarly research on how best to use them, how to build and implement them, how to measure their effectiveness in improving patient outcomes, and how they can best be sustained in a culture that is slow to change.

Pronovost’s central-line safety checklist was created after reviewing the literature and guidelines on how best to prevent bloodstream infections in ICUs and selecting the five steps that evidence showed were most likely to accomplish that goal. The checklist was piloted in a small setting (one ICU at The Johns Hopkins Hospital) before undergoing a test on a larger scale (the state of Michigan’s ICUs). After the work was published in the New England Journal of Medicine, he got calls not only from doctors asking him to design checklists for them, but from CEOs, financial-industry executives and even a man who wanted a checklist for sailing a boat.

While standardization is at the heart of any checklist, Pronovost says checklists need to be continually assessed to be sure they are still accomplishing their goals – in this case, keeping bloodstream infection rates near zero. It is important not only to be able to tell patients that the checklist is being used, but to be able to answer the bigger question: Am I safe in the hospital?

“There’s a lot more research to do and a lot of work to be done,” Pronovost says.

Other Johns Hopkins researchers on the paper include Bradford D. Winters, M.D., Ph.D.; Ayse P. Gurses, Ph.D.; Harold Lehmann, M.D., Ph.D.; and J. Bryan Sexton, Ph.D, M.A.

On the Net:

What Is Crippling Food Production In Africa?

Despite good intentions, the push to privatize government functions and insistence upon “free trade” that is too often unfair has caused declining food production, increased poverty and a hunger crisis for millions of people in many African nations, researchers conclude in a new study.

Market reforms that began in the mid-1980s and were supposed to aid economic growth have actually backfired in some of the poorest nations in the world, and in recent years have led to multiple food riots, scientists report today in Proceedings of the National Academy of Sciences, a professional journal.

“Many of these reforms were designed to make countries more efficient, and seen as a solution to failing schools, hospitals and other infrastructure,” said Laurence Becker, an associate professor of geosciences at Oregon State University. “But they sometimes eliminated critical support systems for poor farmers who had no car, no land security, made $1 a day and had their life savings of $600 hidden under a mattress.

“These people were then asked to compete with some of the most efficient agricultural systems in the world, and they simply couldn’t do it,” Becker said. “With tariff barriers removed, less expensive imported food flooded into countries, some of which at one point were nearly self-sufficient in agriculture. Many people quit farming and abandoned systems that had worked in their cultures for centuries.”

These forces have undercut food production for 25 years, the researchers concluded. They came to a head in early 2008, when the price of rice – a staple in several African nations – doubled in one year for consumers who spent much of their income on food alone. Food riots and political and economic disruption ensued.

The study was done by researchers from OSU, the University of California at Los Angeles and Macalester College. It was based on household and market surveys and national production data.

There are no simple or obvious solutions, Becker said, but developed nations and organizations such as the World Bank or International Monetary Fund need to better recognize that approaches which can be effective in more advanced economies don’t readily translate to less developed nations.

“We don’t suggest that all local producers, such as small farmers, live in some false economy that’s cut off from the rest of the world,” Becker said.

“But at the same time, we have to understand these are often people with little formal education, no extension systems or bank accounts, often no cars or roads,” he said. “They can farm land and provide both food and jobs in their countries, but sometimes they need a little help, in forms that will work for them. Some good seeds, good advice, a little fertilizer, a local market for their products.”

Many people in African nations, Becker said, farm local land communally, as they have been doing for generations, without title to it or expensive equipment, and have developed systems that may not be advanced but are functional. They are often not prepared to compete with multinational corporations or sophisticated trade systems. The loss of local agricultural production puts them at the mercy of sudden spikes in food costs around the world. And some of the farmers they compete with in the U.S., East Asia and other nations receive crop supports or subsidies of various types, while they are told they must embrace completely free trade with no assistance.

“A truly free market does not exist in this world,” Becker said. “We don’t have one, but we tell hungry people in Africa that they are supposed to.”

This research examined problems in Gambia and Cote d’Ivoire in Western Africa, where problems of this nature have been severe in recent years. It also looked at conditions in Mali, which by contrast has been better able to sustain local food production – because of better roads, a location that makes imported rice more expensive, a cultural commitment to local products and other factors.

Historically corrupt governments continue to be a problem, the researchers said.

“In many African nations people think of the government as looters, not as helpers or protectors of rights,” Becker said. “But despite that, we have to achieve a better balance in governments providing some minimal supports to help local agriculture survive.”

An emphasis that began in the 1980s on wider responsibilities for the private sector, the report said, worked to an extent so long as prices for food imports, especially rice, remained cheap. But it steadily caused higher unemployment and an erosion in local food production, which in 2007-08 exploded in a global food crisis, street riots and violence. The sophisticated techniques and cash-crop emphasis of the “Green Revolution” may have caused more harm than help in many locations, the study concluded.

Another issue, they said, was an “urban bias” in government assistance programs, where the few support systems in place were far more oriented to the needs of city dwellers than to those of their rural counterparts.

Potential solutions, the researchers concluded, include more diversity of local crops, appropriate tariff barriers to give local producers a reasonable chance, subsidies where appropriate, and the credit systems, road networks, and local mills necessary to process local crops and get them to local markets.

Image 1: A worker in Cote d’Ivoire in West Africa harvests locally grown rice. (Photo courtesy of Oregon State University)

Image 2: A small rice mill in Cote d’Ivoire, West Africa, offers possible job opportunities for local residents, waiting here in hope of getting work operating pushcarts. (Photo courtesy of Oregon State University)

Image 3: A worker in Cote d’Ivoire finds work removing the husk from locally produced rice using old-fashioned but functional mortar-and-pestle techniques. (Photo courtesy of Oregon State University)

On the Net:

Montserrat Battles Ash Cloud

Many flights around the Caribbean remained cancelled Saturday due to clouds of ash spewed into the skies from the island of Montserrat.

The ash forced LIAT, the region’s biggest airline, to temporarily suspend flights in and out of Antigua’s V.C. Bird International airport.

Montserrat’s Soufriere Hills volcano erupted Thursday, sending a plume of ash 10 kilometers high into the sky, the Montserrat Volcano Observatory said.

“The continued ash hanging in the atmosphere presents a risk to planes and to the security of passengers,” the police in the nearby island of Guadeloupe said in a statement.

“It makes you sneeze a lot,” Gregory Willock, the president of the Montserrat Cricket Association, told AFP from the nearby island of Antigua.

The ash is also creating difficult driving conditions.

“Visibility was extremely poor. Coming off a hill you don’t see what’s below. I had to switch (my lights) to low beam all the time because high beam confused me even more,” a local said.

Meteorological officials described the ash fall as “quite bad.”

“All I know is it can create problems for people who have sinus problems,” Willock added, saying many residents had taken to wearing masks while schools and government offices have been operating as normal.

Montserrat is about 35 kilometers south of Antigua and Barbuda, but residents on the southern side of Antigua reported that heavy ash had fallen on their homes and made driving conditions difficult at times.

Thursday’s eruption came almost 15 years after the volcano, which had lain dormant throughout recorded history, first rumbled into life in July 1995.

Senior forecaster Lorne Salmon says weather conditions have a lot to do with how thickly the ash falls.

Cooling meant the air had become heavier, sinking into valleys and low-lying areas.

“When this happens a lot of the ash becomes more concentrated,” she said.

On the Net:

17,000 Total U.S. H1N1 Deaths Reported

The H1N1 virus, also known as swine flu, may have killed as many as 17,000 Americans, according to new estimates by the Centers for Disease Control and Prevention (CDC) in Atlanta.

Though 2,498 confirmed deaths linked to the H1N1 virus had been reported to the CDC as of January 30, the agency estimates that between 8,330 and 17,160 people actually have died from H1N1.

The overwhelming majority of the people who died — between 6,390 and 13,170 — were 18 to 64 years old, the CDC estimates. Between 880 and 1,810 children 17 years old and younger also died from this flu, according to CDC estimates.
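
The gap between the laboratory-confirmed count and the modeled estimates implies a sizable underreporting factor; the small Python sketch below works it out from the figures quoted above (an illustrative calculation, not part of the CDC's own methodology).

# Underreporting multiplier implied by the figures quoted above.
confirmed_deaths = 2_498
estimated_low, estimated_high = 8_330, 17_160

low_factor = estimated_low / confirmed_deaths
high_factor = estimated_high / confirmed_deaths
print(f"Estimated deaths are roughly {low_factor:.1f}x to {high_factor:.1f}x "
      f"the laboratory-confirmed count")
# -> roughly 3.3x to 6.9x, which is why confirmed counts understate the toll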

In comparison, the CDC says that in a regular flu season, about 36,000 people in the United States die from seasonal flu, with 90 percent of the deaths usually occurring in people age 65 and older.

Since this new flu virus emerged in April 2009, health officials have recognized that the reported numbers of people who have been hospitalized and died provide only a partial picture of the full outbreak. Underreporting of influenza cases and deaths is common, especially in the early weeks or months of an outbreak.

By mid-July last year, the World Health Organization (WHO) no longer recommended that countries with known H1N1 transmission test everybody who had flulike symptoms, as laboratories were being overwhelmed. Patients were to be diagnosed based on symptoms alone, since nearly all of the circulating flu strains were H1N1.

Only severe cases of flu, which lead to hospitalization or death, were to be tested. The CDC also adopted these recommendations for the United States.

Despite the recommendation that deaths suspected to be from H1N1 be tested, the CDC is aware that many are not, and in many cases H1N1 might not have been suspected.

The new CDC estimates are based on laboratory-confirmed cases, flu surveillance data and mathematical modeling, CDC spokesman Richard Quartarone told CNN.

According to the report released Friday, the CDC estimates that an average of 57 million people have been infected with H1N1 and that an average of 257,000 cases have resulted in hospitalization.

Health officials continue to urge people who haven’t received a vaccination to do so.

“The real tragedy is that people are still getting sick and we have a vaccine that will help prevent illness,” Quartarone said.

On the Net:

Heel First More Efficient For Walking

Walking heels-first is less work than walking on your toes or balls of the feet

Humans, other great apes and bears are among the few animals that step first on the heel when walking, and then roll onto the ball of the foot and toes. Now, a University of Utah study shows the advantage: Compared with heel-first walking, it takes 53 percent more energy to walk on the balls of your feet, and 83 percent more energy to walk on your toes.

“Our heel touches the ground at the start of each step. In most mammals, the heel remains elevated during walking and running,” says biology Professor David Carrier, senior author of the new study being published online Friday, Feb. 12 and in the March 1 print issue of The Journal of Experimental Biology.

“Most mammals – dogs, cats, raccoons – walk and run around on the balls of their feet. Ungulates like horses and deer run and walk on their tiptoes,” he adds. “Few species land on their heel: bears and humans and other great apes – chimps, gorillas, orangutans.”

“Our study shows that the heel-down posture increases the economy of walking but not the economy of running,” says Carrier. “You consume more energy when you walk on the balls of your feet or your toes than when you walk heels first.”

Economical walking would have helped early human hunter-gatherers find food, he says. Yet because other great apes are also heel-first walkers, the trait must have evolved before our common ancestors descended from the trees, he adds.

“We [human ancestors] had this foot posture when we were up in the trees,” Carrier says. “Heel-first walking was there in the great apes, but great apes don’t walk long distances. So economy of walking probably doesn’t explain this foot posture [and why it evolved], even though it helps us to walk economically.”

Carrier speculates that a heel-first foot posture “may be advantageous during fighting by increasing stability and applying more torque to the ground to twist, push and shove. And it increases agility in rapid turning maneuvers during aggressive encounters.”

 The study concludes: “Relative to other mammals, humans are economical walkers but not economical runners. Given the great distances hunter-gatherers travel, it is not surprising that humans retained a foot posture, inherited from our more arboreal [tree-dwelling] great ape ancestors, that facilitates economical walking.”

Measuring the Costs of Different Modes of Walking and Running

Carrier conducted the study with Christopher Cunningham, a doctoral student in biology at the University of Utah; Nadja Schilling, a zoologist at Friedrich Schiller University of Jena, Germany; and Christoph Anders, a physician at University Hospital Jena. The study was funded by the National Science Foundation, Friedrich Schiller University of Jena and a German food industry insurance group interested in back pain.

The study involved 27 volunteers, mostly athletes in their 20s, 30s and 40s. Each subject walked or ran three different ways, with each step either heel-first, ball-of-foot first with the heel a bit elevated or toes first with the heel even more elevated.

In his lab, Carrier and colleagues measured oxygen consumption – and thus energy use – as 11 volunteers wore face masks while walking or running on a treadmill. They also walked on a “force plate” to measure forces exerted on the ground.
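
Oxygen consumption is converted to energy use with a standard physiological approximation of roughly 20.1 kilojoules per liter of oxygen consumed; the Python sketch below shows the arithmetic, using a made-up oxygen-uptake value rather than any measurement from the study.

# Illustrative conversion from oxygen uptake to metabolic power.
KJ_PER_LITER_O2 = 20.1   # standard approximation; varies slightly with diet

def metabolic_power_watts(vo2_liters_per_min: float) -> float:
    """Convert an oxygen uptake rate (liters of O2 per minute) to watts."""
    return vo2_liters_per_min * KJ_PER_LITER_O2 * 1000.0 / 60.0

vo2 = 1.0   # hypothetical oxygen uptake while walking, L/min
print(f"~{metabolic_power_watts(vo2):.0f} W at {vo2} L O2/min")
# A 53 or 83 percent rise in energy use would scale this figure by 1.53 or 1.83.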

Part of the study was conducted at Anders’ lab in Germany, where 16 people walked or ran on a treadmill as scientists monitored activity of muscles that help the ankles, knees, hips and back do work during walking and running.

Findings of the experiments included:

* “You consume more energy when you walk on the balls of your feet or your toes than when you walk heels-first,” Carrier says. Compared with heels-first walkers, those stepping first on the balls of their feet used 53 percent more energy, and those stepping toes-first expended 83 percent more energy.

* “The activity of the major muscles of the ankle, knee, hip and back all increase if you walk on the balls of your feet or your toes as opposed to landing on your heels,” says Carrier. “That tells us the muscles increase the amount of work they are producing if you walk on the balls of your feet.”

* “When we walk on the balls of our feet, we take shorter, more frequent strides,” Carrier says. “But this did not make walking less economical.” Putting the heel down first and pivoting onto the ball of the foot makes the stride longer because the full length of the foot is added to the length of the step. But that has no effect on energy use.

* The researchers wondered if stepping first on the balls of the feet took more energy than walking heel-first because people are less stable on their toes or balls of the feet. But increased stability did not explain why heel-first walking uses less energy.

* Stepping heel-first reduced the up-and-down motion of the body’s center of mass during walking and required less work by the hips, knees and ankles. Stepping first onto the balls of the feet slows the body more and requires more re-acceleration.

* Heels-first steps also made walking more economical by increasing the transfer of movement or “kinetic” energy to stored or “potential” energy and back again. As a person starts to step forward and downward, stored energy is changed to motion or kinetic energy. Then, as weight shifts onto the foot and the person moves forward and upward, their speed slows, so the kinetic energy of motion is converted back into stored or potential energy. The study found that stepping first onto the balls of the feet made this energy exchange less efficient than walking heels-first.

* Heel-first walking also reduced the “ground reaction force moment” at the ankle. That means stepping first onto the ball of the foot “decreases the leverage, decreases the mechanical advantage” compared with walking heel-first, Carrier says.

In sum, walking heel-first is not more economical because it is more stable or involves fewer, longer strides, but because when we land on our heels, less energy is lost to the ground, we have more leverage, and kinetic and potential energy are converted more efficiently.
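
The energy exchange described above can be pictured with a toy inverted-pendulum calculation; the mass, speeds and rise in the Python sketch below are invented for illustration and do not come from the study.

# Toy inverted-pendulum exchange: as the body vaults over the stance leg,
# kinetic energy is traded for potential energy and back. All numbers are
# invented for illustration.
G = 9.81            # gravitational acceleration, m/s^2
mass = 70.0         # kg, hypothetical walker
v_fast, v_slow = 1.4, 1.2   # m/s, center-of-mass speed at the low and high points
rise = 0.03         # m, hypothetical vertical rise of the center of mass

ke_lost = 0.5 * mass * (v_fast**2 - v_slow**2)
pe_gained = mass * G * rise
exchange = min(ke_lost, pe_gained) / max(ke_lost, pe_gained)

print(f"KE lost {ke_lost:.1f} J, PE gained {pe_gained:.1f} J, exchange ratio {exchange:.0%}")
# The closer the exchange ratio is to 100%, the less work the muscles must add;
# landing on the ball of the foot makes this exchange less complete.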

Form and Function of the Foot

If heel-first walking is so economical, why do so many animals walk other ways?

“They are adapted for running,” Carrier says. “They’ve compromised their economy of walking for the economy of running.”

“Humans are very good at running long distances. We are physiologically and anatomically specialized for running long distances. But the anatomy of our feet is not consistent with economical running. Think of all the animals that are the best runners – gazelles, deer, horses, dogs – they all run on the ball of their feet or the tips of their toes.”

When people run, why is there no difference in the amount of energy they expend when stepping first onto their heels versus the balls of their feet or toes?

The answer is unknown, but “if you land on your heel when you run, the force underneath the foot shoots very quickly to the ball of your foot,” Carrier says. “Even when we run with a heel plant, for most of the step our weight is supported by the ball of our foot. Lots of elite athletes, whether sprinters or distance runners, don’t land on their heel. Many of them run on the balls of their feet,” as do people who run barefoot. That appears to be the natural ancestral condition for early human runners, he adds.

“The important thing is we are remarkably economical walkers,” Carrier says. “We are not efficient runners. In fact, we consume more energy to run than the typical mammal our size. But we are exceptionally economical walkers.”

“This study suggests that one of the things that may explain such economy is the unusual structure of our foot,” he adds. “The whole foot contacts the ground when we walk. We have a big heel. Our big toe is as long as our other toes and is much more robust. Our big toe also is parallel to and right next to the second toe.”

“These features are distinct among apes, and provide the mechanical basis for economical walking. No other primate or mammal could fit into human shoes.”

Image 1: English photographer Eadweard Muybridge (1830-1904), who pioneered the use of multiple cameras to capture motion, is shown walking heel-first as humans usually do. A new University of Utah study shows that stepping onto the heel first requires much less energy than putting the ball of the foot or the toes onto the ground first. Photo Credit: Eadweard Muybridge (public domain)

Image 2: A University of Utah student walks by placing the ball of his foot down first while the heel is somewhat elevated. Humans normally do not walk that way because it requires 53 percent more energy than walking heel-first, according to a new University of Utah study. Images like this were used during analysis of study results. Photo Credit: David Carrier, The University of Utah

Image 3: The heels-first walking pattern typical of humans is demonstrated by a University of Utah student who participated in a study showing that people save a lot of energy by planting their heels down first rather than walking on their toes or the balls of their feet. Photo Credit: David Carrier, The University of Utah

On the Net:

New Screening System For Hepatitis C

A newly designed system of identifying molecules for treating hepatitis C should enable scientists to discover novel and effective therapies for the dangerous and difficult-to-cure disease of the liver, says Zhilei Chen, a Texas A&M University assistant professor of chemical engineering who helped develop the screening system.

The system, Chen explains, enables researchers to study the effects of molecules that obstruct all aspects of the hepatitis C virus (HCV) life cycle. That’s a significant milestone in HCV research, says Chen, noting that previous methods of developing drug treatments for the virus have been limited by the fact that researchers were only able to study one aspect of the HCV life cycle. Chen’s findings appear in the most recent edition of the scientific journal Proceedings of the National Academy of Sciences.

First identified in 1989 and responsible for hepatitis C, an infectious disease affecting the liver, HCV has infected an estimated 180 million people worldwide. Spread by blood-to-blood contact, HCV can cause chronic infection that leads to dangerous scarring of the liver, liver failure, liver cancer and death.

Although new infections resulting from blood transfusions are rare thanks to screening measures that began in 1990, the overall number of people facing death or serious liver disease from HCV is steadily rising because people often live decades with the virus before showing symptoms, Chen says. In addition, injection drug users are at high risk for infection from contaminated needles.

The only existing therapy for HCV is a physically and emotionally taxing 48-week course of treatment that cures less than half of all patients who undergo it, Chen says. The particularly grueling nature of the treatment – it’s been compared to chemotherapy – along with the high financial costs associated with it, often results in many patients opting to forgo the therapy.

Because Chen’s newly developed screening system enables the discovery of small, low-cost molecules that block the HCV life cycle, she believes it could contribute to new, more affordable and more effective therapies for hepatitis C.

The screening system uses an innovative way to “see” cells that are infected with HCV.

“Typically when a virus infects a cell, it’s not obvious to detect; it’s not easy to distinguish an infected cell from an uninfected cell,” Chen says. “Much in the same way a person who is infected with HCV does not initially feel anything, when a cell is initially infected nothing really observable happens. This makes it difficult to distinguish HCV infection in cells.”

To address this challenge, Chen “tweaked” the cells she was studying by inserting a gene into them that triggers cell death if HCV enters that cell. This allowed Chen to easily measure the extent of infection in her genetically engineered cells by quantifying the degree of cell death within the cell cultures she was examining.

These engineered cells were grown in miniature compartments in the presence of infectious HCV, and a different chemical was added to each compartment.

“We could then look and see which cells were able to survive because if you have chemicals that don’t inhibit HCV, the cells will die, but if you have a molecule that blocks the HCV life cycle, the cells will grow,” Chen says. “And because we were able to look at the complete life cycle of the virus with our system, we discovered inhibitors of the virus across three different stages: entry into cells, reproduction within cells, and final escape from infected cells to attack new cells.”
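
The logic of that readout can be sketched in a few lines of code: compartments in which most of the engineered cells survive despite exposure to HCV point to candidate inhibitors. The sketch below is purely hypothetical; the compound names, survival values and threshold are invented and are not taken from Chen's system.

# Hypothetical sketch of the survival-based readout described above.
SURVIVAL_THRESHOLD = 0.5   # fraction of cells alive needed to flag a "hit" (assumed)

def find_hits(survival_by_compound):
    """Return compounds whose wells kept most cells alive despite HCV exposure."""
    return [name for name, survival in survival_by_compound.items()
            if survival >= SURVIVAL_THRESHOLD]

example_plate = {
    "compound_A": 0.85,   # cells mostly survived -> candidate HCV inhibitor
    "compound_B": 0.10,   # cells died -> no inhibition
    "compound_C": 0.60,
}
print(find_hits(example_plate))   # -> ['compound_A', 'compound_C']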

Testing about 1,000 different chemicals, Chen found several that strongly inhibited the HCV life cycle. Some of the inhibitors, she said, obstruct virus entry into a cell. Others inhibit virus replication, meaning that infected cells won’t be able to support the reproduction and growth of the virus as much. Chen also found effective inhibitors that keep the virus from escaping the cell even if it grows well inside the cell.

“Since this virus changes all of the time, you really want to hit it across multiple aspects simultaneously,” Chen says. “Nevertheless, most current efforts to block the HCV life cycle focus only on its replication within cells due to the long-time absence of a system that allows for convenient screening of molecules blocking other aspects of the virus’ life cycle such as entry into cells and release from cells.

“Our system is well-suited to large-scale drug screening efforts because the technology is simple to use and can be easily scaled up to test extremely large collections of compounds using a robotic system,” Chen says. “We anticipate that this system will enable the discovery of many more new and more potent HCV antivirals.”

Working with Chen to develop the system were Karuppiah Chockalingam and Rudo Simeon, postdoctoral associate and graduate student, respectively, from Texas A&M, and Charles Rice, a professor at Rockefeller University.

On the Net:

If Children Won’t Go To School

Children and adolescents who refuse to attend school should not be given doctors’ sick notes. In the current issue of Deutsches Ärzteblatt International (Dtsch Arztebl Int 2010; 107[4]), child and adolescent psychiatrist Martin Knollmann and colleagues explain the causes of school avoidance and describe measures to tackle the problem.

Truancy assumes psychiatric relevance only if it occurs frequently and is accompanied by psychiatric symptoms. Children typically play truant for the first time at the age of about 11 years, whereas anxiety-related school avoidance occurs in children as young as 6 years. School avoiders seem to be exposed to more stressful life events, but physical disorders such as asthma or obesity may also play a part.

In contrast to truancy, of which parents are usually unaware, children displaying school avoiding behavior often stay at home. They often express fears and anxieties, especially in the morning, and complain of diffuse physical symptoms.

The authors estimate that 5% to 10% of children are regularly absent from school in Germany. How many of these children have mental health problems is not known. School avoidance is clearly more common in adolescents than in children, and some studies have shown that boys are affected twice as often as girls.

In school avoidance, the primary objective of treatment is to quickly re-establish regular school attendance. Sick notes or prescriptions for residential care breaks are usually not advisable because the child’s behavior may deteriorate as a result.

Appropriate treatment options include cognitive behavior therapies, in combination with antidepressants if required. Exclusively child and adolescent psychiatric treatment, however, is usually not sufficient; those children who are affected need a support network consisting of school staff, youth services, and medical professionals.

On the Net:

Self-Control Impaired In Type 2 Diabetics

Type-2 diabetes, an increasingly common complication of obesity, is associated with poor impulse control. Researchers writing in BioMed Central’s open access journal BioPsychoSocial Medicine suggest that neurological changes result in this inability to resist temptation, which may in turn exacerbate diabetes.

Hiroaki Kumano, from Waseda University, Japan, worked with a team of researchers to assess response inhibition, a measure of self-control, in 27 patients with type-2 diabetes and 27 healthy controls. He said, “Patients with type 2 diabetes are required to make strict daily decisions; for example, they should resist the temptation of high-fat, high-calorie food, which is frequently cued by specific people, places and events. Appropriate behavior modification thus depends on the patient’s ability to inhibit impulsive thoughts and actions cued by these environmental stimuli”.

In order to gauge the patients’ ability to resist such impulsive behavior, the researchers used a test in which participants had to quickly press a button in response to the correct signal on a computer screen, while pressing the button in response to the wrong symbol counted against their score. They found that patients with diabetes performed significantly worse at the test, suggesting that they struggled to control the impulse to press the button. Other results showed that the inhibitory failure observed in diabetic patients was mainly explained by cognitive impairment of impulsivity control, rather than by deficits in motor performance, error monitoring and adjustment. According to Kumano, “This suggests the possibility that the neuropsychological deficits in response inhibition may contribute to the behavioral problems leading to chronic lifestyle-related diseases, such as type 2 diabetes”.
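
The scoring logic of such a task can be illustrated with a short sketch; the trial data and the exact scoring rule below are invented for illustration and are not the scheme used by Kumano's team.

# Illustrative scoring for a response-inhibition task of the kind described:
# pressing for the target symbol counts toward the score, pressing for the
# wrong symbol counts against it. Rule and data are invented.
def score_trials(trials):
    """trials: list of (symbol_shown, button_pressed) pairs; 'go' is the target."""
    score = 0
    for symbol, pressed in trials:
        if pressed:
            score += 1 if symbol == "go" else -1   # wrong-symbol presses penalized
    return score

session = [("go", True), ("no-go", True), ("go", True), ("no-go", False)]
print(score_trials(session))   # -> 1: one wrong press cancels one correct press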

On the Net:

New Stent Improves Ability To Keep Vessels Open For Dialysis Patients

Kidney dialysis patients often need repeated procedures, such as balloon angioplasty, to open blood vessels that become blocked or narrowed at the point where dialysis machines connect to the body. These blockages can impact the effectiveness of hemodialysis, a life-saving treatment to remove toxins from the blood when the kidneys are unable to do so. But a new FDA-approved stent graft can keep these access points open longer, reducing the number of procedures these patients may need, according to research from the University of Maryland published in the February 11, 2010, edition of the New England Journal of Medicine.

“This is the first large-scale randomized study to find a therapy to be superior to the gold standard of balloon angioplasty. We found that using this new stent for dialysis patients whose access grafts have become narrowed improves graft function. It also clearly reduces the need for repeated invasive procedures and interruption of dialysis,” explains lead author Ziv Haskal, M.D., chief of vascular and interventional radiology at the University of Maryland Medical Center. Dr. Haskal is also professor of diagnostic radiology and nuclear medicine, and surgery at the University of Maryland School of Medicine.

The prospective multi-center study took place at 13 sites across the country and enrolled nearly 200 patients. Ninety-seven patients received angioplasty with the new stent, which is a small metal scaffold inserted in the patient’s arm, compared to 93 who received angioplasty alone.

In the study, patients with the stent graft were more than twice as likely to have open vessels after six months compared with the angioplasty-only group. The recurrence of vessel narrowing, known as restenosis, was nearly three times lower in the stent group (27.6 percent vs. 77.6 percent). In later follow-up, some patients still had functioning grafts two years after the stent graft was first implanted.
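
A quick calculation from the figures above shows where the “nearly three times lower” comparison comes from; the short Python check below is illustrative and not an analysis from the paper.

# Quick check of the restenosis comparison quoted above.
stent_restenosis = 0.276         # 27.6 percent with the stent graft
angioplasty_restenosis = 0.776   # 77.6 percent with angioplasty alone

relative = angioplasty_restenosis / stent_restenosis
print(f"Restenosis was about {relative:.1f}x as frequent with angioplasty alone")
# -> about 2.8x, i.e. nearly three times lower with the stent graft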

“Results of this research should change the way we treat hemodialysis patients. In this study, patients who received angioplasty alone were twice as likely to need additional procedures compared to those who had the stent in addition to angioplasty,” adds Dr. Haskal. “That can translate into cost savings and improved quality of life for these patients, who already spend about nine to 12 hours a week in dialysis. We can now start considering grafts as something that may last for years in dialysis patients, instead of months.”

According to the researchers, the cost to treat dialysis access failure amounts to about $1 billion per year, and the number of patients needing hemodialysis is expected to continue to grow substantially over the next decade.

Kidney failure patients often have a synthetic portal, known as an access graft, embedded into their arm before they begin hemodialysis. The access graft works like an artificial blood vessel, allowing needles to be inserted repeatedly, so the blood can be circulated out of the body, filtered in a machine and then returned to the patient’s circulatory system. Patients must undergo dialysis several times a week. (Another less commonly used form of dialysis, peritoneal dialysis, filters waste by using the peritoneal membrane inside the abdomen. Patients inject a special solution into the body, which is later drained from the abdomen after the toxins are filtered. Peritoneal dialysis can be done at home, but must be done every day.)

For hemodialysis, scar tissue naturally forms at the edges of the access grafts. That scarring can impede blood flow, requiring doctors to perform angioplasty to open the vessels. In that outpatient procedure, doctors insert a balloon into the blood vessel and inflate the balloon to open the narrowed artery or vein. Following angioplasty, vessel narrowing frequently recurs, requiring repeated procedures, up to several times a year. If scarring becomes too severe and repeated angioplasties do not work, the patient may need another procedure to put in an access graft at a different site on the arm. Other therapies have been compared to balloon angioplasty, but, until now, none has shown benefit in a prospective randomized study.

“More than 350,000 Americans are currently receiving dialysis. These patients need those access grafts to be as durable as possible because they only have so much space on their arms for the surgical creation of new access grafts. Our research has shown that using this stent graft to treat failing accesses keeps them open longer than the existing gold standard; it offers a real, longer-term solution for patients, reducing the need for repeated surgeries. This research suggests that physicians may need to make a fundamental change in their approach to treating hemodialysis patients,” says Dr. Haskal. This self-expanding metal stent graft creates a scaffold to keep the blood vessel open. It is encapsulated by polytetrafluoroethylene, the same material from which most dialysis grafts are made. The device allows the physician to mimic the effect of surgery at the scarred area without actually performing surgery.

“This study offers strong evidence of the benefit of using this new stent therapy for hemodialysis patients. It represents the type of important clinical research with direct patient benefit undertaken by physicians at the University of Maryland School of Medicine,” says E. Albert Reece, M.D., Ph.D., M.B.A., vice president for medical affairs, University of Maryland, and dean, University of Maryland School of Medicine.

Dr. Haskal is leading another large study that is currently enrolling patients to assess the benefits of the device over a longer period of time. The other sites participating in this study were the Hospital of the University of Pennsylvania; University of Texas, Southwestern Medical Center; Oregon Surgical Consultants, Portland, OR; Open Access Vascular Access Center, Miami, FL; Vascular Access Center, Augusta, GA; Tucson Vascular Surgery; Indiana University School of Medicine; Bamberg County Hospital and Nursing Center, Bamberg, SC; and Vascular Access Center of Frontenac Grove, Frontenac, MO. This study was funded by Bard Peripheral Vascular, Inc., manufacturer of the Flair Endovascular stent graft.

On the Net:

Prevention Is Key Research Goal For Premature Babies

Family history, infection and stress all may play a role in raising a woman’s risk of having a premature baby, but they don’t fully explain why some women give birth too soon and others don’t, according to a review article published today in the New England Journal of Medicine.

Only if scientists of all disciplines work together and share information – databases, biological samples and new perspectives – will the research community be able to determine how to prevent spontaneous preterm birth and spare babies from the serious consequences of an early birth, according to “The Enigma of Spontaneous Preterm Birth,” by Louis Muglia, MD, PhD, of Vanderbilt University Medical Center and Michael Katz, MD, senior vice president for Research and Global Programs at the March of Dimes.

Premature birth is a leading cause of infant death in the United States, and only about half of these deaths have a known cause, Drs. Muglia and Katz note.

More than 543,000 babies are born too soon each year in the United States. Worldwide, about 13 million babies are born prematurely each year. Babies who survive an early birth face serious risks of lifelong health problems, including learning disabilities, cerebral palsy, blindness, hearing loss and other chronic conditions.

Medical problems such as preeclampsia, which is extremely high blood pressure in the mother, or fetal distress do not fully explain the increase in induced deliveries, which often result in late preterm births, births between 32 and 36 weeks of gestation.

“The decision to induce delivery in order to improve fetal viability must be balanced by recognition of the need to minimize the impairments that arise from preterm birth,” the authors wrote. “Making this decision will remain a challenge for practitioners, because inducing delivery – by whatever method – before full term has adverse consequences for the newborn, even when it happens close to term.”

Family history of preterm birth, stress, race, infection, inflammation and genetics do appear to play a role. One of every three preterm births occurs to a mother who has an infection in her uterus, but may have no symptoms. Recent research has shown that the genes of the mother seem to make the greatest contribution to preterm birth risk, and genes in the fetus may also play a role.

The New England Journal of Medicine review article is a summary of a three-day symposium held in December 2008, entitled “Preventing Prematurity: Establishing a Network for Innovation and Discovery.” It was cosponsored by the Burroughs Wellcome Fund and the March of Dimes and brought together the top investigators in preterm birth prevention research.

The authors note in their article that technological advances allow clinicians to save premature infants from death and some complications by treating the consequences of prematurity. However, “prevention is what is needed, and current research is being directed toward reaching this goal,” the authors wrote.

The third biennial symposium, a meeting by invitation only, will be held in December, 2010.

The article was published in the Feb. 11, 2010, issue of the New England Journal of Medicine, Vol. 362, No. 6, pages 529-35.

On the Net:

Solar Dynamics Observatory Launch Now Set For Feb 11

For some years now, an unorthodox idea has been gaining favor among astronomers. It contradicts old teachings and unsettles thoughtful observers, especially climatologists.

“The sun,” explains Lika Guhathakurta of NASA headquarters in Washington DC, “is a variable star.”

But it looks so constant…

That’s only a limitation of the human eye. Modern telescopes and spacecraft have penetrated the sun’s blinding glare and found a maelstrom of unpredictable turmoil. Solar flares explode with the power of a billion atomic bombs. Clouds of magnetized gas (CMEs) big enough to swallow planets break away from the stellar surface. Holes in the sun’s atmosphere spew million mile-per-hour gusts of solar wind.

And those are the things that can happen in just one day.

Over longer periods of decades to centuries, solar activity waxes and wanes with a complex rhythm that researchers are still sorting out. The most famous “beat” is the 11-year sunspot cycle, described in many texts as a regular, clockwork process. In fact, it seems to have a mind of its own.

“It’s not even 11 years,” says Guhathakurta. “The cycle ranges in length from 9 to 12 years. Some cycles are intense, with many sunspots and solar flares; others are mild, with relatively little solar activity. In the 17th century, during a period called the ‘Maunder Minimum,’ the cycle appeared to stop altogether for about 70 years and no one knows why.”

There is no need to go so far back in time, however, to find an example of the cycle’s unpredictability. Right now the sun is climbing out of a century-class solar minimum that almost no one anticipated.

“The depth of the solar minimum in 2008-2009 really took us by surprise,” says sunspot expert David Hathaway of the Marshall Space Flight Center in Huntsville, Alabama. “It highlights how far we still have to go to successfully forecast solar activity.”

That’s a problem, because human society is increasingly vulnerable to solar flare-ups. Modern people depend on a network of interconnected high-tech systems for the basics of daily life. Smart power grids, GPS navigation, air travel, financial services, emergency radio communications – they can all be knocked out by intense solar activity. According to a 2008 study by the National Academy of Sciences, a century-class solar storm could cause twenty times more economic damage than Hurricane Katrina.

“Understanding solar variability is crucial,” says space scientist Judith Lean of the Naval Research Lab in Washington DC. “Our modern way of life depends upon it.”

Enter the Solar Dynamics Observatory – “SDO” for short – slated to launch on Feb. 11, 2010, at 10:23 a.m. EST from the Kennedy Space Center in Florida.

SDO is designed to probe solar variability unlike any other mission in NASA history. It will observe the sun faster, deeper, and in greater detail than previous observatories, breaking barriers of time-scale and clarity that have long blocked progress in solar physics.

Guhathakurta believes that “SDO is going to revolutionize our view of the sun.”

The revolution begins with high-speed photography. SDO will record IMAX-quality images of the sun every 10 seconds using a bank of multi-wavelength telescopes called the Atmospheric Imaging Assembly (AIA). For comparison, previous observatories have taken pictures at best every few minutes with resolutions akin to what you see on the web, not at a movie theatre. Researchers believe that SDO’s rapid-fire cadence could have the same transformative effect on solar physics that the invention of high-speed photography had on many sciences in the 19th century.

SDO doesn’t stop at the stellar surface. Its Helioseismic and Magnetic Imager (HMI) can actually look inside the sun at the solar dynamo itself.

The solar dynamo is a network of deep plasma currents that generates the sun’s tangled and sometimes explosive magnetic field. It regulates all forms of solar activity from the lightning-fast eruptions of solar flares to the slow decadal undulations of the sunspot cycle.

“Understanding the inner workings of the solar dynamo has long been a ‘holy grail’ of solar physics,” says Dean Pesnell of the Goddard Space Flight Center in Greenbelt, Maryland. “HMI could finally deliver this to us.”

The dynamo is hidden from view by about 140,000 miles of overlying hot gas. SDO penetrates the veil using a technique familiar to geologists – seismology. Just as geologists probe Earth’s interior using waves generated by earthquakes, solar physicists can probe the sun’s interior using acoustic waves generated by the sun’s own boiling turbulence. HMI detects the waves, which researchers on Earth can transform into fairly clear pictures.

“It’s a little like taking an ultrasound of a pregnant mother,” Pesnell explains. “We can see ‘the baby’ right through the skin.”

Finally – and of most immediate relevance for Earth – SDO will observe the sun at wavelengths where the sun is most variable, the extreme ultraviolet (EUV). EUV photons are high-energy cousins of the regular UV rays that cause sunburns. Fortunately, our atmosphere blocks solar EUV; otherwise a day at the beach could be fatal. In space, solar EUV emission is easy to detect and arguably the most sensitive indicator of solar activity.

“If human eyes could see EUV wavelengths, no one would doubt that the sun is a variable star,” says Tom Woods of the University of Colorado in Boulder.

During a solar flare, the sun’s extreme ultraviolet output can vary by factors of hundreds to thousands in a matter of seconds. Surges of EUV photons heat Earth’s upper atmosphere, causing the atmosphere to “puff up” and drag down low-orbiting satellites. EUV rays also break apart atoms and molecules, creating a layer of ions in the upper atmosphere that can severely disturb radio signals. According to Judith Lean, “EUV controls Earth’s environment throughout the entire atmosphere above about 100 km.”

“EUV is where the action is,” agrees Woods.

That’s why Woods and colleagues built an extreme ultraviolet sensor for SDO called the EUV Variability Experiment (“EVE”). “EVE gives us the highest time resolution (10 sec) and the highest spectral resolution (< 0.1 nm) that we’ve ever had for measuring the sun, and we’ll have it 24/7,” he says. “This is a huge improvement over past missions.”

Woods expects EVE to reveal how fast the sun can change – “we really don’t know,” he points out – and to surprise astronomers with the size of the outbursts.

EVE, AIA, HMI. For the next five years, the Solar Dynamics Observatory will use these instruments to redefine our star and its potential for variability. What unorthodox ideas will they beam back? Old teachings beware! 

Sidebar: ‘Solar Constant’ is an Oxymoron

Astronomers were once so convinced of the sun’s constancy, they called the irradiance of the sun “the solar constant,” and they set out to measure it as they would any constant of Nature. By definition, the solar constant is the amount of solar energy deposited at the top of Earth’s atmosphere in units of watts per meter-squared. All wavelengths of radiation are included””radio, infrared, visible light, ultraviolet, x-rays and so on. The approximate value of the solar constant is 1361 W/m2.

Clouds, atmospheric absorption and other factors complicate measurements from Earth’s surface, so NASA has taken the measuring devices to space. Today, VIRGO, ACRIM and SORCE are making measurements with precisions approaching 10 parts per million per year. Future instruments scheduled for flight on NASA’s Glory and NOAA’s NPOESS spacecraft aim for even higher precisions.

To the amazement of many researchers, the solar constant has turned out to be not constant.

“‘Solar constant’ is an oxymoron,” says Judith Lean of the Naval Research Lab. “Satellite data show that the sun’s total irradiance rises and falls with the sunspot cycle by a significant amount.”

At solar maximum, the sun is about 0.1% brighter than it is at solar minimum. That may not sound like much, but consider the following: a 0.1% change in 1361 W/m2 equals 1.4 W/m2. Averaging this number over the spherical Earth and correcting for Earth’s reflectivity yields about 0.24 W for every square meter of our planet.
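
The arithmetic behind those figures is simple enough to check directly. The short Python sketch below reproduces it under stated assumptions: the 0.1% cycle variation is a rounded value, and an Earth albedo of roughly 0.3 is assumed for the reflectivity correction (the article does not give the exact value used).

```python
# Back-of-the-envelope check of the solar-cycle numbers described above.
# Assumed values: a 0.1% variation over the cycle and an Earth albedo of ~0.3.

SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of Earth's atmosphere
CYCLE_VARIATION = 0.001   # ~0.1% difference between solar max and solar min
EARTH_ALBEDO = 0.3        # fraction of sunlight reflected straight back to space

delta_irradiance = SOLAR_CONSTANT * CYCLE_VARIATION   # about 1.4 W/m^2

# A sphere intercepts sunlight over a disk of area pi*R^2 but has a surface
# area of 4*pi*R^2, so the global average is one quarter of the incident value.
absorbed_per_m2 = (delta_irradiance / 4.0) * (1.0 - EARTH_ALBEDO)

print(f"Change in irradiance at Earth: {delta_irradiance:.1f} W/m^2")   # ~1.4
print(f"Globally averaged and absorbed: {absorbed_per_m2:.2f} W/m^2")   # ~0.24
```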

“Add it all up and you get a lot of energy,” says Lean. “How this might affect weather and climate is a matter of – at times passionate – debate.”

Because SDO specializes in extreme ultraviolet wavelengths, it won’t be making direct measurements of the total solar irradiance, which requires sensitivity across the entire electromagnetic spectrum. Nevertheless, a combination of data from SDO and other spacecraft could shed new light on this important topic – and perhaps reveal other oxymorons as well.

By Dr. Tony Phillips – Science @ NASA

Image 2: Areas of the USA vulnerable to power system collapse in response to an extreme geomagnetic storm. Source: National Academy of Sciences.


Early Life Stress A Predictor Of Cardiovascular Disease

Early life stress could be a risk factor for cardiovascular disease in adulthood, researchers report.

“We think early life stress increases sensitivity to a hormone known to increase your blood pressure and increases your cardiovascular risk in adult life,” said Dr. Jennifer Pollock, biochemist in the Vascular Biology Center at the Medical College of Georgia and corresponding author on the study published online in Hypertension.

The studies in a proven model of chronic behavioral stress – separating rat pups from their mother for three hours daily for two weeks – showed no long-term impact on key indicators of cardiovascular disease such as blood pressure, heart rate or inflammation in blood vessel walls.

But when the rats reached adulthood, an infusion of the hormone angiotensin II resulted in rapid and dramatic increases in all key indicators in animals that experienced early life stress. Stress activates the renin-angiotensin system, which produces angiotensin II and is a major regulator of blood vessel growth and inflammation – both heavily implicated in heart disease. “They cannot adapt to stress as well as a normal animal does,” Dr. Pollock said. Within a few days, for example, blood pressure was nearly twice as high in the early-stress animals.

The chronic stress model most typically has been used to look at the psychological impact of childhood stress; this was the first time it was used to measure cardiovascular impact, Dr. Pollock said. Findings correlate with studies published in Circulation in 2004 that identified adverse childhood events, such as abuse or parental loss, in the backgrounds of many adults with ischemic heart disease.

“We want to be able to prevent this long-term consequence,” said Dr. Analia S. Loria, MCG postdoctoral fellow and the study’s first author. Although the adult rats seemed fine until stressed, the scientists noted the inevitability of stress in life.

Next steps include determining the mechanism that translates early life stress into cardiovascular risk; they suspect it results in genetic alterations at a vulnerable time in development. “Hormones can modulate gene expression and, during stress, you have very high levels of stress hormone,” Dr. Loria said.

To further test the findings, they are blocking the angiotensin II receptor in rats to see if that decreases the cardiovascular impact in animals with early life stress. And, to more closely mimic what happens in real life, they are feeding high-fat diets to the rats to see if, like the angiotensin II infusions, it exaggerates cardiovascular disease risk. Receptor blockers are commonly used in cardiovascular patients who have high levels of angiotensin II.

The scientists also will be studying gender differences in response to early life stress since their initial studies were in male rats. Psychological studies indicate that females are less impacted by early life stress and the scientists predict they will find similar results in the cardiovascular response.

The research was funded by the National Institutes of Health and an American Heart Association postdoctoral fellowship.

Image Caption: Drs. Jennifer Pollock (left) and Analia S. Loria. Credit: Medical College of Georgia


What Happens To Nerve Cells In Parkinson’s Disease

A new study from The Montreal Neurological Institute and Hospital – The Neuro – at McGill University is the first to discover a molecular link between Parkinson’s disease and defects in the ability of nerve cells to communicate. The study, published in the prestigious journal Molecular Cell and selected as Editor’s Choice in the prominent journal Science, provides new insight into the mechanisms underlying Parkinson’s disease, and could lead to innovative new therapeutic strategies.

Parkinson’s is a neurodegenerative disease affecting approximately 100,000 Canadians and over 4 million people worldwide – a number expected to double by the year 2030. It causes muscle stiffness and tremor and prevents people from controlling their movements in a normal manner. The disease is characterized by the degeneration and death of dopamine neurons in specific regions of the brain, causing neurological impairment. It is not known exactly what causes the death of these neurons.

Mutations in the parkin gene are responsible for a common inherited form of Parkinson’s disease. By studying defects in the genes and proteins of patients with inherited forms of Parkinson’s, principal author of the study at The Neuro, Dr. Edward Fon, is learning about the molecular mechanisms involved in the death of dopamine neurons.

The function of the parkin protein is not yet well defined. Scientists learn about the function of a protein by studying normal and mutated forms as well as investigating what the protein binds to. These kinds of studies provide clues as to what processes the protein may be involved in. It is known that parkin is involved in the degradation of other proteins; however, how defects in this function are linked to Parkinson’s remains unclear. To further understand how a mutated parkin protein causes Parkinson’s, Dr. Fon and his colleagues looked at where mutations are found on the gene, focused on understanding the function of the region that is commonly mutated, and searched for proteins that bind to this particular domain of the protein.

They identified that parkin binds to a protein called endophilin-A, a protein discovered at The Neuro in 1997 by Dr. Peter McPherson, a co-investigator on the current study. Endophilin-A is central to the process of synaptic transmission, specifically synaptic vesicle trafficking. Synaptic transmission is the process whereby one nerve cell communicates with another. It involves the release of neurotransmitters from a synaptic vesicle at the surface of the cell. The neurotransmitter travels across the gap, or synapse, and is brought into (endocytosed by) the communicating neuron. Synaptic vesicles are spheres that transport and release neurotransmitters – the ‘signal’ required for the propagation of nerve cell signals across the synapse. Endophilin-A plays an important role in regulating synaptic vesicle endocytosis – that is, the formation as well as the recycling of synaptic vesicles.

“One of the most consistent and intriguing findings associated with both dominant and recessive forms of Parkinson’s, including those due to parkin mutations, has been defects in synaptic transmission, possibly related to altered synaptic vesicle endocytosis, recycling or release,” says Dr. Fon. “Yet, until now, the molecular mechanisms involved have remained completely unknown. Thus, by linking parkin to endophilin-A, a protein at the heart of synaptic vesicle endocytosis and recycling, our findings provide a molecular link between recessive Parkinson’s genes and defects in synaptic transmission. This now gives us a whole new set of potential treatment targets.”

“This provides a novel and critical molecular link between the parkin gene and synaptic regulation,” says Dr. Jean-Francois Trempe, post-doctoral student in Dr. Kalle Gehring’s lab at McGill, who studied the structural biology of the binding of the two proteins. “The strength and specificity of the interaction makes it a very clear and interesting finding, and indicates that we are heading in the right direction.”

“Our next studies will investigate the function of the parkin-endophilin-A interaction,” adds Dr. Fon. “We believe that, if parkin is mutated, then the proper functioning of endophilin-A will be affected as it binds parkin, thereby disrupting synaptic vesicle recycling. This could potentially lead to the death of dopamine neurons by depriving neurons of neurotransmitters necessary for neuronal survival and functioning.”

“Dr. Fon’s new findings will improve our understanding of the defects in the genes and proteins of people who suffer from Parkinson’s disease,” says Dr. Anthony Phillips, Scientific Director at the Canadian Institutes of Health Research (CIHR) Institute of Neurosciences, Mental Health and Addiction. “CIHR is proud to support research that may pave the way to innovative new therapeutic strategies to cure Parkinson’s, which affects too many Canadians.”

There is, as yet, no known cure for Parkinson’s disease. A number of drugs and clinical treatments have been developed which can help to control or minimize the symptoms of this disabling and debilitating disease.

As a world-class academic medical centre and a designated National Parkinson Foundation (NPF) Center of Excellence, The Neuro not only delivers first-class clinical care but also conducts innovative research that leads to important discoveries about the disease and significant advancements in medical care and treatments for patients.

This work was supported by CIHR, the Canadian Foundation for Innovation, the R.H. Tomlinson Fellowship program, and the Fonds de la Recherche en Santé du Québec.


Study Examines Impact Of Movie Food Product Placement On Children

New research from the Hood Center for Children and Families at Dartmouth Medical School (DMS) for the first time sheds light on the significant potential negative impact that food product placements in the movies could be having on children.

The study, which appears in the current edition of the journal Pediatrics, shows that most of the “brand placements” for food, beverage, and food retail establishments that are frequently portrayed in movies are for energy-dense, nutrient-poor foods or product lines. In addition, the study shows for the first time that product placements in movies may be a far more potent source of advertising to children in terms of food choices than previously understood.

“The current situation in the United States is very serious in terms of the health of our children, and we have to look seriously at all of the factors that may be contributing to it, including the impact of product placements in movies,” says Lisa Sutherland, Ph.D., the lead author of the study. Sutherland says that the diet quality of U.S. children and adolescents has declined markedly during the past 20 years, and current estimates suggest that only one percent of children eat a diet consistent with the U.S. Department of Agriculture’s (USDA) MyPyramid food guidance. Additionally, fewer than one fifth of adolescents meet the dietary recommendations for fat or fruit and vegetable intakes, and during the last 20 years obesity rates have doubled for children aged 6 to 11 years and tripled for adolescents aged 12 to 19 years.

“While the issue of food advertising and its effect on children has been well documented in numerous studies, comparatively little is known about product placement in movies and how it affects the food and beverage preferences and choices of children and adolescents,” Sutherland said. The study notes that while there are similarities between television advertising and movie product placement, such as the low nutritional quality of the majority of branded products, there are also interesting differences. Recent studies that examined television ads during children’s and adolescent programming found fast food and ready-to-eat cereals and cereal bars to be the most prevalent. In contrast, the Dartmouth study found that sugar-sweetened beverages, largely soda, accounted for the largest proportion of the movie-based food product brand placements – one of every four brand placements overall.

The study notes that of particular concern are the food and beverage product placements in comedies and PG-rated and PG-13-rated movies, which are often geared specifically to older children and teenagers, who are at an age where they are gaining independence with respect to their food choices. Although the impact of this type of advertising on children is not fully known, it provides a likely avenue by which brand loyalty and product preference can be built, in addition to eating patterns. The study also revealed that six companies accounted for 45 percent of all brand placements: PepsiCo, Coca-Cola, Nestle USA, McDonald’s, Dr. Pepper/Snapple Group and Burger King.

The study acknowledges that many companies have made pledges not to direct advertising at children in order to encourage healthier dietary choices, and that while this is a step in the right direction, more clearly needs to be done. In addition, the study’s authors say that a number of studies to date that focused on other health-related behaviors, including alcohol and tobacco use, showed that movies contain frequent portrayals of these risk behaviors and often include brand appearances of the products. They say it is well established that children who view these risk behaviors in movies are more likely to engage in the behavior themselves.

“This is an area of study which clearly requires more research,” says Sutherland who was part of a team of advisers that, in 2006, helped to develop the Guiding Stars program used by supermarkets to help shoppers better identify the nutritional values of food products. “At a time in their development where children and adolescents are very susceptible to outside influences, we have to carefully examine the influence of all the factors that are combining to create what may end up being lifelong habits around food and lifestyle choices. Certainly, food-product placement in movies is one of many factors, but it is one that may be far more influential than previously realized and perhaps the least well understood.”

Co-authors included Todd MacKenzie, Ph.D., Lisa A. Purvis, MPH, MBA, and Madeline Dalton, Ph.D., all with Dartmouth Medical School.


IOC: Global Warming Could Affect Winter Olympics

The International Olympic Committee is beginning to worry about how global warming may affect future Olympic Games.

IOC President Jacques Rogge told AFP the issue had been discussed during a meeting on Monday ahead of the Winter Olympics, which begin on February 12. Much of the discussion centered on Cypress Mountain, near Vancouver, where several of the Games’ events will be held.

The mountain has been plagued by a lack of snow, caused by unseasonably warm temperatures reaching daytime highs of around 50 degrees Fahrenheit. The warm weather and lack of snow threaten to wreak havoc on many of the events being held on the mountain.

Tons of snow have been transported in from elsewhere in an attempt to ready the area for the Games. Media outlets looking for news of the situation have been banned from visiting as the IOC works feverishly to be ready on time.

“Global warming of course is a worry,” Rogge said. “It might affect, in the long-term, the staging of Winter Games but I can tell you that today in the evaluation committee meeting we asked for statistics.” Gathering reports on snow conditions at resorts being considered as future venues is a must, but this of course is no guarantee either, he added.

“Global warming is definitely a factor that must be taken into account in Olympic preparations,” noted Rogge. In the future, global warming will be a key issue when determining which cities will host Olympic Games. “In awarding the event to a host city, we must look at the climate and snow conditions and geography, as well as ways to alleviate any lack of snow.”

Artificial snow making machines and other tools have been brought in to help with the preparation for the Games.


Exposure To Secondhand Smoke In English Children Has Declined

The most comprehensive study to date of secondhand smoke exposure among children in England is published Feb. 8 in the journal Addiction. The study, carried out by researchers from the University of Bath’s School for Health, reveals that exposure to household secondhand smoke among children aged 4-15 has declined steadily since 1996.

The researchers wanted to find out if there were ways to predict the levels of secondhand smoke encountered by children in private households, and whether those levels were changing over time. Using eight surveys conducted between 1996 and 2006, researchers took saliva samples from over 19,000 children aged 4-15 years. The saliva samples were analyzed for a substance called cotinine, an indicator of tobacco smoke exposure.

The results show that the average cotinine levels among non-smoking children declined by 59% from 1996 to 2006, indicating that children’s exposure to secondhand smoke has decreased markedly since the mid-nineties. The researchers point out that the largest decline was between 2005 and 2006, a time of increased public debate and public information campaigns about secondhand smoke in the lead-up to the 2007 implementation of smoke-free legislation for public spaces.

The research also reveals that secondhand smoke exposure in non-smoking children is highest when one or both parents smoke, when the children are looked after by carers that smoke, and when smoking is allowed in the home. Dr Michelle Sims, first author of the paper, adds: “the importance of carer and parental smoking and household exposure tells us that reducing exposure in the home is the key to reducing the health risks associated with secondhand smoke exposure in children.”

Dr Anna Gilmore, who led the project, said “this study shows that the factors which most strongly influence children’s exposure are modifiable. Parents and carers can reduce their children’s exposure to smoke by giving up smoking, or failing this, making a decision to smoke outside the house. Stopping others from smoking in their house is also important. The fact that children’s exposure has already fallen so markedly shows that making these changes is feasible.”

Sims M., Tomkins S., Judge K., Taylor G., Jarvis M.J., Gilmore A. “Trends in and predictors of secondhand smoke exposure indexed by cotinine in children in England from 1996-2006.” Addiction 2010; 105:

The paper is an independent report commissioned and funded by the Policy Research Programme in the Department of Health. The views expressed are not necessarily those of the Department.


Quantum Tunneling Leads To New Touch-Screen Technology

New material from a UK company that exploits a quantum physics trick could soon lead to pressure-sensitive touch-screens and keys on many hand-held devices, BBC News reported.

The technology allows scrolling down a long list or webpage faster as more pressure is applied.

The “Quantum Tunneling Composite” has now been licensed to a division of Samsung that distributes mobile phone components to several handset manufacturers.

The technology could be used in devices from phones to games to GPS handsets.

Nissha, a Japanese touch-screen maker, also licensed the approach from Yorkshire-based Peratech, who make the composite material QTC.

But Peratech is not yet allowed to reveal the phone, gaming, and device makers that could soon be using the technology to bring pressure sensitivity to a raft of new devices.

Experts say the pressure-sensitivity could lead to a “third dimension” in touchscreens, meaning that instead of many “2-D” pages of applications, apps could be grouped by type on a single page – using the press of a finger to dive into each type and select the desired app.

The technology uses spiky conducting nanoparticles, similar to tiny medieval maces, dispersed evenly in a polymer. None of these spiky balls actually touch, but the closer they get to each other, the more likely they are to undergo a quantum physics phenomenon known as tunneling.

Tunneling is one of several effects in quantum mechanics that defies explanation in terms of the “classical” physics that preceded it. Quantum mechanics says that there is a tiny probability that a particle shot at a wall will pass through it in an effect known as tunneling.

In QTC, the polymer that surrounds the spiky balls acts like a wall to electric current. When the material is squashed or deformed by a finger’s pressure, the balls draw closer together and the probability of a charge tunneling through increases.

Therefore, pressing harder on the material leads to a smooth increase in the current through it.
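
To see why the response is smooth rather than switch-like, consider a toy calculation of how strongly tunneling depends on the gap between particles. The sketch below uses the textbook exp(-2*kappa*d) falloff for a rectangular barrier with illustrative numbers (a 1 eV barrier and nanometre-scale gaps); these are assumptions for the example, not Peratech’s actual material parameters.

```python
import math

# Toy model: tunneling probability through a barrier of width d falls off
# roughly as exp(-2*kappa*d), with kappa = sqrt(2*m*phi)/hbar for a barrier
# of height phi. Squeezing the composite shrinks the gaps between the spiky
# particles, so the current rises steeply but continuously with pressure.
# Illustrative numbers only -- not the real QTC material parameters.

HBAR = 1.054e-34          # reduced Planck constant, J*s
M_ELECTRON = 9.109e-31    # electron mass, kg
EV = 1.602e-19            # one electronvolt, in joules

def relative_tunneling_current(gap_nm: float, barrier_ev: float = 1.0) -> float:
    """Relative tunneling probability for a particle gap given in nanometres."""
    kappa = math.sqrt(2.0 * M_ELECTRON * barrier_ev * EV) / HBAR   # in 1/m
    return math.exp(-2.0 * kappa * gap_nm * 1e-9)

# Pressing harder -> smaller gaps -> far more current, with no sharp threshold.
for gap in (2.0, 1.5, 1.0, 0.5):
    print(f"gap {gap:.1f} nm -> relative current {relative_tunneling_current(gap):.2e}")
```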

The QTC approach is particularly suited to making thin devices. Pressure-sensitive QTC switches can be made as small as the thickness of a human hair.

Samsung Electro-Mechanics has now incorporated the QTC into the navigation switch familiar on smartphones – which is useful for scrolling more or less quickly through, for example, a long list of emails.

Peratech’s chief executive Philip Taysom said the same model can be used in many other ways, such as in games – to control how hard a player wants to jump or run, for example.

“Electronics are being given the ability to sense something that we take for granted, which is how much we’re touching and applying force,” he added.


Bullet-Shaped Virus Has Potential To Fight Cancer, HIV

Study of vesicular stomatitis virus leads to model of viral assembly process

Vesicular stomatitis virus, or VSV, has long been a model system for studying and understanding the life cycle of negative-strand RNA viruses, which include viruses that cause influenza, measles and rabies.

More importantly, research has shown that VSV has the potential to be genetically modified to serve as an anti-cancer agent, exercising high selectivity in killing cancer cells while sparing healthy cells, and as a potent vaccine against HIV.

For such modifications to occur, however, scientists must have an accurate picture of the virus’s structure. While three-dimensional structural information of VSV’s characteristic bullet shape and its assembly process has been sought for decades, efforts have been hampered by technological and methodological limitations.

Now, researchers at UCLA’s California NanoSystems Institute and the UCLA Department of Microbiology, Immunology and Molecular Genetics and colleagues have not only revealed the 3-D structure of the trunk section of VSV but have further deduced the architectural organization of the entire bullet-shaped virion through cryo-electron microscopy and an integrated use of image-processing methods.

Their research findings appear this month in the journal Science.

“Structures of individual rhabdovirus proteins have been reported in Science and other high-profile journals, but until now, how they are organized into a bullet shape has remained unclear,” said study author Z. Hong Zhou, UCLA professor of microbiology, immunology and molecular genetics and a member of the CNSI. “The special shape of VSV – a bullet head with a short, helical trunk – has lent to its evasion from three-dimensional structural studies.”

Based on their research into the structure of VSV, the team proposed a model for the assembly of the virus, with its origin at the bullet tip. Their data suggest that VSV assembles through the alternating use of several possible interaction interfaces coded in viral protein sequences to wind its protein and RNA chain into the characteristic bullet shape.

“Our structure provides the first direct visualization of the N and M proteins inside the VSV virion at 10.6-Å resolution. Surprisingly, our data clearly demonstrated that VSV is a highly ordered particle, with the nucleocapsid surrounded by, instead of surrounding, a matrix of M proteins,” said lead study author Peng Ge, a visiting graduate student at UCLA from Baylor College of Medicine. “To our amusement, the sequence in assembling viral protein and RNA molecules into the virus appears to rhyme with the first several measures of Mozart’s piano sonata in C-Major, K.545.”

The findings could help lead to advances in the development of VSV-based vaccines for HIV and other deadly viruses, according to the researchers.

“Our structure provides some of the first clues for understanding VSV-derived vaccine pseudotypes and for optimizing therapeutic VSV variants,” Zhou said. “This work moves our understanding of the biology of this large and medically important class of viruses ahead in a dramatic way. The next stage of research for our team will be to reveal the details of molecular interactions at the atomic scale using advanced imaging instruments now available at CNSI.”

The Electron Imaging Center for Nanomachines (EICN) lab at the CNSI has Cryo-EM instrumentation, including the Titan Krios microscope, which makes atomically precise 3-D computer reconstructions of biological samples and produces the highest-resolution images available of viruses, which may lead to better vaccines and new treatments for disease.

In addition to Z. Hong Zhou and Peng Ge, the research team included colleagues from the laboratory of Ming Luo, professor of microbiology at the University of Alabama at Birmingham, and Stan Schein, UCLA professor of psychology.
 
The research was supported by the National Institutes of Health.

By Jennifer Marcus, UCLA

Image Caption: Assembly of bullet-shaped VSV virion


Nicotine Replacement Therapy Over-Promoted

Health authorities should emphasize the positive message that the most successful method used by most ex-smokers is unassisted cessation, despite the promotion of cessation drugs by pharmaceutical companies and many tobacco control advocates. The dominant messages about smoking cessation contained in most tobacco control campaigns, which emphasize that serious attempts at quitting smoking must be pharmacologically or professionally mediated, are critiqued in an essay in this week’s PLoS Medicine by Simon Chapman and Ross MacKenzie from the School of Public Health at the University of Sydney, Australia. This overemphasis on quit methods like nicotine replacement therapy (NRT) has led to the “medicalization of smoking cessation,” despite good evidence that the most successful method used by most ex-smokers is quitting “cold turkey” or reducing-then-quitting. Reviewing 511 studies published in 2007 and 2008, the authors report that studies repeatedly show that two-thirds to three-quarters of ex-smokers stop unaided and most ex-smokers report that cessation was less difficult than expected.

The medicalization of smoking cessation is fuelled by the extent and influence of pharmaceutical support for cessation intervention studies, say the authors. They cite a recent review of randomized controlled trials of nicotine replacement therapy (NRT) that found that 51% of industry-funded trials reported significant cessation effects, while only 22% of non-industry trials did. Many assisted cessation studies””but few if any unassisted cessation studies””involve researchers who declare support from a pharmaceutical company manufacturing cessation products.

The authors conclude that “public sector communicators should be encouraged to redress the overwhelming dominance of assisted cessation in public awareness, so that some balance can be restored in smokers’ minds regarding the contribution that assisted and unassisted smoking cessation approaches can make to helping them quit smoking.”

Funding: National Health and Medical Research Council (Australia) Project Grant 2006 #401558. The funders had no role in the decision to submit this manuscript or in its preparation.

Competing Interests: SC was a member of the Australian Smoking Cessation Consortium that received research funding from GlaxoSmithKline in 2001.

Citation: Chapman S, MacKenzie R (2010) The Global Research Neglect of Unassisted Smoking Cessation: Causes and Consequences. PLoS Med 7(2): e1000216. doi:10.1371/journal.pmed.1000216


Patients ‘Unafraid’ To Gamble Highlight Role Of Amygdala In Decision-Making

Two patients with rare lesions to the brain have provided direct evidence of how we make decisions – and of what makes us dislike the thought of losing money.

Researchers at the California Institute of Technology studied a phenomenon known as ‘loss aversion’ in two patients with lesions to the amygdala, a region deep within the brain involved in emotions and decision-making. The results of the study, part-funded by the Wellcome Trust, are published today in the journal Proceedings of the National Academy of Sciences.

Loss aversion describes the avoidance of choices which can lead to losses, even when accompanied by equal or much larger gains. Examples in everyday life include how we make a decision on whether to proceed with an operation: the more serious the potential complications from the operation – even if the risk is low compared to the chances of success – the less likely we would be to proceed. It even has implications for organ donation rates – if people are required to ‘opt in’ to a system, they are less likely to move away from the default option.

Dr Benedetto De Martino, a Sir Henry Wellcome Trust Postdoctoral Fellow and first author of the report, explains: “Imagine you’re on Who Wants to Be a Millionaire. You’ve just answered the £500,000 question correctly and have moved on to the final question. You’re down to your 50:50 lifeline but don’t know the answer. If you get it right, you’ll win £1 million; if you get it wrong, you’ll drop back to £32,000. The vast majority of people would take the ‘loss averse option’ and walk away with £500,000.”

This new study has explored whether loss aversion is mediated by the amygdala, as is currently hypothesized. The researchers studied two patients affected by a rare genetic condition which has led to the formation of lesions to the amygdala. These lesions prevent the patients from perceiving, recognizing or feeling fear. For example, the patients can recognize all other emotions in a person’s face, but if shown a fearful face they cannot say what emotion that person is experiencing.

Each patient – together with twelve ‘healthy’ controls – took part in a task designed to test whether the chance of losing money affected people’s likelihood to gamble.

At the beginning of the experiment, each participant was given $50 with which to gamble on the outcome of flipping a coin, which carries a 50:50 chance of winning (or losing). However, each time, the amount that the volunteers could win or lose varied. For example, one time they might stand to win $50 or lose $20 depending on the outcome. The second time, they might stand to win $30 or lose $40.

The researchers found that, as expected, the healthy individuals were less likely to gamble when the difference between the potential winnings and potential losses was smaller – for example, whilst they might gamble if they stood to win $50 but lose only $10, they would be less likely to gamble if they stood to win only $20 but lose $15. When the potential losses outweighed the potential gains, the controls would not gamble.

However, the two patients with impaired amygdala activity were much less affected by the disparity between potential gains and losses; occasionally, even when the potential losses outweighed the potential gains they would choose to gamble, showing a lack of loss aversion.
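
This kind of behavior is often described with a simple loss-aversion rule in which a 50:50 gamble is accepted only when the expected gain outweighs the expected loss scaled by a coefficient lambda. The sketch below is purely illustrative and is not the authors’ analysis: a lambda of roughly 2 is a value commonly quoted for healthy subjects, while a lambda near 1 stands in for a hypothetical decision-maker who, like the lesion patients, shows little loss aversion; the dollar amounts are made-up stakes in the style of the task.

```python
# Illustrative loss-aversion decision rule (not the study's actual analysis).
# A 50:50 gamble is accepted when 0.5*gain - 0.5*lam*loss > 0, where lam is
# the loss-aversion coefficient: lam ~ 2 is commonly quoted for healthy
# subjects, while lam ~ 1 describes someone who weighs losses and gains equally.

def accepts_gamble(gain: float, loss: float, lam: float) -> bool:
    """Return True if a 50:50 win-gain / lose-loss coin flip is accepted."""
    return 0.5 * gain - 0.5 * lam * loss > 0

# Hypothetical stakes in the style of the task described above.
gambles = [(50, 20), (30, 40), (50, 10), (20, 15)]
for gain, loss in gambles:
    control = accepts_gamble(gain, loss, lam=2.0)      # loss-averse control
    lesion_like = accepts_gamble(gain, loss, lam=1.0)  # little loss aversion
    print(f"win ${gain} / lose ${loss}: control={control}, lesion-like={lesion_like}")
```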

“A fully-functioning amygdala appears to make us more cautious,” explains Ralph Adolphs, the Bren Professor of Psychology and Neuroscience. “We already know that the amygdala is involved in processing fear, and it also appears to make us ‘afraid’ to risk losing money.”

“It may be that the amygdala controls a very general biological mechanism for inhibiting risky behavior when outcomes are potentially negative, such as the monetary loss aversion which shapes our everyday financial decisions,” comments Dr De Martino, a visiting researcher from UCL (University College London).

“Loss aversion has been observed in many economic studies, from monkeys trading tokens for food to people on high-stakes game shows,” adds Colin Camerer, the Robert Kirby Professor of Behavioral Economics, “but this is the first clear evidence of a special brain structure which is responsible for fear of those losses.”

Dr De Martino and colleagues also investigated whether, as well as being ‘loss averse’, the patients were also ‘risk averse’. Risk aversion and loss aversion are two similar, but not identical, processes and as such can be easily confused. People who are ‘risk averse’ are less likely to take chances even when they do not stand to lose anything.

The volunteers were again asked to make a decision based on the outcome of a coin toss. However, in this situation, the options were either to take a set amount without gambling (for example, $5), or to gamble with a chance of winning $10 or receiving nothing – but no losses were involved. In this experiment, both patients and controls showed little difference in their decisions, suggesting that the amygdala does not control this aspect of risk taking.

The research was supported by the Gordon and Betty Moore Foundation, the Human Frontier Science Program, the Wellcome Trust, the National Institutes of Health, the Simons Foundation, and a global Center of Excellence grant from the Japanese government.


The Dangers Of Third-Hand Smoke

Nicotine in third-hand smoke, the residue from tobacco smoke that clings to virtually all surfaces long after a cigarette has been extinguished, reacts with the common indoor air pollutant nitrous acid to produce dangerous carcinogens. This new potential health hazard was revealed in a multi-institutional study led by researchers with the Lawrence Berkeley National Laboratory (Berkeley Lab).

“The burning of tobacco releases nicotine in the form of a vapor that adsorbs strongly onto indoor surfaces, such as walls, floors, carpeting, drapes and furniture. Nicotine can persist on those materials for days, weeks and even months. Our study shows that when this residual nicotine reacts with ambient nitrous acid it forms carcinogenic tobacco-specific nitrosamines or TSNAs,” says Hugo Destaillats, a chemist with the Indoor Environment Department of Berkeley Lab’s Environmental Energy Technologies Division. “TSNAs are among the most broadly acting and potent carcinogens present in unburned tobacco and tobacco smoke.”

Destaillats is the corresponding author of a paper published in the Proceedings of the National Academy of Sciences (PNAS) titled “Formation of carcinogens indoors by surface-mediated reactions of nicotine with nitrous acid, leading to potential third-hand smoke hazards.”

Co-authoring the PNAS paper with Destaillats were Mohamad Sleiman, Lara Gundel and Brett Singer, all with Berkeley Lab’s Indoor Environment Department, plus James Pankow with Portland State University, and Peyton Jacob with the University of California, San Francisco.

The authors report that in laboratory tests using cellulose as a model indoor material exposed to smoke, levels of newly formed TSNAs detected on cellulose surfaces were 10 times higher than those originally present in the sample following exposure for three hours to a “high but reasonable” concentration of nitrous acid (60 parts per billion by volume). Unvented gas appliances are the main source of nitrous acid indoors. Since most vehicle engines emit some nitrous acid that can infiltrate the passenger compartments, tests were also conducted on surfaces inside the truck of a heavy smoker, including the surface of a stainless steel glove compartment. These measurements also showed substantial levels of TSNAs. In both cases, one of the major products found was a TSNA that is absent in freshly emitted tobacco smoke – the nitrosamine known as NNA. The potent carcinogens NNN and NNK were also formed in this reaction.

“Time-course measurements revealed fast TSNA formation, up to 0.4 percent conversion of nicotine within the first hour,” says lead author Sleiman. “Given the rapid sorption and persistence of high levels of nicotine on indoor surfaces, including clothing and human skin, our findings indicate that third-hand smoke represents an unappreciated health hazard through dermal exposure, dust inhalation and ingestion.”

Since the most likely human exposure to these TSNAs is through either inhalation of dust or the contact of skin with carpet or clothes, third-hand smoke would seem to pose the greatest hazard to infants and toddlers. The study’s findings indicate that opening a window or deploying a fan to ventilate the room while a cigarette burns does not eliminate the hazard of third-hand smoke. Smoking outdoors is not much of an improvement, as co-author Gundel explains.

“Smoking outside is better than smoking indoors but nicotine residues will stick to a smoker’s skin and clothing,” she says. “Those residues follow a smoker back inside and get spread everywhere. The biggest risk is to young children. Dermal uptake of the nicotine through a child’s skin is likely to occur when the smoker returns and if nitrous acid is in the air, which it usually is, then TSNAs will be formed.”

The dangers of mainstream and secondhand tobacco smoke have been well documented as a cause of cancer, cardiovascular disease and stroke, pulmonary disease and birth defects. Only recently, however, has the general public been made aware of the threats posed by third-hand smoke. The term was coined in a study that appeared in the January 2009 edition of the journal “Pediatrics,” in which it was reported that only 65 percent of non-smokers and 43 percent of smokers surveyed agreed with the statement that “Breathing air in a room today where people smoked yesterday can harm the health of infants and children.”

Anyone who has entered a confined space – a room, an elevator, a vehicle – where someone recently smoked knows that the scent lingers for an extended period of time. Scientists have been aware for several years that tobacco smoke is adsorbed on surfaces, where semi-volatile and non-volatile chemical constituents can undergo reactions, but reactions of residual smoke constituents with atmospheric molecules such as nitrous acid have been overlooked as a source of harmful pollutants. This is the first study to quantify the reactions of third-hand smoke with nitrous acid, according to the authors.

“Whereas the sidestream smoke of one cigarette contains at least 100 nanograms equivalent total TSNAs, our results indicate that several hundred nanograms per square meter of nitrosamines may be formed on indoor surfaces in the presence of nitrous acid,” says lead-author Sleiman.

Co-author James Pankow points out that the results of this study should raise concerns about the purported safety of electronic cigarettes. Also known as “e-cigarettes,” electronic cigarettes claim to provide the “smoking experience,” but without the risks of cancer. A battery-powered vaporizer inside the tube of a plastic cigarette turns a solution of nicotine into a smoky mist that can be inhaled and exhaled like tobacco smoke. Since no flame is required to ignite the e-cigarette and there is no tobacco or combustion, e-cigarettes are not restricted by anti-smoking laws.

“Nicotine, the addictive substance in tobacco smoke, has until now been considered to be non-toxic in the strictest sense of the term,” says Kamlesh Asotra of the University of California’s Tobacco-Related Disease Research Program, which funded this study. “What we see in this study is that the reactions of residual nicotine with nitrous acid at surface interfaces are a potential cancer hazard, and these results may be just the tip of the iceberg.”

The Berkeley Lab researchers are now investigating the long-term stability in an indoor environment of the TSNAs produced as a result of third-hand smoke interactions with nitrous acid. The authors are also looking into the development of biomarkers to track exposures to these TSNAs. In addition, they are conducting studies to gain a better understanding of the chemistry behind the formation of these TSNAs and to find out more about other chemicals that are being produced when third-hand smoke reacts with nitrous acid.

“We know that these residual levels of nicotine may build up over time after several smoking cycles, and we know that through the process of aging, third-hand smoke can become more toxic over time,” says Destaillats. “Our work highlights the importance of third-hand smoke reactions at indoor interfaces, particularly the production of nitrosamines with potential health impacts.”

In the PNAS paper, Destaillats and his co-authors suggest various ways to limit the impact of the third-hand smoke health hazard, starting with the implementation of 100 percent smoke-free environments in public places and self-restrictions in residences and automobiles. In buildings where substantial smoking has occurred, replacing nicotine-laden furnishings, carpets and wallboard can significantly reduce exposures.

Image 2: In tests at Berkeley Lab of cellulose surfaces contaminated with nicotine residues from third-hand smoke, levels of newly formed TSNAs rose 10 times following a three hour exposure to nitrous acid. TSNAs are potent carcinogens. Credit: Photo by Roy Kaltschmidt, Berkeley Lab Public Affairs


SW Australian Drought Linked To Antarctic Snowfall

According to scientists on Sunday, a drought that has been ongoing in the southwestern region of Australia for more than 30 years has been linked to higher snowfall in East Antarctica, a phenomenon that may be tied to global warming.

The southwestern drought, where rainfall has declined 15 to 20 percent in recent years, is very unusual when compared to normal activity over the past 750 years, researchers Tas van Ommen and Vin Morgan of the Australian Antarctic Division told AFP.

In a report on the issue published online in the journal Nature Geoscience, the researchers tie the increased snowfall on Antarctica’s Law Dome icecap to the drought in southwest Australia, describing the pattern as an apparent “precipitation see-saw.”

Cool, dry air flows northwards to southwest Australia, which keeps rainfall amounts down, while warm, moist air flows into East Antarctica, providing abundant snow. This pattern is consistent with previous studies that tied the events to man-made factors that may have played a role in drought.

Other earlier studies also pointed to greenhouse gases that may have changed the Southern Annular Mode, a key feature of atmospheric circulation in the southern hemisphere.

Image Caption: Law Dome. Photo: Tas van Ommen


New Technique Easily Makes Stem Cells Pluripotent

Tiny circles of DNA are the key to a new and easier way to transform stem cells from human fat into induced pluripotent stem cells for use in regenerative medicine, say scientists at the Stanford University School of Medicine. Unlike other commonly used techniques, the method, which is based on standard molecular biology practices, does not use viruses to introduce genes into the cells or permanently alter a cell’s genome.

It is the first example of reprogramming adult cells to pluripotency in this manner, and is hailed by the researchers as a major step toward the use of such cells in humans. They hope that the ease of the technique and its relative safety will smooth its way through the necessary FDA approval process.

“This technique is not only safer, it’s relatively simple,” said Stanford surgery professor Michael Longaker, MD, and co-author of the paper. “It will be a relatively straightforward process for labs around the world to begin using this technique. We are moving toward clinically applicable regenerative medicine.”

The Stanford researchers used the so-called minicircles – rings of DNA about one-half the size of those usually used to reprogram cells – to induce pluripotency in stem cells from human fat. Pluripotent cells can then be induced to become many different specialized cell types. Although the researchers plan to first use these cells to better understand – and perhaps one day treat – human heart disease, induced pluripotent stem cells, or iPS cells, are a starting point for research on many human diseases.

“Imagine doing a fat or skin biopsy from a member of a family with heart problems, reprogramming the cells to pluripotency and then making cardiac cells to study in a laboratory dish,” said cardiologist Joseph Wu, MD, PhD. “This would be much easier and less invasive than taking cell samples from a patient’s heart.” Wu is the senior author of the research, which was published online Feb. 7 in Nature Methods. Research assistant Fangjun Jia, PhD is the lead author of the work.

Longaker is the deputy director of Stanford’s Institute for Stem Cell Biology and Regenerative Medicine and director of children’s surgical research at Lucile Packard Children’s Hospital. Wu is an assistant professor of cardiology and of radiology, and a member of Stanford’s Cardiovascular Institute. A third author, Mark Kay, MD, PhD, is the Dennis Farrey Family Professor in Pediatrics and professor of genetics.

The finding brings together disparate areas of Stanford research. Kay’s laboratory invented the minicircles several years ago in a quest to develop suitable gene therapy techniques. At the same time, Longaker was discovering the unusual prevalence and developmental flexibility of stem cells from human fat. Meanwhile, Wu was searching for ways to create patient-specific cell lines to study some of the common, yet devastating, heart problems he was seeing in the clinic.

“About three years ago Mark gave a talk and I asked him if we could use minicircles for cardiac gene therapy,” said Wu. “And then it clicked for me, that we should also be able to use them for non-viral reprogramming of adult cells.”

The minicircle reprogramming vector works so well because it is made of only the four genes needed to reprogram the cells (plus a gene for a green fluorescent protein to track minicircle-containing cells). Unlike the larger, more commonly used DNA circles called plasmids, the minicircles contain no bacterial DNA, meaning that the cells containing the minicircles are less likely than plasmids to be perceived as foreign by the body. The expression of minicircle genes is also more robust, and the smaller size of the minicircles allows them to enter the cells more easily than the larger plasmids. Finally, because they don’t replicate they are naturally lost as the cells divide, rather than hanging around to potentially muck up any subsequent therapeutic applications.

The researchers chose to test the reprogramming efficiency of the minicircles in stem cells from human fat because previous work in Wu and Longaker’s lab has shown that the cells are numerous, easy to isolate and amenable to the iPS transformation, probably because of the naturally higher levels of expression of some reprogramming genes. They found that about 10.8 percent of the stem cells took up the minicircles and expressed the green fluorescent protein, or GFP, versus about 2.7 percent of cells treated with a more traditional DNA plasmid.

When the researchers isolated the GFP-expressing cells and grew them in a laboratory dish, they found that the minicircles were gradually lost over a period of four weeks. To be sure the cells got a good dose of the genes, they reapplied the minicircles at days four and six. After 14 to 16 days, they began to observe clusters of cells resembling embryonic stem cell colonies – some of which no longer expressed GFP.

They isolated these GFP-free clusters and found that they exhibited all of the hallmarks of induced pluripotent cells: they expressed embryonic stem cell genes, they had similar patterns of DNA methylation, they could become multiple types of cells and they could form tumors called teratomas when injected under the skin of laboratory mice. They also confirmed that the minicircles had truly been lost and had not integrated into the stem cells’ DNA.

Altogether, the researchers were able to make 22 new iPS cell lines from adult human adipose stem cells and adult human fibroblasts. Although the overall reprogramming efficiency of the minicircle method is lower than that of methods using viral vectors to introduce the genes (about 0.005 percent vs. about 0.01-0.05 percent, respectively), it still surpasses that of using conventional bacterial-based plasmids. Furthermore, stem cells from fat, and, for that matter, fat itself, are so prevalent that a slight reduction in efficiency should be easily overcome.

“This is a great example of collaboration,” said Longaker. “This discovery represents research from four different departments: pediatrics, surgery, cardiology and radiology. We were all doing our own things, and it wasn’t until we focused on cross-applications of our research that we realized the potential.”

“We knew minicircles worked better than plasmids for gene therapy,” agreed Kay, “but it wasn’t until I started talking to stem cell people like Joe and Mike that we started thinking of using minicircles for this purpose. Now it’s kind of like ‘why didn’t we think of this sooner?'”

In addition to Longaker, Wu, Kay and Jia, other Stanford researchers involved in the work include Kitchener Wilson, MD; Ning Sun, PhD; Deepak Gupta, MD; Mei Huang, PhD; Zongjin Li, MD, PhD; Nicholas Panetta, MD; Zhi Ying Chen, PhD; and Robert Robbins, MD.

The research was supported by the Mallinckrodt Foundation, the National Institutes of Health, the Burroughs Wellcome Foundation, the American Heart Association, the California Institute for Regenerative Medicine, the Oak Foundation and the Hagey Laboratory for Pediatric Regenerative Medicine.

Boredom Can Kill

According to new scientific research, boredom could actually kill you.

Researchers say that people who complain of boredom are more likely to die at younger ages. Those who experience excessive monotony are more than two-and-a-half times as likely to die from heart disease or stroke as those who keep their minds and bodies engaged on a regular basis.

The study was conducted on more than 7,000 civil servants over a 25-year period.

Those who actively complained of boredom were nearly 40 percent more likely to have died by the end of the study than those who did not.

Scientists said that those who felt unhappy with their lives often turned to unhealthy choices, including smoking, drinking and over-eating, which cut their life expectancy.

The data, from 7,524 civil servants between the ages of 35 and 55 who were interviewed between 1985 and 1988, were analyzed by specialists from the Department of Epidemiology and Public Health at University College London. They then checked the figures to see who had died by April 2009.

The report, to be published in the International Journal of Epidemiology this week, was co-written by researcher Martin Shipley, who said: “The findings on heart disease show there was sufficient evidence to say there is a link with boredom.” He added that it was important for people who have dull jobs to find other interests to keep boredom from taking over their lives.

Psychologist Graham Price told Mail Online that it is important to determine first whether people are turning to drinking and drugs because of boredom, or whether they have certain predetermined characteristics that lead them to unhealthy practices. People who are unmotivated in, or by, life should find ways to shift their focus away from themselves and onto other people. That is where inspiration may lie to remove the dullness and tedium from one’s life.

The original survey found that one in ten civil servants had been bored within the past month. Women were more than twice as likely as men to suffer boredom. Younger employees and those with more menial jobs were also found to be more prone to boredom. Those who reported tedious feelings on a regular basis were 37 percent more likely to have died by the end of the study.

Bats Hit Their Target By Not Aiming At It

New research conducted at the University of Maryland’s bat lab shows Egyptian fruit bats find a target by NOT aiming their guiding sonar directly at it. Instead, they alternately point the sound beam to either side of the target. The new findings by researchers from Maryland and the Weizmann Institute of Science in Israel suggest that this strategy optimizes the bats’ ability to pinpoint the location of a target, but also makes it harder for them to detect a target in the first place.

“We think that this tradeoff between detecting an object and determining its location is fundamental to any process that involves tracking an object, whether done by a bat, a dog or a human, and whether accomplished through hearing, smell or sight,” said coauthor Cynthia Moss, a University of Maryland professor of psychology, who directs interdisciplinary bat echolocation research in the university’s Auditory Neuroethology Lab, better known as the bat lab.

Moss, colleagues Nachum Ulanovsky and Yossi Yovel of the Weizmann Institute, and Ben Falk, a graduate student of Moss’s in Maryland’s Neuroscience and Cognitive Science Program, published their findings in this week’s edition of the journal Science. Ulanovsky, the paper’s corresponding author, was a Maryland postdoctoral student under Moss.

For this research, Ulanovsky and Yovel trained fruit bats to land on a spherical target while relying exclusively on their sonar. Trained in Israel, the bats were then brought to Maryland to be studied in Moss’s specialized lab. High-speed infrared cameras recorded the bats’ movements in flight while the shape and direction of their sonar beam patterns were measured with a sensitive arrangement of 20 microphones positioned around the large room. These bats emit paired clicking sounds, and the researchers found that the sonar beam created by each click alternated to the left and right of the target. This alternating pattern effectively directed the inside edge, or maximum slope, of each sonar beam onto the target. As a result, any change in the relative position of the target to the bat reflected that steep sonar edge back at the bat, delivering the largest possible change in echo intensity.

However, as the researchers note, there is a cost to this approach: less sound is reflected back to the bat from the object than if the sound beam were aimed more directly toward the object. Thus the fruit bat’s strategy of using the steepest edge of a sonar beam (which intuitively follows a mathematical optimization formula) sacrifices a little target-detection for pinpoint accuracy in tracking.
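The cost-benefit logic here can be made concrete with a toy calculation. The sketch below is purely illustrative and is not taken from the Science paper: it assumes an idealized Gaussian sonar beam (the BEAM_WIDTH value is a hypothetical placeholder) and compares two aiming strategies, pointing the beam axis straight at the target versus placing the target on the beam’s steepest edge.

    import math

    BEAM_WIDTH = 0.5  # assumed Gaussian beam width in radians; hypothetical, not from the study

    def echo_intensity(offset):
        # Relative echo strength when the beam axis points `offset` radians away from the target.
        return math.exp(-offset ** 2 / (2 * BEAM_WIDTH ** 2))

    def angular_sensitivity(offset):
        # |d(intensity)/d(angle)|: how much the echo changes if the target shifts slightly.
        return abs(offset) / BEAM_WIDTH ** 2 * echo_intensity(offset)

    # Strategy A: aim the beam axis directly at the target (offset = 0).
    # Strategy B: put the target on the beam's steepest edge (offset = one beam width).
    for label, offset in (("aim directly", 0.0), ("aim steepest edge", BEAM_WIDTH)):
        print(f"{label}: intensity={echo_intensity(offset):.2f}, "
              f"sensitivity={angular_sensitivity(offset):.2f}")

In this simple model, aiming directly returns the full echo (best for detection) but almost no change in echo strength when the target moves, while aiming the steepest edge returns roughly 60 percent of the echo yet the largest possible change per unit of target motion, mirroring the detection-versus-localization tradeoff the researchers describe.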

Detecting vs. Tracking

By changing the conditions of their experiments in the bat lab, the researchers were able to show that the fruit bats will sometimes change their echolocation strategy based on the situation. To do this, they positioned a reflecting board behind the target, creating “noise” echoes that competed with those from the target and potentially made detection of the target more difficult. In trials under these conditions, some bats altered their sound-beam strategy. These bats started off with the wide side-to-side pointing that maximizes localization, but once they got closer, they switched to pointing the beams from both clicks almost directly at the target.

Moss, who has been researching bat echolocation for 20 years, notes that even among different bat species there are different approaches to tracking objects. Much of the research she and her students have conducted in the bat lab has looked at big brown bats, which are common in the United States. Big brown bats generate sound with their vocal cords and aim the resulting sound beam directly at their targets, an approach that maximizes their ability to detect the mosquitoes and other small, fast-moving insects that serve as food.

“The difference in sonar beam-directing strategies between the Egyptian fruit bats and big brown bats may be related to the differences between their sound production mechanisms (tongue clicks vs. vocalizations), echo processing systems, behavioral requirements in nature, or other species differences,” said Moss, a former director of the Neuroscience and Cognitive Science Program, who also holds an appointment in the university’s Institute for Systems Research.

The researchers suggest that other animals use similar approaches to the detection-versus-tracking tradeoff. Their paper gives the example of a hound dog following an odor trail using a localization approach similar to that of the tongue-clicking Egyptian fruit bats.

“People without sight use echolocation, in some cases also generating sound by tongue clicking,” said Moss. One such person, she noted, is Daniel Kish, Executive Director of World Access for the Blind. Totally blind from birth, Kish uses tongue clicking for sonar that allows him to “see” his environment at a very high level, even allowing him to safely ride a bike in a city street. Kish is “the first totally blind, national certified Orientation & Mobility Specialist,” according to the website for World Access for the Blind.

“There are no measurements of the directionality of the sound beams used by blind echolocators like Kish,” Moss said. “But looking at head movements in echolocating blind individuals, it seems that some may show a similar strategy [to that of fruit bats].”

Melatonin Precursor Stimulates Growth Factor Circuits In Brain

Scientists at Emory University School of Medicine have discovered unexpected properties for a precursor to melatonin, the hormone that regulates sleep cycles.

Melatonin is produced from the neurotransmitter serotonin in a daily rhythm that peaks at night. Melatonin’s immediate precursor, N-acetylserotonin, was not previously thought to have effects separate from those of melatonin or serotonin.

Now an Emory team has shown that N-acetylserotonin can stimulate the same circuits in the brain activated by the growth factor BDNF (brain-derived neurotrophic factor).

The results will be published online this week in the Proceedings of the National Academy of Sciences.

The team was led by Keqiang Ye, associate professor of pathology and laboratory medicine, and P. Michael Iuvone, professor of pharmacology and director of research at Emory Eye Center. Researchers from Morehouse School of Medicine and the University of Wisconsin contributed to the paper.

The discovery has implications for the study of how some antidepressants function and may also explain previous observations that N-acetylserotonin has antidepressant activity in animal models of depression.

“Our results suggest that the molecules and pathways involved in mood regulation and circadian rhythms are intertwined,” Ye says.

A lack of BDNF, which pushes brain cells to grow and helps them resist stress, is thought to lie behind depression and several neurodegenerative diseases. Ye and his colleagues have been searching for chemicals that can mimic BDNF by activating TrkB, the receptor for BDNF on cells’ surfaces.

Several widely prescribed antidepressants (selective serotonin reuptake inhibitors such as fluoxetine/Prozac) increase levels of serotonin in the brain, but the connections between serotonin levels and depression are complex. Because antidepressants seem to take weeks to display their effects, scientists have proposed that their real targets are BDNF and TrkB.

“We were exploring whether the serotonin system is involved in TrkB signaling,” Ye says. “We were surprised to find that N-acetylserotonin, but not serotonin or melatonin, can activate TrkB.”

 N-acetylserotonin could stimulate TrkB even when BDNF was not present, both in cell culture dishes and in mice, Ye and his colleagues found. It could also protect neurons from overstimulation in the same way that BDNF can.

Melatonin is produced at several sites in the body: the pineal gland, the retina and the intestine. One of the most common strains of laboratory mice (C57Bl6) is deficient in making N-acetylserotonin and melatonin and develops retinal degeneration.

The authors observed that in the retinas of mice that produce adequate melatonin, TrkB is turned on at night, a pattern that matches the appearance of N-acetylserotonin. However, the pattern of TrkB activation is flat in C57Bl6 melatonin-deficient mice.

Ye’s laboratory is now investigating the mechanism by which N-acetylserotonin activates TrkB. He says that N-acetylserotonin has a short lifetime in the body but similar compounds that are more stable may be useful in treating neurological diseases.

The research was supported by the National Institutes of Health and Research to Prevent Blindness.

Pig Lungs One Step Closer To Human Transplant

A medical breakthrough has brought the use of pig lungs in transplants to humans a step closer.

Scientists in Melbourne, Australia, used a ventilator and pump to keep the animal lungs alive and “breathing” while human blood flowed through them, the Telegraph recently reported.

Experts believe that within five years, this work might lead to the first animal-human transplant.

Dr Glenn Westall, who helped conduct the experiment, said: “The blood went into the lungs without oxygen and came out with oxygen, which is the exact function of the lungs.

“It showed that these lungs were working perfectly well and doing as we were expecting them to do.

“This is a significant advance compared to experiments that have been performed over the past 20 years.”

The breakthrough came when scientists removed a section of pig DNA that had made the pig organs incompatible with human blood.

A couple of years ago, scientists’ attempts to combine unmodified pig lungs and human blood ended abruptly when blood clots began forming almost immediately, blocking the organs so completely that no blood could pass through.
 
Human DNA is now added to the pigs as they are reared, reducing clotting and the number of lungs that are rejected.

The full results of the research are to be announced in Vancouver in August.

The issue has prompted an ethical debate about the use of animals for the sake of human transplants.

Professor Tonti-Fillippini, a medical ethicist, said: “It is basically a human-pig, a hybrid, or whatever you want to call it.”

“It is about whether the community is prepared to accept a part human, part animal.”

Pill Could Prolong Life

Scientists claim to have created a drug that will allow people to live beyond the age of 100.

The drug, based on three genes that extend life and prevent diseases such as Alzheimer’s, will help even the obese and smokers reach the century mark.

According to the researchers, even being obese or a heavy smoker need not limit a carrier’s chances of reaching 100 or of avoiding such diseases.

A research team from the Institute for Aging Research at the Albert Einstein College of Medicine in New York found the three genes by examining the DNA of 500 Ashkenazi Jews (Jews of German and Eastern European descent) with an average age of 100.

They were chosen after previous studies found the group to have a very specific genetic footprint because their bloodline had been kept very pure.

The study subjects shared a few common ancestors who had longevity genes; without such genes, the chances of living this long are extremely rare, about one in 10,000, the researchers said.

Dr. Nir Barzilai, who led the New York study, said several labs were now racing to create a pill that mimics these genes, and he hopes the drug will be on the market within the next three years.

“The advantage of finding a gene that involves longevity is we can develop a drug that will imitate what this gene is doing. If we can imitate that, then long life can be terrific,” he said.

Professor Judith Phillips says, “It’s a huge opportunity because the ageing population is growing anyway. They would be a huge resource because people would be able to work longer and they would have a healthier life, and it would revolutionize the way we look at older people. And it would reduce costs in terms of care.”

Of the three longevity genes discovered, two are said to enhance the production of “good” cholesterol, which in turn minimizes the risk of heart disease, Alzheimer’s, and stroke. The third gene is linked to reduced diabetes risk.

But Andrew Ketteringham, of the Alzheimer’s Society, said, “Alzheimer’s disease, the most common dementia, is likely to be caused by a combination of genetic disposition, lifestyle and life events. Many genes are probably involved.”

Malaria’s Key Survival Protein Identified, Offering Drug Hope

Walter and Eliza Hall Institute researchers have identified a key protein used by the malaria parasite to transform human red blood cells, ensuring the parasite’s survival.

Their discovery means researchers have a clear target against which to develop a new class of anti-malarial drugs that destroy the parasite.

Each year more than 400 million people contract malaria, and more than one million people, mostly children, die from the disease. The most lethal form of the disease is caused by the parasite Plasmodium falciparum, which invades red blood cells and drastically modifies them so it can survive.

Professor Alan Cowman, head of the institute’s Infection and Immunity division, said the parasite remodels the red blood cells by exporting hundreds of so-called ‘effector’ proteins into the cytoplasm of the red blood cell. “These are key virulence proteins that allow the parasite to survive in the human and hide from the human immune system,” Professor Cowman said.

“There has to be a mechanism that allows these effector proteins to be exported but until now we haven’t known what it is.”

Dr Justin Boddey, Dr Tony Hodder, Dr Svenja Gunther, Dr Andrew Pearce and Professor Cowman from the institute, in collaboration with Professor Richard Simpson, Dr Heather Patsiouras and Dr Eugene Kapp of the Ludwig Institute for Cancer Research, Professor Brendan Crabb and Paul Gilson at the Burnet Institute and Dr Tania de Koning-Ward at Deakin University, have identified a protein called Plasmepsin V as being essential for effector proteins to be exported into the red blood cell.

Their research has been published today in the international journal Nature.

Professor Cowman said experimentation had shown that the action of Plasmepsin V on the effector proteins was the first step in priming the proteins to be exported beyond the parasite’s membrane into the red blood cell cytoplasm.

“Plasmepsin V is responsible for determining that all the hundreds of effector proteins are exported. If we could find drugs to block Plasmepsin V the malaria parasite would die,” he said.

Professor Cowman said that because Plasmepsin V was a protease, it was an attractive drug target.

“Drugs that target proteases have been very effective in combating HIV so, by analogy, drugs that impede the function of Plasmepsin V should also show promise,” he said.

Back To Work Policies Need Gender Awareness

New research published in Critical Social Policy

UK programs designed to help the unemployed get back to work and support young parents are losing impact because they are not designed with the participants’ gender in mind. Men as well as women can lose out as a result of ‘gender blind’ policies. This finding, based on two case studies in the North East of England, appears in Critical Social Policy this week, published by SAGE.

The paper, “‘Striking out’: Shifting labor markets, welfare to work policy and the renegotiation of gender performances,” details stay-at-home dads gaining skills and negotiating gender constructs to find work in the childcare sector, only to find their communities reluctant to offer childcare work to men. Meanwhile, teenage dads are under-supported both in their new role as parents and in their position of responsibility as wage earners.

Since 1997, the English government has been committed to the inter-related policy aims of reducing health inequalities and tackling social exclusion. According to the paper’s authors, Katherine Smith from the University of Bath and Clare Bambra and Kerry Joyce from the University of Durham, initiatives have largely been focused on the supply side (aimed at potential employees), and have been ‘gender blind’. The underlying assumption is that unemployed men and women can get back to work when they receive the right combination of training and support.

The authors used data from qualitative case studies of two interventions in the North-East of England. One study offered unemployed parents childcare training, and the other provided vocational and advisory support to young parents.

The first case study investigates a ‘gender blind’ intervention aiming to get the unemployed back to work, which encouraged unemployed fathers to undertake childcare training in order to pursue careers in childcare. The policy behind the intervention did not intend to challenge traditional gender roles. The participants found positive ways to negotiate their gender identities, so that they were able to envisage themselves gaining work in childcare. Unfortunately, the intervention did not support the men in this, or in dealing with negative community attitudes toward men working in childcare. The study highlights an oversight: in this labor market, working-age men were a target group for the intervention, yet the jobs available were in a sector dominated by women.

Teenage mothers often hit the headlines, but surprisingly little is known about their babies’ fathers. Despite the common stereotype of an invisible or absent teen father, recent research suggests that large numbers of teenage fathers do play a positive role in their children’s well being.

The second intervention provided vocational and advisory support to young parents. Legislation’s main focus has been on facilitating women’s dual roles as both mothers and employees, despite recent policy moves to acknowledge the importance of fathers’ involvement in re-conceptualizing relations between parents and children.

The short-sightedness of ‘gender blind’ policies has been noted before, but the authors point out that a common assumption is that gender blind means ‘male-centered’. The first case study highlights the ineffectiveness of training the unemployed with a gender blind approach, such that their new skills may go unused. The findings from the second suggest that interventions aiming to support young parents should consider the benefits of targeted support for young fathers in addition to that offered to young mothers.

“It is important not to assume that policy biases which favor some men in some situations will necessarily favor all men in all situations,” the authors argue. Not only should policies take socially excluded men into greater consideration, but policymakers also need to “consciously and consistently reflect on the potential impact of all social policies on gender relations.”

Although this may benefit more women than men, men cannot be ignored, not only because tackling gender inequality requires their participation, but also because men’s lives are often closely intertwined with those of their families.

The UK’s Labour government is aware of the importance of gender to employment and social policy. A publication by the Social Exclusion Unit (1999) directly suggested that policy responses to some social problems need to be more sensitive to gender issues, and the recent Equality Bill (2008) demonstrates a continuing awareness of the need for gender parity. This paper and other recent research suggest that this awareness remains at best aspirational. Gender must be re-introduced and re-emphasized in social policy debate, and policies should be audited for their impact on gender.

Blood-Nerve Barrier Model Allows Closer Look At Diseases Affecting Peripheral Nerves

The cells regarded as the “gate-keepers” of the barrier between blood circulation and the peripheral nerves have been hard to study and even harder to isolate. However, researchers at Baylor College of Medicine have created a laboratory model that will enable scientists to study a wide variety of diseases affecting peripheral nerves.

They describe their model in the January 2010 issue of the Journal of Neuropathology and Experimental Neurology.

Specialized vascular system

“The barrier is known as the blood-nerve barrier and it regulates how peripheral nerves work. Peripheral nerves connect the central nervous system to the muscles of the limbs and sensory organs. This ‘gate keeper’ is a specialized vascular system that allows for proper nerve function by enabling the necessary nutrients in blood to flow in and unwanted material out,” said Dr. Eroboghene E. Ubogu, assistant professor of neurology and director of the Neuromuscular Immunopathology Research Laboratory at BCM.

Ubogu, who is the senior author on the study, added that very little is known about how the human blood-nerve barrier normally works or how it is altered when the peripheral nerves are diseased. The cells that make up the blood-nerve barrier are hard to study and extract because they are surrounded by a large amount of connective tissue, are present deep within the innermost layers and represent less than 1 percent of all cells found in peripheral nerves.

Ubogu and his research colleagues, including lead author Nejla Yosef, a research assistant, and Dr. Robin H. Xia, a postdoctoral research associate, both in the department of neurology at BCM, began by isolating these specialized blood vessel cells from the sciatic nerve, the largest nerve in the body, which runs along the back of the thigh.

“It started as trial and error since methods for this type of work had not been outlined for human peripheral nerves,” Ubogu said. “We looked at how other blood vessel and nerve cells were isolated from humans and other animals and modified those protocols until we achieved our goal.”

It took more than six tries of a process involving multiple steps before Ubogu and his team were successful in isolating the blood vessel cells that make up the blood-nerve barrier.

Prior to developing the blood-nerve barrier model, Ubogu and his colleagues used several laboratory methods to verify that these specialized blood vessel cells, called primary human endoneurial endothelial cells, were the cells that formed blood vessels within the innermost layer of peripheral nerves.

Better view of diseases

These cells were grown in laboratory dishes and used to develop a blood-nerve barrier model system that behaves very similarly to what is seen or expected in humans. This model will allow researchers to study how substances dissolved in the bloodstream, large molecules, drugs, microorganisms and white blood cells are able to enter or exit the peripheral nerves, and why their movements may be restricted or permitted in times of health or disease.

“We can now see the gate, and if we understand how it is locked, opened and closed, we may be able to treat certain nerve diseases more effectively or even prevent them,” said Ubogu.

This model will give researchers a better view of how diseases such as HIV and diabetes affect the peripheral nervous system. Guillain-Barré syndrome and chronic inflammatory demyelinating polyneuropathy (peripheral nerve inflammation that leads to a loss of movement or sensation) are also disorders that can be further investigated because of this research. A better understanding of how drugs get into peripheral nerves is also possible with this model.

“I would like research collaborations to grow from these findings,” Ubogu said. “The hope is that labs already studying peripheral nerve function and disease will be able to use the model to further their work.”

All researchers are with the Neuromuscular Immunopathology Research Laboratory at BCM.

The study was supported by a Guillain-Barré Syndrome/Chronic Inflammatory Demyelinating Polyradiculoneuropathy Foundation International Research Grant and by the BCM New Investigator Start-Up Program.
