Student Team Invents Device To Cut Dialysis Risk

Johns Hopkins University graduate students have invented a device to reduce the risk of infection, clotting and narrowing of the blood vessels in patients who need blood-cleansing dialysis because of kidney failure.

The device, designed to be implanted under the skin in a patient’s leg, would give a technician easy access to the patient’s bloodstream and could be easily opened and closed at the beginning and end of a dialysis procedure.

The prototype has not yet been used in human patients, but testing in animals has begun.

The students learned about the need for such a device last year while accompanying physicians on hospital rounds as part of their academic program. They watched as one doctor performed a procedure to open a narrowed blood vessel at a kidney patient’s dialysis access site. They learned that this narrowing was a common complication facing kidney patients.

The students discovered that each year kidney failure requires 1.5 million people globally, including 350,000 in the United States, to undergo regular hemodialysis to prevent a fatal buildup of toxins in the bloodstream. The students also learned that the three most common ways to connect the machine to a patient’s bloodstream work only for a limited time because of problems with infection, blood clots and narrowing of the blood vessels. Current dialysis access options are “grossly inadequate,” contributing to increased healthcare expenses and, in some cases, patient deaths, the students say.

To address these problems, the students developed an access port that can be implanted in the leg beneath the skin, reducing the risk of infection. The Hemova Port’s two valves can be opened by a dialysis technician with a syringe from outside the skin. The technician can similarly close the valves when the procedure is over, an approach that helps avoid infection and clotting. The device also includes a simple cleaning system, serving as yet another way to deter infections.

Currently, most dialysis access sites are in the arm or the heart. The Hemova device instead is sutured to the leg’s femoral vein, avoiding the unnaturally high blood flows that cause vessel narrowing when dialysis machines are connected to veins and arteries in the arm. The student inventors say the Hemova Port’s leg connection should allow the site to remain in use for a significantly longer period of time.

The port won a $10,000 first prize for Johns Hopkins graduate students in the 2011 ASME Innovation Showcase. The competition, involving 10 collegiate teams, was conducted in Texas earlier this month at the annual meeting of ASME, founded in 1880 as the American Society of Mechanical Engineers. Judges based their awards on technical ingenuity, quality of business plans, potential for success in the marketplace and other factors.

The five biomedical engineering students on the team were enrolled in a one-year master’s degree program in the university’s Center for Bioengineering Innovation and Design. Sherri Hall, Peter Li, Shishira Nagesh, Mary O’Grady and Thora Thorgilsdottir all recently graduated, but Li has remained in Baltimore to form a company that will continue to test and develop the project.

With help from the Johns Hopkins Technology Transfer staff, the team has filed for three provisional patents covering their technology. The patents list the five students and three medical faculty advisers as the inventors.

“Winning first place in the ASME competition is a great honor,” Li said. “The award and the recognition will go a long way toward helping to continue further research, and we hope it will bring us closer to the day when our device is available to help dialysis patients.”

The Hemova team has applied for a $50,000 grant to conduct more animal testing in the coming months. Clinical trials involving human patients could begin as soon as 2013, the students said.

Intensive, Hands-On Effort Reduces Bloodstream Infections In Critically Ill Patients

Nurses drive quality initiative; save lives, money

Nurses on a surgical intensive care unit (SICU) at a large academic medical center cut bloodstream infections to zero and saved more than $200,000 during a six-month period.

The University of Maryland Medical Center SICU sustained a rate of zero central line-associated bloodstream infections (CLABSIs) for a 25-week period, eliminating 14 CLABSIs and saving 2-3 lives when compared to the same time period in the previous year, according to results of an intensive, six-month nursing initiative presented today at the 38th Annual Educational Conference and International Meeting of the Association for Professionals in Infection Control and Epidemiology (APIC).

To address the problem of higher-than-average CLABSI rates on the 19-bed unit, the hospital appointed dedicated infection control nurses (ICNs) to oversee central line catheter insertions. The effort was conducted in partnership with the director of Medical Surgical Nursing, Christina Cafeo. An ICN was present during every central line insertion and was trained to call out breaks in technique and breaches in hand hygiene, and to perform daily assessments of central line dressings, looking for signs of infection. The nursing staff, led by the ICN, held daily educational meetings, came up with clever reminders for best practices, and created incentive programs to keep the team motivated and engaged. They also removed excess clutter from patient rooms and hallways so the spaces would be easier to clean.

“It was truly a back-to-basics effort: these were just best practices at a granular level, led by the unit themselves. The nurses on the unit took ownership of best practices and drove the change,” said Michael Anne Preas, RN, BSN, CIC, infection preventionist at the University of Maryland Medical Center (UMMC) and co-leader of the improvement project. “When you have one of your own in the lead, and are reminding each other and encouraging each other to do your best, everybody gets on board, and that is what we saw.”

The average cost of a CLABSI is estimated to be $18,432. By eliminating 14 CLABSIs, Preas’s team saved $258,048, less $44,000 for a nurse’s salary for six months, resulting in a net savings to the hospital of $214,048. The initiative took place from July to December 2010.
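
For readers who want to check the arithmetic, it reduces to two lines; the sketch below simply re-runs the article’s figures in Python.

```python
# Reproducing the savings arithmetic reported above (all figures from the article).
cost_per_clabsi = 18_432          # estimated average cost of one CLABSI, in dollars
infections_eliminated = 14        # CLABSIs eliminated during the initiative
icn_salary_six_months = 44_000    # six months of a dedicated nurse's salary

gross_savings = cost_per_clabsi * infections_eliminated  # $258,048
net_savings = gross_savings - icn_salary_six_months      # $214,048
print(f"gross: ${gross_savings:,}  net: ${net_savings:,}")
```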

“This is truly an example of taking infection prevention directly to the patient’s bedside,” said Russell N. Olmsted, MPH, CIC, APIC 2011 president. “Kudos to UMMC for showcasing the power of prevention right here in the host city for APIC 2011.”

Mammography Screening Reduces Breast Cancer Mortality

Breast cancer screening with mammograms significantly reduces the number of breast cancer deaths, according to a long-term Swedish study.

The long-running and largest-ever breast cancer screening study has shown that regular mammograms have increased the number of lives saved over time, the research team said on Tuesday.

The study, published online in the journal Radiology, followed more than 130,000 women in two communities in Sweden. An international team of researchers involved in the study found that 30 percent fewer women in the screening group died of breast cancer and the effect persisted year after year.

Now, 29 years after the study began, they found that the number of women saved from the cancer goes up with each year of screening.

“We’ve found that the longer we look, the more lives are saved,” Professor Stephen Duffy of Queen Mary, University of London, told Reuters in a statement.

“Mammographic screening confers a substantial relative and absolute reduction in breast cancer mortality risk in the long-term,” Duffy said in a statement. “For every 1,000 to 1,500 mammograms, one breast cancer death is prevented.”

The Swedish Two-County Trial was the first breast cancer screening trial to show a reduction in breast cancer mortality from screening and mammography alone. The trial randomly separated 133,065 women into two groups, one that received screening and another that received usual care. The screening phase of the trial lasted about 7 years. Women between the ages of 40 and 49 were screened, on average, every 2 years. Women ages 50 to 74 were screened, on average, every 33 months.

Researchers, nearly 30 years after the trial began, analyzed the original data and the follow-up data to estimate the long-term effect of mammography screening on breast cancer mortality. At 29 years, this represents the longest recorded follow-up period for a mammographic screening study.

Mortality analysis at follow-up points showed a reduction in the breast cancer mortality rate in the screening population, similar to the original trial results. But while the relative effect of screening on breast cancer mortality remained stable over the follow-up period, the benefit in terms of lives saved increased with longer follow-up times. At 29 years of follow-up, the estimated number of women needed to undergo screening every 2 or 3 years over a seven-year period to prevent one breast cancer death ranged from 414 to 519.
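
Those figures line up with Duffy’s earlier “1,000 to 1,500 mammograms” estimate. As a rough cross-check built only from the numbers quoted here (not the study’s own calculation), 414 to 519 women screened every two to three years over a roughly seven-year period each receive about two to three and a half mammograms:

```python
# Rough cross-check: mammograms corresponding to one prevented death,
# from the quoted number-needed-to-screen range (414-519 women) and the
# quoted screening intervals. Illustrative only, not the study's method.
SCREENING_PERIOD_YEARS = 7

for women in (414, 519):
    for interval_years in (2, 3):
        mammograms = women * SCREENING_PERIOD_YEARS / interval_years
        print(f"{women} women every {interval_years} yr: "
              f"~{mammograms:,.0f} mammograms per death prevented")
```

The result, on the order of 1,000 to 1,800 mammograms, is broadly consistent with the quoted estimate.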

“Most of the deaths prevented would have occurred more than 10 years after the screening started,” said Duffy. “This indicates that the long-term benefits of screening in terms of deaths prevented are more than double those often quoted for short-term follow-up.”

“Unfortunately, we cannot know for certain who will and who will not develop breast cancer,” he noted. “But if you undergo a recommended screening regimen, and you are diagnosed with breast cancer at an early stage, chances are very good that it will be successfully treated.”

Dr. Stamatia Destounis, a radiologist at Elizabeth Wende Breast Care in Rochester, New York, who was not involved in the study, told Reuters that radiologists have been quoting results of the Swedish study for years and the new findings show breast cancer screening is “even more of a benefit than we understood.”

Sweeping changes in the US screening guidelines two years ago that scaled back recommendations on breast cancer screening caused confusion among doctors and patients about the benefits of screening. “We’ve had to do a lot of education of the patients and their doctors. This will help for that,” said Destounis.

New breast screening recommendations issued in 2009 by the US Preventive Services Task Force recommended against routine mammograms for women in their 40s and said women in their 50s should get them every other year instead of every year.

Those guidelines contradicted years of messages about the need for routine breast cancer screening starting at age 40, bringing forth protests from breast cancer experts and advocacy groups who argued the recommendation for fewer screenings would confuse women and result in more deaths from breast cancer.

The latest results from the Swedish study show the rate of false positive results was low.

“We saw the actual number of over-diagnosed cases was really very small — less than 5 percent of the total,” Robert Smith, director of cancer screening at the American Cancer Society and one of the study’s authors, told Reuters in a telephone interview.

The American Cancer Society, among others, has stuck by its long-standing guidelines of yearly mammograms for women in their 40s, stressing that the breast X-rays have been proven to save lives by spotting tumors early, when they are most easily treated.

“I think for anybody who was beginning to have their faith shaken in the value of mammography, these data show mammography is quite valuable as a public health approach to reducing deaths from breast cancer,” said Smith.

Screening women 40 to 54 every 18 months and screening women 55 and older every two years would be a reasonable plan, Duffy said.

The new findings may not speak to the frequency of screening issue, but they do make clear that screening works. “Everyone must make up their own mind, but certainly from combined results from all the screening trials, mammography in women aged 40-49 does reduce deaths from breast cancer,” he said.

Breast cancer is the second-leading cause of cancer death in US women, after lung cancer. Worldwide, more than half a million people die each year from breast cancer and 1.3 million are diagnosed.

US Works To Protect Businesses From Attack

The growing threat of cyber terrorism against businesses and their websites is being tackled head-on by the US government, which on Monday unveiled a new system of guidance aimed at making the software behind websites, power grids and other services less susceptible to hacking.

The US Department of Homeland Security’s (DHS) system includes an updated list of the top 25 programming errors that enable hackers to gain access to computer networks. The agency is also adding new tools to help software programmers eliminate the most dangerous types of mistakes and enable organizations to demand and buy more secure products.

The effort has been in development for three years, according to Robert A. Martin, principal engineer at Mitre, a technology nonprofit organization that conducts federal research in systems engineering and led the development of the program.

The costs of programming errors that leave software open to attack were highlighted by the numerous recent cyber attacks that have resulted in the theft of credit card information, user names and passwords from business, government and banking websites.

During an online news conference, government officials noted that many stakeholders stressed the urgency for better training and education for people writing software. Officials said that organizations are under constant attack.

Homeland Security hopes that the program will make it easier for companies and agencies to better secure their networks and contribute to building a safer global network.

“We’re going after root cause issues,” a senior DHS official, who spoke on condition of anonymity, told the New York Times. “You can make your enterprise more resilient from the people who would attack you.”

Jeremiah Grossman, chief technology officer for WhiteHat Security, told the New York Times that the guidance could encourage a long-awaited shift in the technology industry’s approach to computer security. Many organizations do not recognize that software security should be the focus, he said, “which is why you see the bulk of the security dollars spent on defense flowing to firewall and antivirus products, and precisely why the current wave of breaches keep happening.”

Currently, when owners of small businesses buy software or hire a firm to build a website, it is difficult to know whether the programs are really secure or not, said Alan Paller, director of research at SANS Institute, a computer-security organization.

He emphasized during the online presentation on Monday that this was a “first step” and much work still needed to be done, especially with training.

The information on the new program, which has been compiled on a special website that the public can view, will tell people what to look for in setting up a secure website and how to assess potential errors in programming, he said. It also sets up a scorecard, so that companies looking for a firm to set up a website can check their security score.

The Top 25 list, created by SANS and Mitre with the help of top software security experts in the US and Europe, includes the top programming errors that have been used in many recent attacks.

The top programming error is one that allows so-called SQL-injection attacks on websites, which were used by the hacking group LulzSec. The group exploited the flaws to cause databases to deliver user names and passwords, including those from the FBI’s InfraGard program and NATO’s online bookstore.
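
The article does not detail those attacks, but the flaw class itself is easy to illustrate. The minimal Python sketch below (using the standard-library sqlite3 module and an invented table) shows how string-built SQL lets attacker input rewrite a query, and how a parameterized query closes the hole.

```python
import sqlite3

# Toy database for illustration; the table and its contents are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "' OR '1'='1"  # classic SQL-injection input

# VULNERABLE: concatenation makes the payload part of the SQL itself, so the
# WHERE clause is always true and the row leaks without the real password.
query = "SELECT name FROM users WHERE password = '" + payload + "'"
print(conn.execute(query).fetchall())  # [('alice',)]

# SAFE: a parameterized query treats the payload as a plain string value.
print(conn.execute("SELECT name FROM users WHERE password = ?",
                   (payload,)).fetchall())  # []
```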

The new framework will also highlight which programming errors are of greatest concern to banking and commerce sites.

The framework is already beginning to show up in some companies that make tools to test software for dangerous programming errors, said Paller. Eventually there will be services that help businesses evaluate whether the software they are considering has withstood rigorous scrutiny.

Avoiding programming errors is crucial in fending off today’s cyber terrorists, said Paller. “This is the only way to get around ‘zero days’,” he said, referring to attacks that make use of software vulnerabilities that are unknown and, therefore, cannot be fixed quickly with patches. “The only possible defense is to stop the error from being in the software in the first place.”

Living Antibiotic Effective Against Salmonella

Scientists have tested a predatory bacterium, Bdellovibrio, against Salmonella in the guts of live chickens. They found that it significantly reduced the numbers of Salmonella bacteria and, importantly, showed that Bdellovibrio are safe when ingested.

The research was funded by the Biotechnology and Biological Sciences Research Council, carried out by Professor Liz Sockett’s team at The University of Nottingham with Dr Robert Atterbury and Professor Paul Barrow at the University of Nottingham Vet School, and published in the journal Applied and Environmental Microbiology.

Researcher Dr Laura Hobley said: “Bdellovibrio has the potential to be used as a living antibiotic against some major human and animal pathogens, such as E. coli and other so-called Gram-negative bacteria.”

Previous studies have shown that Bdellovibrio is very effective at invading and killing other bacterial cells in a test tube. It looks likely to provide an alternative to antibiotic medicines at a time when bacterial resistance is a significant problem to human and animal health.

Dr Hobley continued: “We think that Bdellovibrio could be particularly useful as a topical treatment for wounds or foot rots, but we wanted to know what might happen if it is ingested, either deliberately as a treatment or by accident.”

Salmonella likes to grow in the guts of poultry and other animals and can cause food poisoning in humans. In lab experiments Bdellovibrio can kill Salmonella by breaking into the cells and destroying them from the inside. This research shows that it also works inside the gut of a bird and is safe, not harming them or changing their behaviour.

Bdellovibrio reduced the numbers of Salmonella by 90% and the birds remained healthy, grew well, and were generally in good condition.

“We concluded that Bdellovibrio aren’t long lived in the bird guts; they had a strong effect for about 48 hours, which dropped off after this time. If we were to use this method to completely rid the birds of Salmonella, we might have to test a program of multiple dosing. But the point of this study was really to ensure that Bdellovibrio is safe and effective when ingested,” said Dr Hobley.

Professor Douglas Kell, Chief Executive of BBSRC, said: “Once we have understood the fundamental nature of an extraordinary organism such as Bdellovibrio, it makes sense that we should look at potential uses for it. The impact of bacterial infections on human and animal health is significant and since antibiotic resistance is a major issue, alternatives from nature may become increasingly important.”

Microbiologists Discover How Cavity-Causing Microbes Invade Heart

Scientists have discovered the tool that bacteria normally found in our mouths use to invade heart tissue, causing a dangerous and sometimes lethal infection of the heart known as endocarditis. The work raises the possibility of creating a screening tool (perhaps a swab of the cheek, or a spit test) to gauge a dental patient’s vulnerability to the condition.

The identification of the protein that allows Streptococcus mutans to gain a foothold in heart tissue is reported in the June issue of Infection and Immunity by microbiologists at the University of Rochester Medical Center.

S. mutans is a bacterium best known for causing cavities. The bacteria reside in dental plaque, an architecturally sophisticated goo composed of an elaborate molecular matrix created by S. mutans that allows the bacteria to inhabit and thrive in our oral cavity. There, they churn out acid that erodes our teeth.

Normally, S. mutans confines its mischief to the mouth, but sometimes, particularly after a dental procedure or even after a vigorous bout of flossing, the bacteria enter the bloodstream. There, the immune system usually destroys them, but occasionally, within just a few seconds, they travel to the heart and colonize its tissue, especially heart valves. The bacteria can cause endocarditis, an inflammation of heart valves, which can be deadly. Infection by S. mutans is a leading cause of the condition.

“When I first learned that S. mutans sometimes can live in the heart, I asked myself: Why in the world are these bacteria, which normally live in the mouth, in the heart? I was intrigued. And I began investigating how they get there and survive there,” said Jacqueline Abranches, Ph.D., a microbiologist and the corresponding author of the study.

Abranches and her team at the University’s Center for Oral Biology discovered that a collagen-binding protein known as CNM gives S. mutans its ability to invade heart tissue. In laboratory experiments, scientists found that strains with CNM are able to invade heart cells, and strains without CNM are not.

When the team knocked out the gene for CNM in strains where it’s normally present, the bacteria were unable to invade heart tissue. Without CNM, the bacteria simply couldn’t gain a foothold; their ability to adhere was about one-tenth of what it was with CNM.

The team also studied the response of wax worms to the various strains of S. mutans. They found that strains without CNM were rarely lethal to the worms, while strains with the protein were lethal 90 percent of the time. Then, when Abranches’ team knocked out CNM in those strains, they were no longer lethal; those worms thrived.

The work may someday enable doctors to prevent S. mutans from invading heart tissue. Even sooner, though, since some strains of S. mutans have CNM and others do not, the research may enable doctors to gauge a patient’s vulnerability to a heart infection caused by the bacteria.

Abranches has identified five specific strains of S. mutans that carry the CNM protein, out of more than three dozen strains examined. CNM is not found in the most common type of S. mutans found in people, type C, but is present in rarer types of S. mutans, including types E and F.

“It may be that CNM can serve as a biomarker of the most virulent strains of S. mutans,” said Abranches, a research assistant professor in the Department of Microbiology and Immunology. “When patients with cardiac problems go to the dentist, perhaps those patients will be screened to see if they carry the protein. If they do, the dentist might treat them more aggressively with preventive antibiotics, for example.”

Until more research is done and a screening or preventive tool is in place, Abranches says the usual advice for good oral health still stands for everyone.

“No matter what types of bacteria a person has in his or her mouth, they should do the same things to maintain good oral health. They should brush and floss their teeth regularly; the smaller the number of S. mutans in your mouth, the healthier you’ll be. Use a fluoride rinse before you go to bed at night. And eat a healthy diet, keeping sugar to a minimum,” added Abranches.

Abranches presented the work at a recent conference on the “oral microbiome” hosted by the University’s Center for Oral Biology. The center is part of the Medical Center’s Eastman Institute for Oral Health, a world leader in research and post-doctoral education in general and pediatric dentistry, orthodontics, periodontics, prosthodontics, and oral surgery.

Additional authors of the study include laboratory technician James Miller; former technician Alaina Martinez; Patricia Simpson-Haidaris, Ph.D., associate professor of Medicine; Robert Burne, Ph.D., of the University of Florida; and Abranches’ husband, Jose Lemos, Ph.D., of the Center for Oral Biology, who is also assistant professor in the Department of Microbiology and Immunology. The work was funded by the American Heart Association.

Scientists Decode Deep History Of Coconuts

Written in coconut DNA are two origins of cultivation, several ancient trade routes, and the history of the colonization of the Americas

By Diana Lutz, Washington University in St. Louis

The coconut (the fruit of the palm Cocos nucifera) is the Swiss Army knife of the plant kingdom; in one neat package it provides a high-calorie food, potable water, fiber that can be spun into rope, and a hard shell that can be turned into charcoal. What’s more, until it is needed for some other purpose it serves as a handy flotation device.

No wonder people from ancient Austronesians to Captain Bligh pitched a few coconuts aboard before setting sail. (The mutiny of the Bounty is supposed to have been triggered by Bligh’s harsh punishment of the theft of coconuts from the ship’s store.)

So extensively is the history of the coconut interwoven with the history of people traveling that Kenneth Olsen, a plant evolutionary biologist, didn’t expect to find much geographical structure to coconut genetics when he and his colleagues set out to examine the DNA of more than 1,300 coconuts from all over the world.

“I thought it would be mostly a mish-mash,” he says, thoroughly homogenized by humans schlepping coconuts with them on their travels.

He was in for a surprise. It turned out that there are two clearly differentiated populations of coconuts, a finding that strongly suggests the coconut was brought under cultivation in two separate locations, one in the Pacific basin and the other in the Indian Ocean basin. What’s more, coconut genetics also preserve a record of prehistoric trade routes and of the colonization of the Americas.

The discoveries of the team, which included Bee Gunn, now of the Australian National University, and Luc Baudouin of the Centre International de Recherches en Agronomie pour le Développement (CIRAD) in Montpellier, France, as well as Olsen, associate professor of biology at Washington University in St. Louis, are described in the June 23 online issue of the journal PLoS One.

Morphology a red herring

Before the DNA era, biologists recognized a domesticated plant by its morphology. In the case of grains, for example, one of the most important traits in domestication is the loss of shattering, or the tendency of seeds to break off the central grain stalk once mature.

The trouble was it was hard to translate coconut morphology into a plausible evolutionary history.

There are two distinctively different forms of the coconut fruit, known as niu kafa and niu vai, Samoan names for traditional Polynesian varieties. The niu kafa form is triangular and oblong with a large fibrous husk. The niu vai form is rounded and contains abundant sweet coconut “water” when unripe.

“Quite often the niu vai fruit are brightly colored when they’re unripe, either bright green, or bright yellow. Sometimes they’re a beautiful gold with reddish tones,” says Olsen.

Coconuts have also been traditionally classified into tall and dwarf varieties based on the tree “habit,” or shape. Most coconuts are talls, but there are also dwarfs that are only several feet tall when they begin reproducing. The dwarfs account for only 5 percent of coconuts.

Dwarfs tend to be used for “eating fresh,” and the tall forms for coconut oil and for fiber.

“Almost all the dwarfs are self-fertilizing, and those three traits (being dwarf, having the rounded sweet fruit, and being self-pollinating) are thought to be the definitive domestication traits,” says Olsen.

“The traditional argument was that the niu kafa form was the wild, ancestral form that didn’t reflect human selection, in part because it was better adapted to ocean dispersal,” says Olsen. Dwarf trees with niu vai fruits were thought to be the domesticated form.

The trouble is it’s messier than that. “You almost always find coconuts near human habitations,” says Olsen, and “while the niu vai is an obvious domestication form, the niu kafa form is also heavily exploited for copra (the dried meat ground and pressed to make oil) and coir (fiber woven into rope).”

“The lack of universal domestication traits, together with the long history of human interaction with coconuts, made it difficult to trace the coconut’s cultivation origins strictly by morphology,” Olsen says.

DNA was a different story.

Collecting coconut DNA

The project got started when Gunn, who had long been interested in palm evolution, and who was then at the Missouri Botanical Garden, contacted Olsen, who had the laboratory facilities needed to study palm DNA.

Together they won a National Geographic Society grant that allowed Gunn to collect coconut DNA in regions of the western Indian Ocean for which there were no data. She sent the snippets of leaf tissue, taken from the center of each coconut tree’s crown, home in zip-lock bags to be analyzed.

“We had reason to suspect that coconuts from these regions (especially Madagascar and the Comoros Islands) might show evidence of ancient ‘gene flow’ events brought about by ancient Austronesians setting up migration routes and trade routes across the southern Indian Ocean,” Olsen says.

Olsen’s lab genotyped 10 microsatellite regions in each palm sample. Microsatellites are regions of stuttering DNA where the same few nucleotide units are repeated many times. Mutations pop up and persist pretty easily in these regions because they usually don’t affect traits that are important to survival and so aren’t selected against, says Olsen. “So we can use these genetic markers to ‘fingerprint’ the coconut,” he says.
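
As a loose illustration of the idea (with invented repeat counts, not the study’s data), a microsatellite profile can be treated as a vector of repeat counts at each locus, and samples compared by how much those counts differ:

```python
# Toy microsatellite "fingerprinting": each sample is a list of repeat
# counts at 10 loci. All numbers below are invented for illustration.
def profile_distance(a, b):
    """Mean absolute difference in repeat counts across loci."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

pacific_reference = [12, 9, 15, 22, 7, 11, 18, 13, 10, 16]
indian_reference = [8, 14, 11, 19, 12, 9, 21, 10, 15, 12]
unknown_sample = [12, 10, 15, 21, 7, 12, 18, 13, 10, 17]

for name, ref in (("Pacific", pacific_reference),
                  ("Indian Ocean", indian_reference)):
    print(f"{name}: distance {profile_distance(unknown_sample, ref):.1f}")
# The unknown sample sits much closer to the Pacific profile.
```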

The new collections were combined with a vast dataset that had been established by CIRAD, a French agricultural research center, using the same genetic markers. “These data were being used for things like breeding, but no one had gone through and systematically examined the genetic variation in the context of the history of the plant,” Olsen says.

Two origins of cultivation

The most striking finding of the new DNA analysis is that the Pacific and Indian Ocean coconuts are quite distinct genetically. “About a third of the total genetic diversity can be partitioned between two groups that correspond to the Indian Ocean and the Pacific Ocean,” says Olsen.

“That’s a very high level of differentiation within a single species and provides pretty conclusive evidence that there were two origins of cultivation of the coconut,” he says.

In the Pacific, coconuts were likely first cultivated in island Southeast Asia, meaning the Philippines, Malaysia, Indonesia, and perhaps the continent as well. In the Indian Ocean the likely center of cultivation was the southern periphery of India, including Sri Lanka, the Maldives, and the Laccadives.

The definitive domestication traits (the dwarf habit, self-pollination and niu vai fruits) arose only in the Pacific, however, and then only in a small subset of Pacific coconuts, which is why Olsen speaks of origins of cultivation rather than of domestication.

“At least we have it easier than scientists who study animal domestication,” he says. “So much of being a domesticated animal is being tame, and behavioral traits aren’t preserved in the archeological record.”

Did it float or was it carried?

One exception to the general Pacific/Indian Ocean split is the western Indian Ocean, specifically Madagascar and the Comoros Islands, where Gunn had collected. The coconuts there are a genetic mixture of the Indian Ocean type and the Pacific type.

Olsen and his colleagues believe the Pacific coconuts were introduced to the Indian Ocean a couple of thousand years ago by ancient Austronesians establishing trade routes connecting Southeast Asia to Madagascar and coastal east Africa.

Olsen points out that no genetic admixture is found in the more northerly Seychelles, which fall outside the trade route. He adds that a recent study of rice varieties found in Madagascar shows there is a similar mixing of the japonica and indica rice varieties from Southeast Asia and India.

To add to the historical shiver, the descendants of the people who brought the coconuts and rice are still living in Madagascar. The present-day inhabitants of the Madagascar highlands are descendants of the ancient Austronesians, Olsen says.

Much later the Indian Ocean coconut was transported to the New World by Europeans. The Portuguese carried coconuts from the Indian Ocean to the West Coast of Africa, Olsen says, and the plantations established there were a source of material that made it into the Caribbean and also to coastal Brazil.

So the coconuts that you find today in Florida are largely the Indian Ocean type, Olsen says, which is why they tend to have the niu kafa form.

On the Pacific side of the New World tropics, however, the coconuts are Pacific Ocean coconuts. Some appear to have been transported there in pre-Columbian times by ancient Austronesians moving east rather than west.

During the colonial period, the Spanish brought coconuts to the Pacific coast of Mexico from the Philippines, which was for a time governed on behalf of the King of Spain from Mexico.

This is why, Olsen says, you find Pacific type coconuts on the Pacific coast of Central America and Indian type coconuts on the Atlantic coast.

“The big surprise was that there was so much genetic differentiation clearly correlated with geography, even though humans have been moving coconut around for so long.”

Far from being a mish-mash, coconut DNA preserves a record of human cultivation, voyages of exploration, trade and colonization.

Snake Venom Symptoms Slowed With Ointment

Researchers in Australia have found that a chemical compound typically used on heart patients may raise chances of survival for snakebite victims.

The study, published in Nature Medicine, claims the chemical nitric oxide can slow down, by as much as 50 percent, the time it takes for snake venom to enter the bloodstream, allowing victims time to seek medical help, said lead author Dirk van Helden, professor at the School of Biomedical Sciences, University of Newcastle in Australia.

Reuters reports that although poisonous snakes are responsible for the deaths of only a handful of people in the United States each year, the World Health Organization (WHO) places the global number at about 100,000 people.

Bulky proteins from the venom of some snakes do not infiltrate the bloodstream immediately but find their way through the lymphatic system to the heart. Ideally, a pressure bandage is applied in hopes of slowing the venom’s spread until the victim can receive antivenom medicine.

Antivenom consists essentially of antibodies that lock onto and neutralize the poison, but many victims are unable to receive antivenom in time, especially if the puncture is on the face or near the neck, Science Now reports. Van Helden and colleagues attempted to find a chemical method to delay the venom.

The researchers settled on an ointment that contains glyceryl trinitrate, the compound better known as nitroglycerin that doctors have used to treat everything from tennis elbow to angina. The ointment releases nitric oxide, causing the lymphatic vessels to clench.

The researchers first injected volunteers in the foot with a harmless radioactive mixture that, like some snake toxins, moves through the lymphatic vessels. In control subjects that didn’t receive the ointment, the mixture took 13 minutes to climb to the top of the leg.

That transit time lengthened to 54 minutes when the researchers immediately smeared the ointment around the injection site, the team reported. Further experiments using real toxins in rats yielded roughly the same results.

Finally, the researchers compared the survival time in rats injected with venom that were treated with the ointment against those that were not, and found that the nitric oxide rats kept breathing 50 percent longer, AFP reports.

“These results point to a new method of snakebite first aid that may also be useful for bites to the torso or head,” the researchers concluded.

Amazon’s Bezos Building 10,000 Year Clock: Wired

Amazon.com founder Jeff Bezos has a dream: to build a clock that will run for 10,000 years.

Bezos, a self-made billionaire, has begun construction on the giant timepiece, which will be housed mainly underground, according to Wired Magazine’s Dylan Tweney.

The clock isn’t just the ultimate timepiece; it is a symbol of the power of long-term thinking. He hopes that building the mega-clock will change the way humans think about time, encouraging our distant descendants to take a longer view than we have.

The clock, if it endures, will exist far longer than the United States. “Whole civilizations will rise and fall. New systems of government will be invented. You can’t imagine the world (no one can) that we’re trying to get this clock to pass through,” Bezos told Wired.

To help achieve his mission of encouraging long-term thinking, Bezos launched a website to publicize his clock. People who want to see the clock once it is finished can put their names on a waiting list — although it will be years before it is completed.

Bezos’ epic undertaking, with the help of numerous people, can be compared to the construction process of the Egyptian pyramids. And as with the ancient pharaohs, it takes a certain level of pride and arrogance to even consider taking on such a challenge.

The project is, among other things, a monumental engineering problem, challenging the makers to think about how to keep a machine intact, operational and accurate over thousands of years.

The idea for such a clock has been around since at least 1995, when Danny Hillis proposed the idea in Wired Magazine. Since then, Hillis and others have built prototypes and created the Long Now Foundation, a non-profit created just to work on the clock and promote long-term thinking. But there were no takers on actually building a full-scale 10,000-year clock until Bezos stepped up to the plate, putting some $42 million toward the unique project.

Contractors began machining components for the clock last year, including the 8-foot stainless steel gears and the Geneva wheels that will ring the chimes. Also, computers at NASA’s Jet Propulsion Laboratory have spent months calculating the Sun’s position in the sky at noon every day for the next 10,000 years, data the clock will use to stay accurate.
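
Wired doesn’t describe JPL’s method, but the kind of calculation involved can be sketched with an off-the-shelf astronomy library. The snippet below assumes the Python astropy package and a hypothetical west-Texas site (not the clock’s actual location); it computes the Sun’s position for a single noon, which the real table simply repeats for every day over ten millennia.

```python
# Sketch: the Sun's apparent position at (roughly) local solar noon for a
# hypothetical west-Texas site. Requires astropy; note that standard
# ephemerides lose accuracy over a full 10,000-year span.
import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, get_sun
from astropy.time import Time

site = EarthLocation(lat=31.0 * u.deg, lon=-104.5 * u.deg)  # assumed location
noon_utc = Time("2030-06-21 19:00:00")  # ~local solar noon, expressed in UTC

sun = get_sun(noon_utc).transform_to(AltAz(obstime=noon_utc, location=site))
print(f"altitude: {sun.alt:.2f}, azimuth: {sun.az:.2f}")
```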

Excavation in the Texas desert where the clock will be housed has also begun, and just last month, the Smithsonian agreed to let the Long Now Foundation install a 10,000-year clock in one of its Washington museums, once they can find someone to fund it.

Building a clock that will run for 10,000 years is no small feat. In Texas, builders have started drilling an access tunnel into the base of the ridge where the clock will go. A pilot hole will also be drilled, going straight down from the top of the ridge, until it meets the access tunnel. Crews will then bring a 12-foot-7-inch bit in at the bottom and drill back up, carving out a vertical shaft.

Once that phase is complete, builders will install a movable platform holding a 2.5-ton robot arm with stonecutting saw attached at the end. The arm/saw will begin carving out a spiral staircase into the vertical shaft, from top to bottom, one step at a time.

The clock will have massive metal gears, a huge stone weight, and a precise titanium escapement inside a protective quartz box, all of which will go into the shaft. The work is scheduled to be complete within a few years, at which time the clock will be set in motion.

Bezos ponders the future: “In the year 4000, you’ll go see this clock and you’ll wonder, ‘Why on Earth did they build this?’”

The answer, he hopes, will lead you to think more profoundly about the distant future and your effects on it.

Image Caption: Entrance to a series of tunnels and chambers being created within the mountain.

Electronic Arts Website Hacked

Electronic Arts Inc. (EA) confirmed on its website Friday that hackers may have gotten their hands on user information such as birth dates, phone numbers and mailing addresses in what the company is calling a “highly sophisticated” attack.

A computer network hosting BioWare Edmonton’s “Neverwinter Nights” game forums was the target of the attack, and the company said it will continue to investigate the intrusion. EA also updated its Q&A section to give information on the recent data breach and to assure its customers that it is doing everything possible to safeguard its systems from further attacks.

EA said in its Q&A, however, that it isn’t possible to stop hackers 100 percent of the time. EA said it found out about the hack job on June 14.

“We have moved swiftly to implement additional security controls to prevent this type of breach from happening again to secure your data and are conducting further evaluations now,” EA said in a message on its website.

No credit card information was taken during the attack, according to the California-based company, although the information that was looted could be very useful for phishing attempts.

The company has disabled “potentially affected legacy BioWare accounts” and reset passwords of any affected EA accounts. It further said that anyone with a potentially affected account would receive an email with more information.

This attack is the latest in a string of cyber attacks that have targeted Sony, Nintendo, Sega, PBS, the Senate, the CIA, and even the Arizona Police.

EA offered tips to its customers and to consumers in general on how to avoid identity theft. It also directed its users to the US Federal Trade Commission’s Internet fraud website for further information.

Black Youth Targeted In Menthol Cigarette Marketing

Neighborhoods in California with a high population of African-American students are being targeted by tobacco companies, according to a Stanford University study.

Researchers from Stanford School of Medicine analyzed data collected in 2006 that compared pricing and advertisements for Lorillard’s Newport menthol cigarettes, the top menthol brand, as well as that of Marlboro, the top non-menthol brand.

Data from 407 stores that were within walking distance of 91 California schools were collected in the 2006 survey, according to a recent Reuters report.

Results from the analysis of the survey found that as the population of African-American students rose in a neighborhood, so did advertising for Newport menthol cigarettes. In addition, Newport prices were likely to be lower as well.

The study did not find similar trends for Marlboro, which is made by Altria Group Inc unit Philip Morris USA.

Head researcher Lisa Henriksen, PhD, senior research scientist at the Stanford Prevention Research Center says that the study “shows a predatory marketing pattern geared to luring young African Americans into becoming smokers.”

Research found that the preference for menthol cigarettes among teenage smokers increased to 48.3 percent in 2008 from 43.4 percent in 2004. Furthermore, menthol cigarettes were found to be more popular with African-American smokers between 12 and 17 years of age when compared to Hispanics and non-Hispanic whites of similar ages.

According to the study, for every 10-percentage-point increase in the number of African-American students at a school, advertisements for menthol cigarettes increased by 5.6 percentage points. The odds that a discount was advertised for Newport menthol cigarettes were also 1.5 times higher.
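
In other words, the study reports a roughly linear association. A minimal sketch of that reported relationship (a simplification for illustration, not the study’s actual regression model) looks like this:

```python
# The reported association as a simple linear rule of thumb: +10 percentage
# points of African-American students ~ +5.6 percentage points of menthol
# advertising. A simplification for illustration only.
def menthol_ad_change(student_share_change_pct_points):
    return 0.56 * student_share_change_pct_points

print(menthol_ad_change(10))  # 5.6, the figure quoted in the study
print(menthol_ad_change(20))  # 11.2 under the same linear assumption
```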

Currently the FDA is gathering information to determine if menthol used as a flavoring agent should be banned in cigarettes.

The Tobacco Products Scientific Advisory Committee (TPSAC) reported that the use of menthol cigarettes is highest among minorities, teenagers and low-income populations, saying that “the removal of menthol cigarettes from the marketplace would benefit public health in the United States.”

Even though the TPSAC was asked by the FDA to investigate the harmful effects of tobacco use and how it is marketed to the public, the FDA does not need to follow the committee’s recommendation.

A spokesperson for the FDA says that the edited version of the report will be available on the agency’s website soon, but no deadline has been set for when a final decision on menthol will be made.

“The committee was charged with considering a broad definition of harm to smokers and other populations, particularly youth,” Henriksen told Reuters Health.

“The tobacco companies went out of their way to argue to the Food & Drug Administration that they don’t use racial targeting,” Henriksen says. “This evidence is not consistent with those claims.”

“We think our study, which shows the predatory marketing in school neighborhoods with higher concentrations of youth and African-American students, fits a broad definition of harm,” she says.

The study was published June 24 in Nicotine & Tobacco Research.

Surgery, Extreme Low-calorie Diet Reverse Type 2 Diabetes

New hope for type 2 diabetes sufferers?

A new review of previous studies on bariatric surgery and its metabolic effects suggests that more than 60 percent of obese diabetics can be cured of the blood sugar disease, at least in the short term, by undergoing weight-loss surgery. Meanwhile, a separate British study published on Friday reveals that people with obesity-related type 2 diabetes were cured of the condition by consuming an extreme, low-calorie diet for two months.

Weight Loss Surgery

Although previous studies had suggested that weight-loss surgery could reverse diabetes, researchers from the Netherlands formalized the evidence through their analysis of nine previous studies, finding that more than 60 percent of obese diabetics could be cured of the disease, and 80 percent could stop taking their diabetes medications altogether after undergoing weight loss surgery.

Type 2 diabetes has traditionally been considered a progressive condition that is initially controlled by diet, then medications and finally insulin injections. The condition is caused by too much glucose in the blood, due either to the pancreas not producing enough insulin (a hormone that breaks down glucose into energy in the cells) or to the body not reacting properly to it, a problem known as insulin resistance.

The Dutch researchers sifted through the data from nine previous studies of diabetics who had either undergone gastric bypass or gastric banding surgery.   Eight of the nine studies included between 23 and 177 patients, while one study monitored the outcomes of 82,000 patients.  Each study tracked the patients for at least one year after their weight loss surgery.

In a gastric banding procedure, the surgeon places a ring over the top of the stomach, which limits the amount of food the patient can consume.  In gastric bypass, food is redirected around the stomach into a small pouch, which reduces the amount of food a person can eat and hampers absorption of the food.  More than 220,000 Americans had some type of weight loss surgery in 2009, according to the American Society for Metabolic and Bariatric Surgery.  The average price per patient was about $20,000.

After conducting their analysis, the researchers found that 83 percent of diabetics who underwent gastric bypass were able to stop taking diabetes medications, some within days of the surgery. Meanwhile, 62 percent of diabetics who had the gastric banding procedure could stop taking diabetes medication while maintaining good control of their blood sugar, wrote the researchers in their report published in the June issue of Archives of Surgery.

Those outcomes surpass what can be achieved with conventional diabetes treatments, said Dr. Rick Meijer at the Institute for Cardiovascular Research at Vrije Universiteit in Amsterdam, lead author of the report.

“In standard practice, only a very minor group of individuals with an iron will can lose enough weight to be cured from type 2 diabetes mellitus,” he told Reuters.

“The rest of patients have a chronic disease with the need of daily medication-regimens.”

Meijer said roughly 90 percent of the 18 million U.S. diabetes cases are due to excessive weight. 

However, not all obese diabetics are eligible for weight loss surgery, and it is not yet clear how long the surgery’s effect on diabetes can last.

Indeed, one study included in the analysis found that just one-third of patients whose diabetes had ended continued to have good control of their blood sugar 10 years after surgery.

Surgery also carries additional, though still low, potential for complications. For instance, one recent study that followed patients up to a month after surgery found that seven percent experienced some type of problem, ranging from minor wound infections, nausea and food intolerance to bleeding, kidney failure and other serious complications. Additionally, some patients gain their weight back over time.

Extreme Diet

There may yet be good news for obese diabetics looking for a non-surgical route to control their condition.  

A new British study published on Friday in the journal Diabetologia found that people with obesity-related type 2 diabetes have been cured, at least in the short term, by eating an extreme, low-calorie diet for two months.

The early stage clinical trial of 11 participants showed that all reversed their diabetes by dramatically reducing their food intake to just 600 calories a day for two months.

After three months, seven of the participants remained free of diabetes.

“To have people free of diabetes after years with the condition is remarkable – and all because of an eight week diet,” said study leader Professor Roy Taylor of Newcastle University.

“This is a radical change in understanding Type 2 diabetes.  It will change how we can explain it to people newly diagnosed with the condition,” he said.

“While it has long been believed that someone with Type 2 diabetes will always have the disease, and that it will steadily get worse, we have shown that we can reverse the condition.”

The researchers said the study demonstrates that people who keep to a very low calorie diet can remove fat that is clogging up the pancreas, allowing normal insulin secretion to resume.

The 11 study participants were put on an extreme diet of just 600 calories a day consisting of liquid diet drinks and non-starchy vegetables.  They were then matched to a control group of people without diabetes, and monitored over the course of eight weeks.

The researchers measured insulin production from the participants’ pancreas, along with fat content in both the liver and the pancreas. After just one week, the team found that the participants’ pre-breakfast blood sugar levels had returned to normal. Furthermore, a special MRI scan of their pancreases revealed that the fat levels had returned from an elevated level to normal (from around 8% to 6%).

In step with this, the pancreas regained the normal ability to make insulin and, as a result, post-meal blood sugar levels steadily improved.

The researchers followed up with the participants three months later, during which time the volunteers had returned to their normal eating habits. 

Of the ten people re-tested, seven remained free of diabetes.

“We believe this shows that Type 2 diabetes is all about energy balance in the body,” explained Professor Taylor.

“If you are eating more than you burn, then the excess is stored in the liver and pancreas as fat which can lead to Type 2 diabetes in some people. What we need to examine further is why some people are more susceptible to developing diabetes than others.”

“I no longer needed my diabetes tablets,” said Gordon Parmley, 67, who took part in the study.

“I love playing golf but I was finding that when I was out on the course sometimes my vision would go fuzzy and I would have trouble focusing. It was after this that I was diagnosed with Type 2 diabetes. That was about six years ago and from then on, I had to control the diabetes with a daily combination of tablets: the diabetes drug gliclazide, and tablets for my cholesterol,” he said.

“When my doctor mentioned the trial I thought I would give it a go as it might help me and other diabetics. I came off my tablets and had three diet shakes a day and some salad or vegetables but it was very, very difficult,” he said.

“At first the hunger was quite severe and I had to distract myself with something else, walking the dog, playing golf, or doing anything to occupy myself and take my mind off food, but I lost an astounding amount of weight in a short space of time.”

“At the end of the trial, I was told my insulin levels were normal and after six years, I no longer needed my diabetes tablets,” he said.

“Still today, 18 months on, I don’t take them. It’s astonishing really that a diet, hard as it was, could change my health so drastically. After six years of having diabetes I can tell the difference – I feel better, even walking round the golf course is easier,” he said.

Dr. Iain Frame, Director of Research at Diabetes UK in Britain, praised the study, but warned about the potential difficulty in keeping to such a restrictive diet.

“We welcome the results of this research because it shows that Type 2 diabetes can be reversed, on a par with successful surgery without the side effects,” he said.

“However, this diet is not an easy fix and Diabetes UK strongly recommends that such a drastic diet should only be undertaken under medical supervision. Despite being a very small trial, we look forward to future results particularly to see whether the reversal would remain in the long term.”

EPA: Natural Gas Drilling May Contaminate Drinking Water

The Environmental Protection Agency (EPA) said Thursday it would examine claims that water wells in five states have been contaminated because of natural gas drilling, reports the Dallas Morning News.

The sites make up one of seven cases included in a study of hydraulic fracturing’s impact on drinking water. 

The fracturing fluids in these types of gas wells are mostly water and sand mixed with chemicals.  They are injected underground to fracture rock containing natural gas.

“We’ve met with community members, state experts and industry and environmental leaders to choose these case studies,” Paul Anastas, EPA assistant administrator for research and development, said in a statement.

“This is about using the best possible science to do what the American people expect the EPA to do: ensure that the health of their communities and families is protected,” he said.

The EPA will examine claims of water pollution related to drilling in Texas, North Dakota, Pennsylvania, Colorado and Louisiana.

The University of Texas’ Energy Institute is also conducting a multiyear study of hydraulic fracturing’s impact on groundwater. 

The oil and gas industry claims that hydraulic fracturing does not contaminate drinking water.  The industry said that fracturing occurs at a depth of 6,500 feet or more, and groundwater is typically found just several hundred feet deep.

Geologists and environmental groups said there are other ways that gas drilling could affect groundwater, outside of hydraulic fracturing.

“It is a complicated maze of potential causes and potential effects that needs to be sorted out,” Chip Groat, associate director of the UT Energy Institute, told Dallas Morning News.

“It just started as a very simple assumption” about hydraulic fracturing, Groat said. “It turned out that, yes, there probably are some problems. But what is the real cause?”

Heart Valve Replacement Without Opening The Chest Gives New Option For Patients With Untreatable, Non-Operative Condition

 An innovative approach for implanting a new aortic heart valve without open-heart surgery is being offered at Rush University Medical Center to patients with severe aortic stenosis who are at high-risk or not suitable candidates for open heart valve replacement surgery.

“This breakthrough technology could save the lives of thousands of patients with heart valve disease who have no other therapeutic options,” says Dr. Ziyad Hijazi, director of the Rush Center for Congenital and Structural Heart Disease and interventional cardiologist of the Rush Valve Clinic. The treatment is offered through a multi-center, phase IIb cohort study called the PARTNER II (Placement of AoRTic traNscathetER valves) trial.

Aortic valve stenosis (AS) is a type of valvular heart disease characterized by an abnormal narrowing of the aortic valve opening.  It is a condition that affects nearly 1.5 million Americans.  It causes hardening or thickening of the aortic valve leaflets, which limits leaflet motion and obstructs oxygen-rich blood flow from the heart to the rest of the body. Patients with severe AS may have symptoms of chest pain, fatigue, shortness of breath, lightheadedness or fainting. Although AS typically progresses slowly without symptoms, once symptoms occur, treatment is required. Fifty percent of patients may not survive beyond one to three years.

Traditionally, patients with symptomatic AS undergo aortic valve replacement during an open-heart surgery to alleviate symptoms, improve survival and improve quality of life. However, many patients who are at very high risk for surgery, such as elderly, frail individuals with multiple health concerns, are considered inoperable.

The PARTNER II trial will evaluate a pioneering technology called the Edwards SAPIEN XT valve, which is made of bovine pericardial tissue leaflets hand-sewn onto a metal frame, together with a new catheter delivery system called the Edwards NovaFlex, which is threaded to the heart either through a small incision in the femoral artery in a patient’s leg or through a small incision between the ribs and up into the left ventricle. The NovaFlex system positions the collapsed replacement valve inside the patient’s original valve, then uses a balloon to expand the frame, which holds the artificial valve in place and restores normal blood flow. Both procedures are performed on a beating heart, without the need for cardiopulmonary bypass and its associated risks.

Annually, some 200,000 people in the U.S. need a new heart valve, but nearly half of them do not receive a new valve for a variety of reasons.

“Past study results show conclusively that transcatheter valve replacement is a safe and effective alternative to open surgery, which remains the ‘gold standard’ for most patients,” says Hijazi.

Results from the first phase of the PARTNER trial showed that the rate of death from any cause at one year was 50.7 percent in the patients who received standard therapy, as compared to 30.7 percent of patients treated with transcatheter aortic valve replacement (TAVR).

The transcatheter valve procedures take about 90 minutes, compared with four to six hours for open-heart surgery. In open-heart surgery, the surgeon cuts through the breastbone, stops the heart, removes the valve and replaces it. Open-heart surgery can require a two- to three-month recovery period, compared to only a few days for the transcatheter approach.

The next-generation Edwards SAPIEN XT valve in the PARTNER II trial was engineered to more closely match proven surgical heart valves and potentially decrease treatment complications.

The PARTNER trial is the world’s first randomized, controlled trial of a transcatheter aortic heart valve. In this clinical phase IIb cohort, patients are randomized to receive either the new Edwards Sapien XT valve using the NovaFlex delivery system or the Edwards SAPIEN Transcatheter Heart Valve.

“The primary objective of the trial is to reduce death, major stroke and repeat hospitalization in these patients,” says Hijazi. “Additionally, we hope to improve quality-of-life indicators.”

The PARTNER II trial is one of the three latest, nationwide clinical trials for minimally invasive heart valve replacement being offered through the Rush Valve Clinic, where a team of cardiac surgical and interventional experts address diseases of the aortic, mitral and pulmonary valves.

The three clinical trials include:

    * PARTNER II trial for patients with aortic stenosis
    * COMPASSION trial for patients with a dysfunctional conduit – A phase II clinical trial using the SAPIEN Transcatheter Heart Valve in patients who have a dysfunctional conduit between the right ventricle and the pulmonary artery
    * EVEREST II trial for patients with mitral regurgitation – A continued access trial using the eValve MitraClip to treat a mitral valve leak.

For more information about the PARTNER trial or any of the other clinical trials at the Rush Valve Clinic, call 312-942-6800.


Drug Side Effect Linked With Increased Health Risks For Over 65s

A side effect of many commonly used drugs appears to increase the risks of both cognitive impairment and death in older people, according to new research led by the University of East Anglia (UEA).

As part of the Medical Research Council’s Cognitive Function and Ageing Studies (CFAS) project, the study is the first systematic investigation into the long-term health impacts of ‘anticholinergic activity’ – a known potential side effect of many prescription and over-the-counter drugs which affects the brain by blocking a key neurotransmitter called acetylcholine. The findings are published today by the Journal of the American Geriatrics Society.

Medicines with some degree of anticholinergic effect are wide-ranging and many are frequently taken by older people. The groups with the greatest impact include: anti-depressants such as Amitriptyline, Imipramine and Clomipramine; tranquilisers such as Chlorpromazine and Trifluoperazine; bladder medication such as Oxybutynin; and antihistamines such as Chlorphenamine. Other drugs with an anticholinergic effect include: Atenolol, Furosemide and Nifedipine for heart problems; painkillers such as Codeine and Dextropropoxyphene; the asthma treatment Beclometasone; the epilepsy treatment Carbamazepine; and Timolol eyedrops which are used for glaucoma.

The large cohort study was launched as part of the drive to find ways of reducing risk factors for dementia which affects 820,000 people in the UK. The UEA researchers worked in collaboration with colleagues at University of Cambridge, Indiana University and National Health Service clinicians. The project was funded by the Medical Research Council (MRC) and the US National Institute on Aging.

More than 13,000 men and women aged 65 and over from across the UK were included in the two-year study. Around half were found to use a medication with potential anticholinergic properties.

In the study, each drug taken by the participants was given a ranking based on the strength of its anticholinergic activity, or AntiCholinergic Burden (ACB) – 0 for no effect, 1 for mild effect, 2 for moderate effect and 3 for severe effect.
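
To make the scoring concrete, here is a minimal, illustrative sketch of how a cumulative ACB tally works. The drug-to-score mapping is hypothetical shorthand for the rankings described above, and the 26 per cent odds figure is taken from the findings reported below – this is not the study’s own code.

    # Illustrative sketch of a cumulative AntiCholinergic Burden (ACB) tally.
    # Scores are examples keyed to this article, not a clinical reference.
    ACB_SCORES = {
        "amitriptyline": 3,   # severe anticholinergic effect
        "oxybutynin": 3,
        "chlorphenamine": 3,
        "atenolol": 1,        # mild effect
        "furosemide": 1,
        "codeine": 1,
    }

    def total_acb(medications):
        """Sum ACB scores across a patient's drugs (unlisted drugs count 0)."""
        return sum(ACB_SCORES.get(drug.lower(), 0) for drug in medications)

    meds = ["Amitriptyline", "Furosemide", "Codeine"]
    score = total_acb(meds)                  # 3 + 1 + 1 = 5
    # The study reports 26 per cent higher odds of death per ACB point, so
    # the odds multiplier scales roughly as 1.26 ** score.
    print(score, round(1.26 ** score, 2))    # 5 3.18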

The key findings were:

    * Twenty per cent of participants taking drugs with a total ACB of four or more had died by the end of the two-year study, compared with only seven per cent of those taking no anticholinergic drugs – the first time a link between anticholinergics and mortality has been shown.
    * For every additional ACB point scored, the odds of dying increased by 26 per cent.
    * Participants taking drugs with a combined ACB of five or more scored more than four per cent lower in a cognitive function test than those taking no anticholinergic medications – confirming evidence from previous smaller studies of a link between anticholinergics and cognitive impairment.
    * The increased risks from anticholinergic drugs were shown to be cumulative, based on the number of anticholinergic drugs taken and the strength of each drug’s anticholinergic effect.
    * Those who were older, of lower social class, and with a greater number of health conditions tended to take the most anticholinergic drugs.

Lead author Dr Chris Fox, clinical senior lecturer at Norwich Medical School, University of East Anglia, said: “This is the first large scale study into the long-term impact of medicines which block acetylcholine – a common brain neurotransmitter – on humans, and our results show a potentially serious effect on mortality. Clinicians should conduct regular reviews of the medication taken by their older patients, both prescribed and over the counter, and wherever possible avoid prescribing multiple drugs with anticholinergic effects.

“Further research must now be undertaken to understand possible reasons for this link and, in particular, whether and how the anticholinergic drugs might cause the increased mortality. In the meantime, I strongly advise patients with any concerns to continue taking their medicines until they have consulted their family doctor or their pharmacist.”

Co-author Prof Carol Brayne, principal investigator of the MRC CFAS project at the University of Cambridge, said: “It is important to scrutinise medications given to older people very carefully to try to minimise harm as well as gain the desired benefit. The admirable wish to give the best possible treatment with good evidence for individual conditions has to be balanced against the fact that in many older people with multiple conditions this will lead to accumulated risk such as that shown by this scale.”

Ian Maidment, a mental health pharmacist working within the NHS, added: “One of the issues is that as we age, we tend to be prescribed more medicines which have an anticholinergic effect, increasing the overall burden.”

Dr Susanne Sorensen, head of research at the Alzheimer’s Society, said: “It is very important that we have a clear picture of the side effects of drugs commonly taken by older people with cognitive impairment and other conditions. This robust study provides valuable findings, and must be taken seriously. However it is vital that people do not panic or stop taking their medication without consulting their GP.

“We would urge people to have regular appointments with their doctor to review all drug treatments they are taking. This will help ensure they are on the best medications for their conditions, and that any side effects have been taken into consideration.”

Prof Chris Kennard, chairman of the MRC’s Neuroscience and Mental Health Board, which funded the research, said: “The Medical Research Council invests in cohort studies like CFAS because they provide vital clinical information through observation. Such projects require long-term commitment to fulfil their potential but having supported cohort studies for well over half a century, MRC funding and collaborations have made the UK an international leader in this field.”


Researchers Complete Pacific Ocean Predators Census

According to new research from the Census of Marine Life Tagging of Pacific Predators (TOPP), two expanses of the North Pacific Ocean are attracting an array of marine predators in predictable seasonal patterns.

The new report archives the TOPP program’s effort to track top marine predator movements in the Pacific Ocean.

The study found major hot spots for large marine predators that exist in the California Current, which flows south along the U.S. west coast, and a trans-oceanic migration highway called the North Pacific Transition Zone, which connects the western and eastern Pacific on the boundary between cold sub-arctic water and warmer subtropical water.

“These are the oceanic areas where food is most abundant, and it’s driven by high primary productivity at the base of the food chain — these areas are the savanna grasslands of the sea,” the authors wrote in the journal Nature.

“Knowing where and when species overlap is valuable information for efforts to manage and protect critical species and ecosystems.”

The researchers launched the project in 2000 as part of the Census of Marine Life.  TOPP became the world’s largest-ever biologging study, eventually involving over 75 biologists, oceanographers, engineers and computer scientists across five countries.

Barbara Block of Stanford University’s Hopkins Marine Station said in a statement: “It’s been a bit like looking down on the African savanna and trying to figure out: Where are the watering holes that a zebra and a cheetah might use? Where are the fertile valleys? Where are the deserts that animals avoid, and the migratory corridors that animals such as wildebeest use to travel from place to place? We’ve come to a vast oceanic realm in the Pacific and answered these questions for animals as diverse as bluefin tuna, blue whales and leatherback sea turtles.”

“This is the first publication that pulls all of the pieces together in one place,” says Dr. Costa, who oversaw the tracking of marine mammals, birds, and turtles. “We brought together a large team of investigators to study diverse species and look at how these organisms use the ocean. It is an unprecedented examination of so many species over such a large scale.”

The project tagged 23 species and revealed how migrations and habitat preferences overlap.  It placed 4,306 electronic tags on the species, helping the researchers to collect a huge amount of data.

The scientists spent two years synthesizing data sets with advanced statistical techniques and discerned intersecting hotspots and highways of ocean life.

“One of the challenges for this study was to take distinctly different types of location data – some very precise from ARGOS satellites and others far less precise from ambient light level readings – and bring them together using a powerful statistical framework that enabled identification of high use areas,” Dr. Ian Jonsen of Dalhousie University said in a statement.

The results suggest water temperature plays a key role in the seasonal migrations of many species.

The large ecosystem was defined by the California Current, where cool, nutrient-rich water moves south along the U.S. west coast.

The study found that the Current serves as a vast marine savanna for a large number of whales, sharks, seals, seabirds, turtles and tunas every year.

It shows many highly migratory marine species return to the same ocean regions, following a predictable seasonal pattern.

Block said in a statement: “For me, the homing capacity of species which routinely return to the California Current or shelf waters of North America has been the biggest surprise.”

The study also found that some species struggle when ocean productivity is poor. Coastal birds depend on krill, and during an El Niño event in 2006 to 2007, most of the hatching failed, the researchers said.

The TOPP study was the first ocean basin-scale study of marine predator distribution and movement ever conducted.

The study shows the importance of apex predators in different ecosystems, noting how the loss of bluefin tuna and porbeagle sharks in the Atlantic Ocean contributed to the near-extinction of cod and similar species.

The authors said for the first time, the TOPP team has been able to link the movements of tunas, sharks and blue whales north and south along the southwestern U.S. coastline with seasonal changes in temperature and chlorophyll concentrations.

“Using satellite observations of temperature and chlorophyll concentrations alone, we can now predict when and where individual species are likely to be in a given ocean region and begin to understand factors that control their movements. This is fundamental to the concept of ecosystem-based management,” Daniel Costa, professor of ecology and evolutionary biology at the University of California, Santa Cruz, said in a statement.

Barbara Block of Stanford University was lead author for the Nature paper. Other institutions participating in the study with OSU and Stanford included Dalhousie University, San Jose State University, NOAA Southwest Fisheries Science Center, University of California-Santa Cruz, and the Inter-American Tropical Tuna Commission.

Image Caption: This blue whale was encountered during a tagging expedition by the Oregon State University Marine Mammal Institute in 2006 near the Channel Islands of California. (Photo by Craig Hayslip, courtesy of OSU Marine Mammal Institute)


New Insights Into Origin Of Deadly Cancer

Barrett’s esophagus, often a precursor to esophageal cancer, results from residual, embryonic cells

Researchers have discovered a new mechanism for the origin of Barrett’s esophagus, an intestine-like growth in the esophagus that is triggered by chronic acid reflux and often progresses to esophageal cancer. Studying mice, the researchers found that Barrett’s esophagus arises not from mutant cells in the esophagus but rather from a small group of previously overlooked cells, present in all adults, that can rapidly expand into cancer precursors when the normal esophagus is damaged by acid.

Decades of cancer research tell us that most of the common cancers begin with genetic changes that occur over a period of 15 to 20 years, in some cases leading to aggressive cancers. However, for a subset of cancers that appear to be linked to chronic inflammation, this model might not hold.

Barrett’s esophagus, which was first described by the Australian surgeon Norman Barrett in 1950, affects two to four million Americans. In this condition, tissue forms in the esophagus that resembles the intestinal tissue normally located much farther down the digestive tract. As a result, a person’s chances of developing a deadly esophageal adenocarcinoma increase by 50- to 150-fold. Late-stage treatment is largely palliative, so it is important to understand how acid reflux triggers the condition in the first place.

Research from the laboratory of Frank McKeon, Harvard Medical School professor of cell biology, together with Wa Xian, a postdoctoral researcher at Brigham and Women’s Hospital and the Institute of Medical Biology, Singapore, along with an international consortium including Christopher Crum, director of Women’s and Perinatal Pathology at Brigham and Women’s Hospital, has shown that Barrett’s esophagus originates from a minor population of non-esophageal cells left over from early development.

For the past decade, McKeon and his laboratory have been using mouse models to investigate the role of p63, a gene involved in the self-renewal of epithelial stem cells including those of the esophagus. McKeon joined forces two years ago with Wa Xian, an expert in signal transduction in cancer cells, to tackle the vexing problem of the origin of Barrett’s esophagus.

At that time, the dominant hypothesis for Barrett’s was that acid reflux triggers the esophageal stem cells to make intestine cells rather than normal esophageal tissue. However, McKeon and Xian felt the support for this concept was weak. Taking a different tack, they studied a mouse mutant lacking the p63 gene and mimicked the symptoms of acid reflux. As a result, the entire esophagus was covered with a Barrett’s-like tissue that proved to be a near-exact match with human Barrett’s at the gene expression level.

The researchers were particularly surprised by the sheer speed with which this Barrett’s esophagus appeared in the mice.

“From the speed alone we knew we were dealing with something different here,” said Xia Wang, postdoctoral fellow at Harvard Medical School and co-first author of this work.

Yusuke Yamamoto, a postdoctoral fellow at the Genome Institute of Singapore and also co-first author, added that, “we just had to track the origins of the Barrett’s cells back through embryogenesis using our markers from extensive bioinformatics.”

In essence, the investigators tracked the precancerous growth to a discrete group of leftover embryonic cells wedged at the junction of the esophagus and the stomach – precisely where endoscopists have argued Barrett’s esophagus begins. As predicted by the mouse studies, the researchers identified a group of embryonic cells exactly at the junction between the esophagus and the stomach in all normal humans.

“Barrett’s arises from this discrete group of pre-existing, residual embryonic cells present in all adults that seemingly lie in wait for a chance to take over when the esophagus is damaged,” said McKeon. Added Xian, “We know these embryonic cells have different gene expression patterns from all normal tissues and this makes them inviting targets for therapies to destroy Barrett’s before it progresses to cancer.”

The therapeutic opportunities of this work are potentially immense.

“We are directing monoclonal antibodies to cell surface markers that can identify these precursor cells, so we may have a new opportunity to intervene therapeutically and prevent Barrett’s esophagus in at-risk patients,” said Wa Xian.

“Additionally,” noted McKeon, “we are cloning the stem cells for both these precursors and for Barrett’s esophagus itself, and these should represent critical targets for both monoclonal antibodies and small molecule inhibitors.”

Finally, there is reason to believe that this unusual mechanism might apply to a subset of other lethal cancers with unsure origins.

Crum noted that “some very aggressive cancers arise at junctions of two tissues and these deserve closer scrutiny to get at their origins if we are to surmount these diseases.”


New Application For iPhone May Support Monitoring And Research On Parkinson’s Disease

Researchers at the Georgia Tech Research Institute (GTRI) have developed a novel iPhone application that may enable persons with Parkinson’s disease and certain other neurological conditions to use the ubiquitous devices to collect data on hand and arm tremors and relay the results to medical personnel.

The researchers believe the application could replace subjective tests now used to assess the severity of tremors, while potentially allowing more frequent patient monitoring without costly visits to medical facilities.

The program – known as iTrem – could be offered later this year by the App Store, an Apple Inc. website that sells iPhone applications. But iTrem will first undergo a clinical study at Emory University and must receive any required approvals from the Food and Drug Administration.

“We expect iTrem to be a very useful tool for patients and their caregivers,” said Brian Parise, a research scientist who is principal investigator for the project along with Robert Delano, another GTRI research scientist. “And as a downloadable application, it also promises to be convenient and cost-effective.”

iTrem utilizes the iPhone’s built-in accelerometer to collect data on a patient in his or her home or office. The application currently tracks tremor information directly; in the future it will use simple puzzle games to record tremor data, which will then be processed and transmitted.
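
The article does not detail iTrem’s algorithms, but a hypothetical sketch of the kind of signal processing any accelerometer-based tremor monitor would need – estimating the dominant oscillation frequency of a motion trace – might look like the following. The function name and parameters are illustrative assumptions, not GTRI’s code.

    # Hypothetical sketch of accelerometer-based tremor analysis: estimate
    # the dominant oscillation frequency (Hz) of a recorded motion trace.
    import numpy as np

    def dominant_frequency(samples, sample_rate_hz):
        """Return the strongest frequency component (Hz) in the signal."""
        detrended = samples - np.mean(samples)       # remove gravity offset
        spectrum = np.abs(np.fft.rfft(detrended))
        freqs = np.fft.rfftfreq(len(detrended), d=1.0 / sample_rate_hz)
        return freqs[np.argmax(spectrum[1:]) + 1]    # ignore the DC bin

    # Simulated 5 Hz tremor sampled at 100 Hz for 10 seconds; Parkinsonian
    # resting tremor is commonly reported in the 4-6 Hz range.
    t = np.arange(0, 10, 0.01)
    trace = 0.3 * np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(len(t))
    print(dominant_frequency(trace, 100))            # prints roughly 5.0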

The researchers expect the clinical trial to show that data gathered by the program would allow physicians to remotely monitor the degree of disability, progression and medication response among patients with tremor-related conditions. In addition, iTrem offers a social component that allows people to share stories, pictures and data.

iTrem’s developers are working with the Advanced Technology Development Center (ATDC) to form a startup company based on iTrem and future applications that might take advantage of iPhone capabilities. ATDC is a startup accelerator based at Georgia Tech that helps Georgia entrepreneurs launch and build successful technology companies.

The GTRI team plans ongoing development of iTrem’s interface, based on responses from doctors and patients. They’re also investigating other consumer technologies with diagnostic potential, including the tiny gyroscopes now available in some cellular phones.

Future developments will include the addition of several other Parkinson’s related tests and investigation of gait analysis in a joint effort with the University of South Florida and the James A. Haley Veterans’ Hospital in Tampa, Fla. Additional developments may utilize the phone for detecting and analyzing dyskinesia, a movement disorder.

More than 10 million people in the U.S. have tremor-related disease, including Parkinson’s, essential tremor and multiple sclerosis, Delano said. Data collected by iTrem could enhance research on tremor disorders, in addition to supporting treatment for current patients, he added.

Most current measurement techniques used by doctors are subjective and are performed infrequently, Delano said. Complex diagnostic procedures such as electroencephalography and electromyography are objective and thorough, but are rarely performed because they’re lengthy, expensive and require a clinical setting. The result is that little data about tremor has been available to track the effectiveness of medication and therapy over time.

By contrast, he said, the ease of gathering tremor data with iTrem could help lead to a significant expansion of research in this area, as a wealth of objective data is collected and analyzed.

“Even factoring in the cost of an iPhone, using iTrem is likely to be more convenient and less expensive for patients than office visits, and the data are accurate and abundant,” Delano said.

A clinical study involving iTrem use is expected to start soon at Emory University’s Movement Disorder Clinic. The study will be led by Dr. Stewart Factor, a researcher in the field of Parkinson’s disease at the Emory School of Medicine.

The GTRI development team presented a paper on iTrem in January at the 2011 International Conference on Health Informatics.

Delano explained that the development of iTrem was linked to his own diagnosis with Parkinson’s disease several years ago. He eventually became frustrated with the subjective approaches commonplace in characterizing patient tremor symptoms.


Are Americans Eating Too Many Potato Chips?

New research suggests that instead of counting calories and fixating on how much food you are consuming, it is much more important to concentrate on eating healthy foods.

According to a Harvard University analysis of the dietary habits of 120,000 Americans, skipping just a single 1-ounce bag of potato chips each day can avert an average of 1.69 pounds of weight gain every four years.

People in the US gain nearly 1 pound every year on average, with those consuming lots of potato chips packing on the most pounds, according to the report published in the New England Journal of Medicine.

In three studies spanning 20 years of nearly 120,000 Americans, of which about 82 percent were women, researchers discovered that extra helpings of yogurt, nuts, fruit, whole grains and vegetables were all linked to weight loss.

The team, from Harvard School of Public Health, quantified the effect that eating particular types of food daily had on weight gain or loss.

The potato chip is the single biggest contributor to the pound-per-year weight gain that plagues so many, a bigger one than candy, ice cream, and even soda. The reason is partly that advertising cliché: you can’t eat just one.

“They’re very tasty and they have a very good texture. People generally don’t take one or two chips. They have a whole bag,” obesity expert Dr. F. Xavier Pi-Sunyer of the St. Luke’s-Roosevelt Hospital Center in New York told The Associated Press (AP).

What we eat has far more impact than exercise and most other habits do on long-term weight gain, according to the Harvard scientists. The study is the most comprehensive look yet at the effect of individual foods and lifestyle choices like sleep time and quitting smoking.

Obesity in the US is an epidemic. Nearly 68 percent of American adults are overweight or obese. Childhood obesity has tripled in the past three decades. Pounds are put on gradually over decades, and many people struggle to limit weight gain without realizing what is causing it.

“There is no magic bullet for weight control,” said study leader Dr. Frank Hu. “Diet and exercise are important for preventing weight gain, but diet clearly plays a bigger role.”

The 120,000+ participants in the study were all medical/health professionals who were not obese at the start of the study. Their weight was measured every four years for up to 20 years, and they detailed their diet on surveys and questionnaires.

On average, the volunteers gained nearly 17 pounds over the study period.

For each four-year period, food choices contributed nearly 4 pounds. Exercise, for those who did it, cut less than 2 pounds.

Pound for pound, the study participants gained more weight eating potato chips — 1.69 pounds on average over four years — than eating sweets and other desserts — 0.41 pounds over four years.

For other potato sources, the gain was 1.28 pounds over four years. French fries added more weight gain than boiled, baked or mashed potatoes, mainly because a serving of fries contains between 500 and 600 calories compared to a serving of baked potato at 280 calories.

Soda added a pound every four years. One alcoholic drink per day added 0.41 pounds, watching TV for one hour per day added 0.31 pounds, and recent smoking cessation added 5 pounds. Also, people who slept more or less than six to eight hours per night gained more weight.

Surprisingly, eating more yogurt and nuts every day had a bigger effect on weight loss than fruits and vegetables – scientists suggest this is possibly because they keep people fuller for longer.

They found that people who ate an extra portion of yogurt each day, compared to the study group as a whole, lost an average 0.82 pounds every four years. For nuts, the figure was 0.57 pounds, fruits 0.49 pounds, whole grains 0.37 pounds, and vegetables 0.22 pounds lost every four years.
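
Using only the per-item figures quoted in this article, one can make a rough, illustrative tally of a projected four-year weight change. The sketch below is a simplification for intuition, not the study’s statistical model.

    # Rough tally of four-year weight change from daily habits, using the
    # per-serving (or per-hour) figures quoted in this article.
    FOUR_YEAR_DELTA_LBS = {
        "potato chips": 1.69, "other potatoes": 1.28, "soda": 1.00,
        "alcoholic drink": 0.41, "hour of tv": 0.31,
        "yogurt": -0.82, "nuts": -0.57, "fruit": -0.49,
        "whole grains": -0.37, "vegetables": -0.22,
    }

    def projected_change(daily_habits):
        """Sum the quoted four-year deltas for a list of daily habits."""
        return sum(FOUR_YEAR_DELTA_LBS[habit] for habit in daily_habits)

    # A daily bag of chips plus a soda, versus daily yogurt and nuts:
    print(projected_change(["potato chips", "soda"]))   # +2.69 lbs over 4 years
    print(projected_change(["yogurt", "nuts"]))         # -1.39 lbs over 4 years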

The authors noted that this did not mean people could simply eat large amounts of these foods and lose weight.

“Conventional wisdom often recommends ‘everything in moderation,’ with a focus only on total calories consumed, rather than the quality of what is consumed,” said study author Dariush Mozaffarian, an associate professor of medicine and epidemiology at Harvard School of Public Health, in an email to Bloomberg. “Our results demonstrate that the quality of the diet, the types of foods and beverages that one consumed, is strongly linked to weight gain.”

“These findings underscore the importance of making wise food choices in preventing weight gain and obesity,” said Hu in a statement. “The idea that there are no ‘good’ or ‘bad’ foods is a myth that needs to be debunked.”

The studies included people who were largely female, educated and mostly white. The authors said more research is needed to see if the results are similar in other populations.

“It’s another way to support a healthy lifestyle and that also includes more physical activity, being less sedentary, less beverages, whether it’s sweetened or alcohol, things that we’ve known before,” Pi-Sunyer, who was not involved in the study, told Bloomberg in an interview. “It’s nice to have such a long study confirming all this.”

The study was funded by the National Institutes of Health and a foundation. Several researchers reported receiving fees from drug and nutrition companies.

Issuing new dietary guidelines earlier this year, the federal government ditched the food pyramid, a longtime symbol of healthy eating, in favor of a dinner plate divided into four sections containing fruits, vegetables, proteins and grains.


Women With Implants Likely To Need Additional Surgery

U.S. health regulators said Wednesday that women who get silicone breast implants are likely to need additional surgery within 10 years to address complications like rupturing of the device.

The Food and Drug Administration said it will work to revise safety labels for silicone breast implants after reviewing data from long-term studies.

The agency said the studies confirmed its decision that implants can be used safely, but said the conclusions could be limited because some women dropped out.

“The key point is that breast implants are not lifetime devices,” Jeff Shuren, director of the FDA’s Center for Devices and Radiological Health, said in a statement. “The longer you have the implant, the more likely you are to have complications.”

According to the American Society of Plastic Surgeons, there were about 400,000 breast enlargements or reconstruction procedures in the U.S. in 2010.

Studies showed that about 70 percent of all women who received surgery due to disease or trauma needed another operation within 10 years.

The FDA approved silicone gel-filled breast implants sold by Allergan and Johnson & Johnson’s Mentor unit in 2006.  The agency had banned silicone implants for most U.S. women in 1992 after some complained the devices leaked and made them ill.

Both companies were required to conduct post-approval studies of 40,000 women for 10 years as a condition of taking the device to market. 

The FDA said that based on the data, the most common complications were localized, such as hardening of the breast area around the implant, rupture or deflation of the device, and the need for additional surgeries.

Other complications include implant wrinkling, asymmetry, scarring, pain, and infection at the incision site.

The agency also reported a small correlation between implants and anaplastic large-cell lymphoma (ALCL), a form of cancer that affects about 3,000 Americans each year.

Shuren said from 1997 to 2010 there were about 60 cases of ALCL reported for women worldwide out of about 5 million to 10 million women who had breast implants.
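
Put in rough per-capita terms using the article’s own bounds, a quick calculation shows why Shuren calls the association rare:

    # Approximate rate implied by Shuren's numbers: ~60 reported ALCL cases
    # among 5 to 10 million women with implants, 1997-2010.
    for women in (5_000_000, 10_000_000):
        print(round(60 / women * 1_000_000, 1), "cases per million women")
    # prints 12.0 and then 6.0 cases per million women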

“If there’s a true association between that cancer and implants, it’s very, very rare,” he said in a statement.

The FDA said most long-term studies confirmed initial results.

“Most women reported high levels of satisfaction with their body image and the shape, feel and size of their implants,” the FDA report said.

The agency cautioned that post-approval study data could be limited due to the low response rates.

Allergan has collected preliminary two-year data for 60 percent of participants in the large 10-year studies of 80,000 women.

Diana Zuckerman, president of the advocacy group National Research Center for Women & Families, said most medical journals would not publish the studies cited by the FDA because of the missing data.

“So many of the women dropped out before the research was completed that it is impossible to say what percentage of women need additional surgery or have health problems five or 10 years after getting breast implants,” she said in a statement.

“This raises questions about FDA’s reliance on studies required after a product is approved,” she added.

The FDA said “both manufacturers have communicated to the FDA the difficulties in following women who have received silicone gel-filled breast implants.”

The FDA said it is working with Allergan and Mentor to increase participation and follow-up.


Yale Researchers Uncover Source Of Mystery Pain

An estimated 20 million people in the United States suffer from peripheral neuropathy, marked by the degeneration of nerves and in some cases severe pain. There is no good treatment for the disorder and doctors can find no apparent cause in one of every three cases.

An international team of scientists headed by researchers from Yale University, the Veterans Affairs Medical Center in West Haven and the University Maastricht in the Netherlands found that mutations of a single gene are linked to 30 percent of cases of unexplained neuropathy. The findings, published online June 22 in the Annals of Neurology, could lead to desperately needed pain treatments for victims of this debilitating disorder.

“For millions of people, the origin of this intense pain has been a frustrating mystery,” said Stephen Waxman, the Bridget Marie Flaherty Professor of Neurology and professor of neurobiology and of pharmacology and a senior co-author of the paper. “All of us were surprised to find that these mutations occur in so many patients with neuropathy with unknown cause.”

The study focused upon mutations of a single gene – SCN9A – which is expressed in sensory nerve fibers. Waxman’s group had discovered that mutations affecting this gene’s product – the protein sodium channel Nav1.7 – cause a rare disorder called “Man on Fire Syndrome,” characterized by excruciating and unrelenting pain. Colleagues in the Netherlands carefully scrutinized neuropathy patients to rule out all known causes of the neuropathy, such as diabetes, alcoholism, metabolic disorders and exposure to toxins. Researchers then did a genetic analysis of 28 patients with neuropathy with no known cause. They found 30 percent of these subjects had mutations in the SCN9A gene. The researchers found that the mutations cause nerve cells to become hyperactive, a change they believe eventually leads to degeneration of nerve fibers.

“These findings will help us as clinicians to a better understanding of our patients with small fiber neuropathy and could ideally have implications for the development of future specific therapies,” said Catharina G. Faber, who is a lead author of the study along with Ingemar Merkies of the Netherlands.


Nudging Doctors In Intensive Care Unit Reduces Deaths

Physicians for the critically ill need ‘copilots’ to remind them of important details

Caring for patients in a medical intensive care unit in a hospital and flying a 747 are complicated tasks that require tracking thousands of important details, some of which could get overlooked. That’s why the pilot has a checklist and a copilot to make sure nothing slips by.

A new Northwestern Medicine study shows the attending physician in the intensive care unit could use a copilot, too. The mortality rate plummeted 50 percent when the attending physician in the intensive care unit had a checklist – a fairly new concept in medicine – and a trusted person prompting him or her to address issues on the checklist if they were being overlooked. Simply using a checklist alone did not produce an improvement in mortality.

“Attending physicians are good at thinking about big picture issues like respiratory failure or whatever diagnosis brought a patient to the intensive care unit, but some important details are overlooked because it’s impossible for one person to remember and deal with all those details,” said Curtis Weiss, M.D., the lead investigator and a fellow in pulmonary and critical care medicine at Northwestern University Feinberg School of Medicine.

Weiss conducted the study in the medical intensive care unit at Northwestern Memorial Hospital. The study was published online in the American Journal of Respiratory and Critical Care Medicine and will appear in an upcoming print issue.

“We showed the checklist itself is just a sheet of paper,” Weiss said. “It’s how doctors interact with it and best implement it that makes it most effective. That’s how we came up with prompting.”

A checklist is a useful tool only if a physician gets continual reminders to use it to promote decision making, Weiss said, “rather than just being a piece of paper that gets shoved in someone’s face like busy work.”

For the study, Weiss and colleagues developed a checklist to be used by physicians in the medical intensive care unit. The checklist focused on important issues the researchers believed were being overlooked by physicians during daily rounds.

The checklist included six important and often overlooked parameters, such as testing whether a patient can be taken off a ventilator and checking the duration of empiric antibiotics (for suspected but not confirmed infections) and of central venous catheters.

“We observed that physicians sometimes wrote information on the checklist but were not using it to improve their decision making,” Weiss explained.

The study was designed to determine whether prompting physicians to use the checklist would affect the decisions they made about managing their patients’ care.

One team of physicians received frequent, face-to-face prompting by a resident physician to address issues on the checklist, but only when those issues were overlooked during daily rounds. The other team of physicians continued to use the checklist without such prompting.

The prompted physician team oversaw the care of 140 patients; the unprompted team oversaw 125 patients.

The prompting by a physician not actively involved in the patients’ care reduced mortality by 50 percent over three months. The saved lives may have resulted in part from reducing the time patients were on ventilators (thus reducing cases of ventilator-associated pneumonia) as well as reducing the number of days patients were on empiric antibiotics and central catheters. Prompting also cut patients’ intensive care unit length of stay, on average, by more than one day.

Researchers also wanted to see if using a checklist alone (without prompting) made any difference. They compared a pre-study group of almost 1,300 patients to patients in the study whose physicians used the checklist alone. The results: a checklist alone did not improve mortality or reduce the length of stay.

Having a subtle approach with the physicians was one key to the success of the prompting, Weiss said.

“We didn’t mandate that they had to change their management; it was nuanced,” Weiss said. “It was ‘do you plan to continue the antibiotics today?’ not ‘you should stop the antibiotics.'”

Weiss concedes hospitals aren’t likely to hire physicians just to be prompters. But perhaps nurses or even an electronic version of the verbal prompting could be equally effective, he said.

“It should be fresh eyes or someone from the existing team who is assigned to concentrate on these issues,” Weiss said. “What matters is that someone is specifically thinking about these issues.”


Jack In The Box To Remove Toys From Kids’ Meals

Fast-food chain Jack in the Box has pulled toys from its kids’ meals, a spokesman said Tuesday.

According to Reuters, the move comes as fast-food companies face pressure to stop using toys to market children’s meals that are high in calories, sugar, fat and salt.

Lawmakers in San Francisco and nearby Santa Clara County passed laws that will require kids’ meals to meet certain nutritional standards before they can add toys to the mix.

“While we’ve been aware of efforts to ban the inclusion of toys in kids’ meals, that did not drive our decision,” Jack in the Box spokesman Brian Luscomb said in a statement.

“Our advertising and promotions have focused exclusively on the frequent fast-food customer, not children,” added Randy Carmical, also a Jack in the Box spokesman.

Carmical said the company has been more focused on the food in its meal for children, such as grilled cheese sandwiches or grilled chicken strips.  He said the company pulled toys from the meals when it began offering parents the option of substituting sliced apples with caramel sauce for French fries.

“We believe that providing these kinds of options is more appealing to a parent than packaging a toy with lower-quality fare,” Carmical said in a statement.

Organizations and advocates critical of the fast-food industry have praised Jack in the Box’s efforts.

“It’s terrific that Jack in the Box has taken this step,” Margo Wootan, nutrition policy director at the Center for Science in the Public Interest (CSPI), told Reuters.  “It’s really a monumental step that I hope their competitors will emulate.”

CSPI sued McDonald’s Corp. in December to stop it from using Happy Meal toys to lure children into its restaurants.

Consumer and health advocates are using the announcement to put pressure on McDonald’s, Burger King, Taco Bell and other fast food chains that still include toys in kids’ meals.

According to Wootan, toy giveaways make up over half the marketing expenditures in the fast-food industry, with $360 million spent annually to put toys in kids’ meals.

Jack in the Box has about 2,200 restaurants in the U.S. and is the fifth-largest hamburger chain.


Tyrannosaurus Rex Was A Pack Hunter

According to new research, Tyrannosaurus rex, which has traditionally been seen as a lone-wolf kind of dinosaur, hunted in packs.

The new research, based on finds in the Gobi Desert, suggests that the species was equipped not only with the build and speed for pack hunting, but also the brain capacity to work together as a team.

Dr. Philip Currie of the University of Alberta said evidence from 90 skeletons of Tarbosaurus bataar, a cousin of Tyrannosaurus rex, suggests that about half a dozen of the dinosaurs were part of a social group that died together.

He said tyrannosaurids’ hunting technique may have involved juveniles chasing and catching prey.

Currie said younger tyrannosaurids’ skeletons show they would have been faster and more agile than adults, which were slower but much heavier and more powerful.

The similarities within the tyrannosaurid family mean that Tyrannosaurus rex would likely have been capable of the same behavior as its cousins, according to Currie.

“We now have a lot of sites worldwide which show these Tyrannosaurids were grouping animals which at certain times did get together into gangs, either to hunt or move from one region to another,” he said in a statement to The Telegraph.

“Moving in gangs suggests that they were behaviorally more complex than we think dinosaurs should be, and CAT scans also show their brain size was about three times what you would expect for an animal of that size.”

“A dinosaur like the Tyrannosaurus Rex would have a much larger brain in proportion to its body size than a crocodile, and three times that of a plant-eating dinosaur like a Triceratops of the same size.”

The new research will be explained in a documentary, “Dino Gangs,” airing on the Discovery Channel Sunday evening.


Whining Proves To Be Most Distracting, Annoying Sound

A new study found that whining is more distracting than any other noise tested.

The researchers had people do subtraction problems while listening to an infant crying, regular speech, silence, whining, a high-pitched table saw and adult baby talk.

They used a foreign language for speech samples in the study to ensure people were not distracted by the words themselves.

People made more mistakes per math problem completed when listening to the whines than to any of the other speech patterns or noises.

“You’re basically doing less work and doing it worse when you’re listening to the whines,” study co-author Rosemarie Sokol Chang, a professor of psychology at SUNY New Paltz, said in a statement. “It doesn’t matter if you’re a man or a woman, everybody’s equally distracted.”

People completed fewer subtraction problems while listening to the whining, crying and baby talk than when it was completely quiet.

Chang said there were no differences based on gender or parental status for the number of problems completed.

The paper was published in the Journal of Social, Evolutionary, and Cultural Psychology.


A Wise Man’s Treatment For Arthritis: Frankincense?

The answer to treating painful arthritis could lie in an age-old herbal remedy – frankincense, according to Cardiff University scientists.

Cardiff scientists have been examining the potential benefits of frankincense to help relieve and alleviate the symptoms of the condition.

“The search for new ways of relieving the symptoms of inflammatory arthritis and osteoarthritis is a long and difficult one,” according to Dr Emma Blain, who leads the research with her co-investigators Professor Vic Duance from Cardiff University’s School of Biosciences and Dr Ahmed Ali of the Compton Group.

“The South West of England and Wales has a long standing connection with the Somali community who have used extracts of frankincense as a traditional herbal remedy for arthritic conditions.

“What our research has focused on is whether and how these extracts can help relieve the inflammation that causes the pain,” she added.

The Cardiff scientists believe they have been able to demonstrate that treatment with an extract of Boswellia frereana – a rare frankincense species – inhibits the production of key inflammatory molecules, helping prevent the breakdown of the cartilage tissue that causes the condition.

Dr Ali adds: “The search for new drugs to alleviate the symptoms of conditions like inflammatory arthritis and osteoarthritis is a priority area for scientists. What our research has managed to achieve is to use innovative chemical extraction techniques to determine the active ingredient in frankincense.

“Having done this we are now able to further characterise the chemical entity and compare its success against other anti-inflammatory drugs used for treating the condition.”

The research comes as a result of a seedcorn project, funded by the Severnside Alliance for Translational Research (SARTRE), through the MRC Developmental Pathway Funding Scheme devolved portfolio.

SARTRE is a joint project between Cardiff University and the University of Bristol to combine and accelerate translational research.


NIU Scientists Discover Simple, Green And Cost-Effective Way To Produce High Yields Of Highly Touted Graphene

Scientists at Northern Illinois University say they have discovered a simple method for producing high yields of graphene, a highly touted carbon nanostructure that some believe could replace silicon as the technological fabric of the future.

The focus of intense scientific research in recent years, graphene is a two-dimensional material, composed of a single layer of carbon atoms arranged in a hexagonal lattice. It is the strongest material ever measured and has other remarkable qualities, including high electron mobility, a property that elevates its potential for use in high-speed nano-scale devices of the future.

In a June communication to the Journal of Materials Chemistry, the NIU researchers report on a new method that converts carbon dioxide directly into few-layer graphene (less than 10 atoms in thickness) by burning pure magnesium metal in dry ice.

“It is scientifically proven that burning magnesium metal in carbon dioxide produces carbon, but the formation of this carbon with few-layer graphene as the major product has neither been identified nor proven as such until our current report,” said Narayan Hosmane, a professor of chemistry and biochemistry who leads the NIU research group.
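
For reference, the underlying reaction Hosmane describes is presumably the familiar combustion of magnesium in carbon dioxide, with the carbon product now identified as few-layer graphene:

    2 Mg + CO2 → 2 MgO + C (carbon recovered as few-layer graphene)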

“The synthetic process can be used to potentially produce few-layer graphene in large quantities,” he said. “Up until now, graphene has been synthesized by various methods utilizing hazardous chemicals and tedious techniques. This new method is simple, green and cost-effective.”

Hosmane said his research group initially set out to produce single-wall carbon nanotubes. “Instead, we isolated few-layer graphene,” he said. “It surprised us all.”

“It’s a very simple technique that’s been done by scientists before,” added Amartya Chakrabarti, first author of the communication to the Journal of Materials Chemistry and an NIU post-doctoral research associate in chemistry and biochemistry. “But nobody actually closely examined the structure of the carbon that had been produced.”

Other members of the research group publishing in the Journal of Materials Chemistry include former NIU physics postdoctoral research associate Jun Lu, NIU undergraduate student Jennifer Skrabutenas, NIU Chemistry and Biochemistry Professor Tao Xu, NIU Physics Professor Zhili Xiao and John A. Maguire, a chemistry professor at Southern Methodist University.

The work was supported by grants from the National Science Foundation, the Petroleum Research Fund administered by the American Chemical Society, the Department of Energy and the Robert A. Welch Foundation.


Researchers Find Process Of Cervical Ripening Differs Between Term And Preterm Birth

Cervical ripening that instigates preterm labor is distinct from what happens at the onset of normal term labor, researchers at UT Southwestern Medical Center have found.

The findings challenge the conventional premise that premature cervical ripening and remodeling is likely just an accelerated version of the term labor process, and that normal term ripening is caused primarily by activation of inflammatory responses.

Cervical remodeling is the process by which the cervix is transformed to open sufficiently during the birth process.

“Premature cervical remodeling can occur by more than one mechanism and is not necessarily an acceleration of the physiologic process in term labor. Depending on the cause of preterm birth, that mechanism can vary,” said Dr. Mala Mahendroo, associate professor of obstetrics and gynecology and the Cecil H. and Ida Green Center for Reproductive Biology Sciences at UT Southwestern, and senior author of the study published in a recent issue of Endocrinology.

The study has been selected by the Faculty of 1000 – an international group of more than 10,000 leading scientists and researchers – to be in its top 2 percent of published articles in biology and medicine.

Previous studies suggest that in term or preterm labor, white blood cells flood into the cervix and release enzymes that break down tissue support and remodel the cervix, allowing a baby to pass through the birth canal. That’s only half-right, researchers in this investigation report.

“The immune system or inflammatory response is sufficient to cause cervical ripening, but it’s not absolutely necessary for it to happen,” said Dr. Brenda Timmons, research scientist in obstetrics and gynecology and co-lead author of the study.

Nearly 13 percent of all births in the U.S. are preterm. Premature infants can suffer respiratory distress, intraventricular hemorrhage and even cerebral palsy. Identified risk factors for preterm birth include smoking, alcohol consumption, advanced maternal age, genetics, cervical insufficiency, previous preterm birth and infection.

“In about half of all preterm births, the cause is unknown. It’s critical to determine the multiple causes of preterm birth so that effective therapies can be developed for each kind,” said Dr. Roxane Holt, a maternal-fetal medicine fellow and co-lead author of the study.

“When patients present in preterm labor, we don’t have a lot of therapy to stop the labor,” she said.

UT Southwestern researchers compared preterm birth models in mice. They injected lipopolysaccharide (LPS) to promote infection-like conditions and an inflammatory response in one model. In the other, they administered mifepristone (RU486) to simulate the withdrawal of the gestation-supporting hormone progesterone, which normally takes place at the end of a pregnancy.

Researchers report that cervical changes in inflammation-induced conditions are caused by an influx of white blood cells and an increased expression of pro-inflammatory markers with no increase in the expression of genes induced in term ripening. Preterm ripening induced by progesterone withdrawal results from the combined activation of processes that occur during term ripening and shortly postpartum.

“These findings, if translatable in women, suggest one therapy may not be effective for all preterm births, and that early identification of the cause of prematurity is necessary to determine the correct therapy,” Dr. Mahendroo said.


Barrett’s Esophagus and Cancer Risk

(Ivanhoe Newswire) — Patients with Barrett’s esophagus may have a lower risk of esophageal cancer than previously thought, according to this study.

Barrett’s esophagus is a precancerous condition, and patients who have it are often advised to have regular endoscopies to watch for signs of esophageal adenocarcinoma, the most common kind of esophageal cancer. However, how often Barrett’s esophagus progresses to cancer has not been clear.

In this study, Shivaram Bhat, B.Ch., MRCP, of Queen’s University Belfast and colleagues followed 8,522 patients in the Northern Ireland Barrett’s Esophagus Registry, one of the largest registries in the world of people with the condition. After an average follow-up time of 7 years, 79 patients were diagnosed with esophageal cancer, 16 with cancer of the gastric cardia (the part of the stomach closest to the esophagus), and 36 with precancerous changes known as high-grade dysplasia. In the entire group, the incidence of these three conditions combined was 0.22% per year. Previous studies have reported an incidence of cancer among people with Barrett’s esophagus between 0.58 percent and 3 percent per year.
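
The 0.22% figure follows directly from the numbers above; a quick back-of-the-envelope check (approximating person-years as patients times mean follow-up) reproduces it:

    # Back-of-the-envelope check of the combined incidence figure.
    events = 79 + 16 + 36          # esophageal cancer + gastric cardia + HGD
    person_years = 8522 * 7        # patients x average years of follow-up
    print(round(100 * events / person_years, 2))   # ~0.22 (% per year)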

Men were significantly more likely to progress to malignancy than women, and people age 60-69 had a higher risk than those under 50 or those age 80 and over. The highest rates of progression were among patients with low-grade dysplasia (1.40 percent) or specialized intestinal metaplasia (0.38 percent) at their initial endoscopy and biopsy.

The authors conclude that the risk of Barrett’s esophagus progressing to esophageal cancer is less than previously reported and that this finding has implications for clinical practice.

“Current recommendations for surveillance are based on higher estimates of cancer risk among patients with [Barrett’s esophagus] than were seen in this study and therefore, they may not be justified,” they were quoted as saying.

SOURCE: Journal of the National Cancer Institute, published online June 16, 2011

Improving Access To Essential Medicines Through Public-Private Partnerships

How the private sector can improve the distribution of essential health products to remote areas of Sub-Saharan Africa

A report released today by the International Vaccine Access Center (IVAC) at Johns Hopkins Bloomberg School of Public Health asks why products like Coca-Cola can reach remote villages in developing nations while essential medicines like antibiotics cannot always be found. The report, entitled Improving Access to Essential Medicines Through Public-Private Partnerships, documents the poor availability of essential health products (EHPs) in Sub-Saharan Africa and explores how to improve EHP distribution via collaborations with the private sector.

Focusing on the distribution stage in the EHP supply chain, the report examines the causes of bottlenecks at this stage. Distributors of consumer packaged goods (CPGs), such as food, beverages, tobacco, and mobile phone refill cards, have been more successful at reaching remote locations under difficult conditions than distributors of essential medicines. In the most remote villages of Africa, a person is more likely to find a kiosk with mobile phone cards in stock than a clinic with basic antibiotics in stock.

“Global efforts to improve access to essential medicines and vaccines have often focused on procurement and financing but not enough on distribution, especially to ‘last-mile’ populations,” said Orin Levine, Executive Director of IVAC. “By capturing best practices from the private sector, we think we can improve distribution systems and enhance access while saving both lives and money.”

In 2007, 151 million vaccine doses were wasted in developing countries due to improper refrigeration. A study by the GAVI Alliance suggested that 25 million doses of pentavalent DTP-HepB-Hib vaccine, valued at $80 million, could be saved in developing countries by eliminating unnecessary wastage from heat damage, freeze damage or disposal of unused portions of multidose vials.

“Companies selling soda and mobile phone cards work in the same hard-to-reach markets in Sub-Saharan Africa as essential medicine distributors, but they have been far more successful at modifying their systems and aligning incentives to overcome distribution barriers,” said Kyla Hayford, a PhD Candidate in the Department of International Health at Johns Hopkins Bloomberg School of Public Health and an author of the report. Public-private partnerships between the global health community and the private sector can leverage the strengths of CPG companies to improve the availability of essential medicines through knowledge exchange, shared infrastructure, appropriate performance-monitoring metrics, and investment in product innovation.

In Sub-Saharan Africa, 30-50% of the population lacks access to essential medicines. “The stakes for getting distribution of essential medicines right in the developing world are huge,” said Lois Privor-Dumm, Director of Alliances & Information at the International Vaccine Access Center (IVAC) at the Johns Hopkins Bloomberg School of Public Health. “There appears to be a lot we can learn from both the formal and informal structures of consumer packaged goods companies that could ultimately save lives and prevent unnecessary suffering.”

GPs Missing Early Dementia

People presenting with memory problems and mild dementia are often not diagnosed promptly in primary care

New research from the University of Leicester demonstrates that general practitioners (GPs) are struggling to correctly identify people in the early stages of dementia, resulting in both missed cases (false negatives) and misidentifications (false positives).

Researchers from the University of Leicester in the UK, the National Collaborating Centre for Mental Health in London, and the Department of General Practice in Dusseldorf, Germany, examined 30 previous studies involving 15,277 people seen in primary care for cognitive disorders, including 7,109 assessed for dementia.

Although GPs managed to identify eight out of ten people with moderate to severe dementia, most patients with early dementia were not recognized. Only 45% of people with early dementia and mild cognitive impairment were identified. Mild cognitive impairment is a condition that may precede dementia in some people.

Across the whole spectrum, GPs identified 3 out of 5 people attending for broadly defined memory problems.

Dr Alex Mitchell, a consultant psychiatrist with the Leicestershire Partnership NHS Trust and a researcher at the University, said: “This study highlights for the first time that GPs trying to identify dementia actually make more false positive errors, with misidentifications outnumbering missed cases at least two to one.”

“GPs working in busy settings struggle to identify early dementia and prodromal conditions based on their initial clinical judgement. This was particularly the case for patients living alone where no informant was available and when patients had relatively preserved daily function. Furthermore, GPs’ attitudes towards dementia may play an important role in dementia recognition. A project within the German Competence Network Degenerative Dementias (CNDD) at the University of Dusseldorf is currently investigating this.

“Conversely patients with depression or hearing problems were more at risk of being misidentified with dementia. However, the main influence is severity. Patients with mild dementia may not volunteer troubling memory problems and GPs are often unsure about the value of screening tests. Given the problem of false positives and false negatives we found that the application of a simple cognitive screening test after a clinical diagnosis would help GPs to achieve about 90% accuracy. We report separately which screening test may be best in Am J Geriatr Psychiatry 2010;18:759.”
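
A toy calculation helps show why misidentifications can outnumber missed cases even when clinical judgement is reasonably good: most people attending with memory complaints do not have dementia, so even moderate specificity produces many false positives. In the sketch below, only the three-in-five detection rate comes from the study; the prevalence and specificity values are assumed purely for illustration:

    # Toy confusion-matrix illustration. Only the 3-in-5 detection rate is from
    # the study; prevalence and specificity are assumed for illustration.
    attendees = 1000
    prevalence = 0.20    # assumed share of attendees who truly have dementia
    sensitivity = 0.60   # "3 out of 5" identified, per the study
    specificity = 0.80   # assumed

    true_cases = attendees * prevalence            # 200
    non_cases = attendees - true_cases             # 800
    missed = true_cases * (1 - sensitivity)        # false negatives: 80
    misidentified = non_cases * (1 - specificity)  # false positives: 160
    print(misidentified / missed)                  # -> 2.0, roughly two to one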

Research Reveals That 10% Of Middle-Aged Europeans Are On Antidepressants

New research from the University of Warwick and the IZA Institute in Bonn shows that 10% of middle-aged Europeans took antidepressants in 2010. The researchers looked in detail at the lives of a randomly selected sample of nearly 30,000 Europeans. The study covered 27 countries.

Andrew Oswald, an economics professor at the University of Warwick and co-author of the study, described the results as concerning: “Antidepressants are a relatively new kind of commodity. We are only starting to get proper data on who takes them. But as we live in the richest and safest era in the history of humans, perhaps we are going to have to ask ourselves why one in ten of Europe’s middle-aged citizens need a pill to cope with life. That is an awful lot of people relying on chemical happiness.”

In detail, the authors of the report find:

(i) One in thirteen adult European citizens — and 10% of middle-aged Europeans — took an antidepressant in the previous twelve months;

(ii) Rates of antidepressant use are highest in Portugal, and also noticeably above the European norm in Lithuania, France and the UK;

(iii) The probability of taking an antidepressant is greatest among those who are middle-aged, female, unemployed, with low levels of education, and divorced or separated;

(iv) A strong hill-shaped age pattern is found — both for males and females and in Western and Eastern Europe — that peaks in people’s late 40s. The study adjusts for whether individuals have young children, so children are not the cause of the midlife low in well-being.

(v) This pattern is consistent with, and independently helps corroborate, the recent finding across the world that happiness and mental health follow an approximate U-shape through life. The scientific explanation for that midlife low is still unknown.

The new study, “Antidepressants and Age”, by David G. Blanchflower and Andrew J. Oswald, is available as a Discussion Paper in the Publications section of the IZA Institute site.

70% Of Cocaine In The US Is Contaminated

Doctors warned in a report on Monday that a veterinary medication is being used to dilute 70 percent of the cocaine in the U.S., causing serious skin reactions after use.

The report said six patients developed purple-colored patches of necrotic skin on their ears, noses, cheeks and other parts of their bodies and suffered permanent scarring after they used cocaine.

Doctors previously reported two similar cases in San Francisco.  Others reported on users of contaminated cocaine who developed a related life-threatening immune-system disorder called agranulocytosis, which kills 7 percent to 10 percent of patients.

The U.S. Department of Justice reported that up to 70 percent of cocaine in the U.S. is contaminated with the drug levamisole, which is widely available and commonly used for deworming livestock. Levamisole had been prescribed for humans in the past but was discontinued after it was found to cause serious side effects.

“We believe these cases of skin reactions and illnesses linked to contaminated cocaine are just the tip of the iceberg in a looming public health problem posed by levamisole,” Noah Craft, MD, PhD, a Los Angeles Biomedical Research Institute at Harbor-UCLA Medical Center (LA BioMed) principal researcher and author of the report, said in a statement.

“We published this report to educate the public to the additional risks associated with cocaine use and to increase awareness among physicians who may see patients with these skin reactions that are a clue to the underlying cause of the disease.

“Because this reaction can commonly be mistaken as an autoimmune disease called vasculitis, it is important for physicians to know about this new disease entity.”

Craft said he and colleagues were initially baffled by the severity of the skin damage. 

The researchers said they began discussing the skin damage seen in emergency rooms in New York and Los Angeles during a conference call and realized they were all seeing similar patterns.

The cases were pooled and added to a professional database immediately so that other physicians could see this new diagnosis.

“We have had several more cases since we wrote this report,” he said in a statement. “In one of the more interesting ones, the patient used cocaine again and developed the same skin reaction again. He then switched drug dealers and the problem cleared up.”

The report was published in the Journal of the American Academy of Dermatology.

Image: Doctors warn of a potential public health epidemic after treating patients who developed serious skin reactions after smoking or snorting cocaine believed to be contaminated with a veterinary medication. Credit: Dr. Noah Craft

Driving Causes Higher Risk Of Skin Cancer In Left Arm

According to a new study, people in the U.S. are more susceptible to skin cancer on the left side of their bodies, possibly due to driving.

Researchers from the University of Washington in Seattle said driving may be to blame because the left arm receives more UV rays.

The researchers said that when skin cancer occurred on one side of the body, 52 percent of melanoma cases and 53 percent of Merkel cell cases developed on the left side.

The study provides the strongest evidence to date of a left side bias in skin cancer cases in the U.S.

The National Cancer Institute says that in 2010, over 68,000 people were diagnosed with melanoma, and 8,700 people died from the disease.

Researchers found during a 1986 study that Australian men were more likely to show precancerous growths on the right side of their bodies.

Car windows do offer some protection by blocking most UVB rays, which is an intense form of UV that often causes sunburns.

“The reality is that any of the glass in the car will get out most of the bad UV,” study co-author Paul Nghiem said in a statement.

He said that UVA rays penetrate glass and can still cause damage to the skin over time.

Nghiem says for most people who drive with their side window closed, there is no reason to apply sunscreen before driving.  However, the study says for drivers prone to skin cancer who spend large amounts of time driving, sunscreen may be a good idea.

A 2003 study on UV exposure in cars advised professional drivers to keep their windows rolled up and to use air conditioning. 

“Truckers would certainly be a group who would want to be aware of UV exposure while driving,” Kelly Paulson, a co-author of the University of Washington study, said in a statement.

The study was published online in April by the Journal of the American Academy of Dermatology.

International Team Works Out Secrets Of One Of World’s Most Successful Patient Safety Programs

How a dramatic reduction in infections at 100 intensive care units was achieved is revealed

A team of social scientists and medical and nursing researchers in the United States and the United Kingdom has pinpointed how a programme, which ran in more than 100 hospital intensive care units in Michigan, dramatically reduced the rates of potentially deadly central line bloodstream infections to become one of the world’s most successful patient safety programmes.

Funded in part by the Health Foundation in the UK, the collaboration between researchers at the Johns Hopkins University, the University of Leicester and the University of Pennsylvania, has led to a deeper understanding of how patient safety initiatives like the Michigan programme can succeed.

“Explaining Michigan: developing an ex post theory of a quality improvement programme” by Mary Dixon-Woods and Emma-Louise Aveling of the University of Leicester; Charles Bosk of the University of Pennsylvania and Christine Goeschel and Peter Pronovost of Johns Hopkins University, is published in the June 2011 edition of Milbank Quarterly.

“We knew this programme worked. It not only helped to eliminate infections, it also reduced patient deaths,” said programme leader Peter Pronovost of the Johns Hopkins University School of Medicine, who was named as one of Time Magazine’s 100 most influential people in 2008 and was the recipient of a MacArthur Fellowship, or ‘genius grant,’ from the John D. and Catherine T. MacArthur Foundation. “The challenge was to figure out how it worked”.

The researchers found that one of the Michigan programme’s most important features is that it explicitly outlined what hospitals had to do to improve patient safety, while leaving specific requirements up to the hospital personnel. A critical aspect of the programme was convincing participants that there was a problem capable of being solved together.

“It was achieved by a combination of story-telling about real-life tragedies of patients who came to unnecessary harm in hospital, and using hard data about infection rates,” said co-author Charles Bosk, a professor of sociology in Penn’s School of Arts and Sciences and a senior fellow in the Center for Bioethics at Penn.

Infection rates were continuously monitored at hospitals participating in the programme, making it easier for hospital workers to track how well they were doing and where they needed to improve.

The authors conclude that there are important lessons for others attempting patient safety improvements. Checklists were an essential component, but not necessarily the most important element of the Michigan programme.

“The programme was much more than a checklist,” said lead author Mary Dixon-Woods, professor of medical sociology at the University of Leicester, “It involved a community of people who over time created supportive relationships that enabled doctors and nurses in many hospitals to learn together, share good practice, and exert positive pressure on each other to achieve the best outcomes for patients.”

“What we have learned is that it is the local teams that deliver the results”, said Dr Bosk. “But they need to be well supported by a core project team, who have to focus on enabling hospital workers to get things right. That means providing them with scientific expertise to justify the changes they are being asked to make, and standardising measures so they are all collecting the same data. It also means trying to figure out why simple changes that make life better are so difficult for health care delivery systems to do. Getting the whole programme to work, rather than compliance with any single component, is the key to making health care safer for patients.”

“No one discipline has the answer to patient safety problems. We have to bring together contributions from clinical medicine and the social sciences to make real progress in this area,” added Dr Pronovost. This month, Dr. Pronovost was named director of Johns Hopkins’ newly formed Armstrong Institute for Patient Safety and Quality and senior vice president for patient safety and quality.

Stem Cell Trials To Begin For Blindness Treatment

The first two patients with common but incurable diseases of the eye that can lead to blindness have been enrolled in two groundbreaking early-stage clinical trials of a therapy that researchers hope will heal the damage caused by the conditions.

Advanced Cell Technology Inc. (ACT) announced Thursday the enrollment of the patients in the trials for Stargardt’s Macular Dystrophy (SMD) and Dry Age-Related Macular Degeneration (Dry AMD). The patients were enrolled at the Jules Stein Eye Institute at the University of California, Los Angeles.

ACT won approval by the US Food and Drug Administration in January to use human embryonic stem cells for treating macular degeneration, a common cause of vision loss. That followed FDA approval in November for scientists to test the stem cells to treat people with Stargardt’s Macular Dystrophy. The new trials will test the safety and tolerability of retinal pigment epithelial, or RPE cells, which ACT makes from the human embryonic stem cells.

“The enrollment of the first patients in our two clinical trials marks an important step forward for the field of regenerative medicine,” said Gary Rabin, interim chairman and CEO of ACT. “We are very pleased with the progress that has been made toward bringing this ground-breaking technology to the patients who need it most.”

A total of 24 patients will take part in the two separate trials, said representatives from the Massachusetts-based ACT. Starting in July, the remaining 22 participants will be officially enrolled into the study, according to a company spokeswoman.

The medical team hopes to slow, halt or even reverse the effects of the conditions by injecting the healthy RPE cells into the eye.

“These trials mark a significant step toward addressing what is one of the largest unmet medical needs of our time, treatments for otherwise untreatable and common forms of legal blindness,” lead investigator Steven Schwartz at University of California Los Angeles Jules Stein Eye Institute, told AFP.

Each of the two studies will have 12 patients, with groups of three testing different doses of the RPE cells. Dr. Robert Lanza, chief scientific officer of ACT, said in an email that the company planned to start stem cell transplants within the next few weeks.

“After a decade of extensive research and preclinical studies, it is very satisfying to finally be moving into the clinic,” Lanza told Reuters in a statement. “We hope that these cells will, in the future, provide a treatment not only for these two untreatable diseases — Stargardt’s disease and macular degeneration — but for patients suffering from a range of other debilitating eye diseases.”

Dry AMD is the most common form of macular degeneration and the leading cause of blindness in the developed world, according to Dr. Schwartz. The number of cases is expected to double over the next 20 years as the population ages, he said.

Currently, there is no cure for AMD, which affects more than 10 million Americans and another 10 million in Europe, ACT said.

Stargardt’s disease causes blindness by destroying the pigmented layer of the retina. After that follows degradation of photoreceptors, which are the cells in the retina that detect light. Patients with Stargardt’s often experience blurred vision, difficulty seeing in low-light conditions and eventually most lose their ability to see at all. The disease can be inherited by a child when both parents carry the gene mutation that causes it.

The trials announcement is a milestone for ACT, which has been developing the therapy for the past decade.

The first trial will treat patients with dry AMD. The second focuses on Stargardt’s, which generally strikes younger people between the ages of 10 and 20. The early-stage trials will be assessed by doctors over a 12-month period.

If the treatment works as well as doctors hope, the injected RPE cells will grow and eventually restore the retina to a healthy state able to support light-sensitive cells required for eyesight.

“If these therapies work as we hope they will, particularly with small volumes of cells, then we should be in an excellent position to take advantage of our patented techniques for manufacturing large numbers of doses of RPE cells that can be conveniently stored and shipped to clinicians following the basic manufacturing and distribution systems already familiar to pharmaceutical and biotech companies,” Rabin said.

Animal studies have reportedly shown that injecting fresh RPE cells into the eye could bring about a substantial improvement in eyesight. In other studies, scientists said mice with eye disease recovered near-normal vision after receiving the therapy.

Geron Corp last October enrolled its first patient in an approved study of human embryonic stem cells with the goal of treating people with spinal cord injuries.

Scientists around the world hope to be able to use the stem cells to address not only spinal cord injuries and eye diseases, but also cancer, diabetes, and Alzheimer’s and Parkinson’s diseases.

Opponents of human embryonic stem cell research object to their use because in order to get the cells, someone has to take apart a human embryo. The Obama administration last year overturned the strictest limitations on using federal funds for the research, but the policy was challenged by two researchers. In April, a US appeals court ruled that funding can continue.

In addition to the Jules Stein Eye Institute at UCLA, the Casey Eye Institute (CEI) at Oregon Health & Science University (OHSU) in Portland, OR, is also open for enrollment of patients with SMD.

As additional sites are ready to enroll patients with SMD and dry AMD, they will be listed on the Clinical Trials page on ACT’s Web site and at http://www.clinicaltrials.gov.

Surgery Restores Sight After 55 Years

After being hit by a thrown rock in the 1950s, when he was just eight years old, a man has had his vision restored following treatment for glaucoma, The Telegraph is reporting.

The unnamed man, now 63 years of age, entered the New York Eye and Ear Infirmary complaining of persistent pain and redness in the blind eye. Doctors found he had glaucoma and high eye pressure.

Once his eye pressure had stabilized, they treated the neovascular glaucoma using monoclonal antibody therapy and found that, against all odds, the patient regained light perception. Encouraged, the doctors suggested reattaching the retina.

After an operation to do so, the man had recovered his sight to such an extent that he could count fingers more than 5 yards away, BBC news reports. Infirmary physician Dr. Olusola Olawoye explained, “To the best of our knowledge this is the first report of visual recovery in a patient with long-standing traumatic retinal detachment.”

“This is not only a great result for our patient but has implications for restoring eyesight in other patients. In the future retinal reattachments after long periods could be aided with the use of stem cells to regenerate diseased retinas,” he added.

After a year, the patient required further retinal surgery due to the scars inside his eye forcing parts of the retina to become detached again. However this second surgery was also successful.

How Does Identification With An Organization Enhance Values?

Strongly identifying with an organization or workplace can change people’s lives in profound ways, according to a new study in the Journal of Consumer Research.

“Managers often hope that consumers identify with organizations they regularly patronize, and firms sometimes encourage employees to identify with the firms they work for, because in both cases organizations benefit from such identification,” write authors Melea Press and Eric J. Arnould (both University of Wyoming, Laramie). The authors focus on identification formation from the perspective of consumers, whose personal, economic, and social lives are affected by organizations.

The authors conducted interviews with consumers who had recently joined a Community Supported Agriculture (CSA) program. They also interviewed employees at an advertising agency, at all levels from receptionist to CEO.

In the interviews, the authors learned how consumers learn to integrate values and behaviors from within and beyond the organization, often in life-changing ways. “So, a new CSA member learns how and why he should appreciate locally grown organic vegetables, and then begins to find additional opportunities to buy other organic and locally made products more generally,” the authors explain.

“Similarly, an employee learns the value of making clear, considered, and creative choices and brings that value into her personal life as she reduces her consumer debt and even makes better choices for romantic partners,” the authors explain.

For some consumers, identification comes suddenly, as an epiphany, whereas others take more gradual paths, emulating mentors who are forging new ways of living. For example, over time one CSA member who was a workaholic used his experience with the organization to reassess his lifestyle, ultimately cutting his work schedule back dramatically to spend more time with his family.

Low-Carbohydrate, High-Protein Diets May Reduce Both Tumor Growth Rates And Cancer Risk

Eating a low-carbohydrate, high-protein diet may reduce the risk of cancer and slow the growth of tumors already present, according to a study published in Cancer Research, a journal of the American Association for Cancer Research.

The study was conducted in mice, but the scientists involved agree that the biological findings are strong enough that a similar effect in humans can be considered.

“This shows that something as simple as a change in diet can have an impact on cancer risk,” said lead researcher Gerald Krystal, Ph.D., a distinguished scientist at the British Columbia Cancer Research Centre.

Cancer Research editor-in-chief George Prendergast, Ph.D., CEO of the Lankenau Institute for Medical Research, agreed. “Many cancer patients are interested in making changes in areas that they can control, and this study definitely lends credence to the idea that a change in diet can be beneficial,” said Prendergast, who was not involved with the study.

Krystal and his colleagues implanted various strains of mice with human tumor cells or with mouse tumor cells and assigned them to one of two diets. The first diet, a typical Western diet, contained about 55 percent carbohydrate, 23 percent protein and 22 percent fat. The second, which is somewhat like a South Beach diet but higher in protein, contained 15 percent carbohydrate, 58 percent protein and 26 percent fat. They found that the tumor cells grew consistently slower on the second diet.

As well, mice genetically predisposed to breast cancer were put on these two diets; almost half of those on the Western diet developed breast cancer within their first year of life, while none on the low-carbohydrate, high-protein diet did. Interestingly, only one mouse on the Western diet reached a normal life span (approximately 2 years), with 70 percent of them dying from cancer, while only 30 percent of those on the low-carbohydrate diet developed cancer and more than half of these mice reached or exceeded their normal life span.

Krystal and colleagues also tested the effect of an mTOR inhibitor, which inhibits cell growth, and a COX-2 inhibitor, which reduces inflammation, on tumor development, and found these agents had an additive effect in the mice fed the low-carbohydrate, high-protein diet.

When asked to speculate on the biological mechanism, Krystal said that tumor cells, unlike normal cells, need significantly more glucose to grow and thrive. Restricting carbohydrate intake can significantly limit blood glucose and insulin, a hormone that has been shown in many independent studies to promote tumor growth in both humans and mice.

Furthermore, a low-carbohydrate, high-protein diet has the potential to both boost the ability of the immune system to kill cancer cells and prevent obesity, which leads to chronic inflammation and cancer.

Curbing Soot, Smog Could Help Limit Global Temperature Rise

A United Nations report released Tuesday calls for fast action on reducing emissions of black carbon, ground-level ozone and methane to help limit near-term global temperature rise and keep the Earth from overheating, reports AFP.

Fast action could also reduce losses of mountain glaciers and reduce projected warming in the Arctic over the coming decades by as much as two thirds.

Cutting out these pollutants now could also boost global food output and save millions of human lives lost to heart and lung disease, according to the report from the UN Environment Program (UNEP) and the World Meteorological Organization (WMO).

Close to 2.5 million premature deaths from outdoor air pollution could also be avoided annually by 2030, on average, with many of those lives saved in Asia, according to estimates.

Even as nations continue to be in a stalemate over climate change responsibilities, parallel action now on black carbon particles and ground-level ozone would buy precious time in limiting projected global temperature increases of 3.6 degrees Fahrenheit in the coming decades, the report said.

Record CO2 emissions in 2010 and rising atmospheric levels suggest that efforts to maintain the temperature cap may already be too late, according to climate scientists.

On the world’s current trajectories, temperatures are set to rise 2.3 degrees Fahrenheit by 2050. Add in the 1.6 F jump that has already occurred since human-induced warming came into play, and the total rise in worldwide average temperature would approach 4 degrees Fahrenheit.

Cutting out these “short-lived” climate forcers would have immediate benefits for climate, health and agriculture, the report says. This is because, unlike CO2, which remains in the atmosphere for centuries, black carbon only persists for a few days or weeks. But even though curbing black carbon and ground-level ozone could play a key role in limiting near-term climate change, cutting back CO2 levels remains crucial for the long-term temperature outlook.

“There are clear and concrete measures that can be undertaken to help protect the global climate in the short and medium term,” said Drew Shindell, a researcher at NASA’s Goddard Institute for Space Studies and one of the 50 scientists behind the new assessment.

“The win-win here for limiting climate change and improving air quality is self-evident and the ways to achieve it have become far clearer,” said Shindell.

The report was released in Bonn as the 190-nation UN Framework Convention on Climate Change (UNFCCC) struggles to move forward in the deadlocked climate negotiations.

Black carbon, found in soot, is a byproduct of incomplete burning of fossil fuels, wood and biomass. The most common offenders are auto emissions, primitive wood stoves, forest fires and industry.

Soot suspended in the air accelerates global warming by absorbing sunlight. When it covers snow and ice, the Sun’s radiation cannot be reflected back into space and is instead absorbed on Earth, speeding up the melting of mountain glaciers, ice sheets, and the Arctic ice cap.

Soot has also been linked to premature death from heart disease and lung cancer.

Ground-level ozone, a major ingredient in urban smog, is a powerful greenhouse gas and also a noxious air pollutant. It is formed from methane and other gases. Methane itself is a powerful driver of global warming.

“The science of short-lived climate forcers has evolved to a level of maturity that now requires … a robust policy response by nations,” said Achim Steiner, Executive Director of UNEP.

Recommended procedures for cutting black carbon include the use of mandatory diesel filters on vehicles, phasing out wood-burning stoves in wealthy countries, use of clean-burning biomass stoves for cooking and heating in developing nations, and a ban on the open burning of agricultural waste.

For cutting ground-level ozone, the report calls for curbing emissions from organic waste, requiring water treatment facilities to recover gas, reducing methane emissions from the coal and oil industries, and promoting anaerobic digestion of manure from cattle and pigs, both of which are huge methane sources.

Curbing ground-level ozone could also avoid losses in global maize, rice, soybean, and wheat production, the report concludes.

NIH Researchers Find New Clues About Aging

Genetic splicing mechanism triggers both premature aging syndrome and normal cellular aging

National Institutes of Health researchers have identified a new pathway that sets the clock for programmed aging in normal cells. The study provides insights about the interaction between a toxic protein called progerin and telomeres, which cap the ends of chromosomes like aglets, the plastic tips that bind the ends of shoelaces.

The study by researchers from the National Human Genome Research Institute (NHGRI) appears in the June 13, 2011 early online edition of the Journal of Clinical Investigation.

Telomeres wear away during cell division. When they degrade sufficiently, the cell stops dividing and dies. The researchers have found that short or dysfunctional telomeres activate production of progerin, which is associated with age-related cell damage. As the telomeres shorten, the cell produces more progerin.

Progerin is a mutated version of a normal cellular protein called lamin A, which is encoded by the normal LMNA gene. Lamin A helps to maintain the normal structure of a cell’s nucleus, the cellular repository of genetic information.

In 2003, NHGRI researchers discovered that a mutation in LMNA causes the rare premature aging condition progeria, formally known as Hutchinson-Gilford progeria syndrome. Progeria is an extremely rare disease in which children experience symptoms normally associated with advanced age, including hair loss, diminished subcutaneous fat, premature atherosclerosis and skeletal abnormalities. These children typically die from cardiovascular complications in their teens.

“Connecting this rare disease phenomenon and normal aging is bearing fruit in an important way,” said NIH Director Francis S. Collins, M.D., Ph.D., a senior author of the current paper. “This study highlights that valuable biological insights are gained by studying rare genetic disorders such as progeria. Our sense from the start was that progeria had a lot to teach us about the normal aging process and clues about more general biochemical and molecular mechanisms.”

Collins led the earlier discovery of the gene mutation responsible for progeria and subsequent advances at NIH in understanding the biochemical and molecular underpinnings of the disease.

In a 2007 study, NIH researchers showed that normal cells of healthy people can produce a small amount of progerin, the toxic protein, even when they do not carry the mutation. The more cell divisions the cell underwent, the shorter the telomeres and the greater the production of progerin. But a mystery remained: What was triggering the production of the toxic progerin protein?

The current study shows that the mutation that causes progeria strongly activates the splicing of lamin A to produce the toxic progerin protein, leading to all of the features of premature aging suffered by children with this disease. But modifications in the splicing of LMNA are also at play in the presence of the normal gene.

The research suggests that the shortening of telomeres during normal cell division in individuals with normal LMNA genes somehow alters the way a normal cell processes genetic information when turning it into a protein, a process called RNA splicing. To build proteins, RNA is transcribed from genetic instructions embedded in DNA. RNA does not carry all of the linear information embedded in the ribbon of DNA; rather, the cell splices together segments of genetic information called exons that contain the code for building proteins, and removes the intervening letters of unused genetic information called introns. This mechanism appears to be altered by telomere shortening, and affects protein production for multiple proteins that are important for cytoskeleton integrity. Most importantly, this alteration in RNA splicing affects the processing of the LMNA messenger RNA, leading to an accumulation of the toxic progerin protein.
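
As a purely conceptual picture of that keep-the-exons, drop-the-introns step (using an invented toy sequence, not data or methods from the study), splicing can be sketched as follows:

    # Conceptual toy model of RNA splicing: keep exons, drop introns.
    # The sequence below is invented for illustration only.
    transcript = [
        ("exon",   "ATGGCC"),
        ("intron", "GTAAGTCCAG"),
        ("exon",   "GGTACC"),
        ("intron", "GTATGAAG"),
        ("exon",   "TTTTAG"),
    ]
    mature_mrna = "".join(seq for kind, seq in transcript if kind == "exon")
    print(mature_mrna)  # -> ATGGCCGGTACCTTTTAG
    # Altered splicing, as with progerin, amounts to a change in which
    # segments are kept, yielding a different protein product.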

Cells age as part of the normal cell cycle process called senescence, which progressively advances through a limited number of divisions in the cell lifetime. “Telomere shortening during cellular senescence plays a causative role in activating progerin production and leads to extensive change in alternative splicing in multiple other genes,” said lead author Kan Cao, Ph.D., an assistant professor of cell biology and molecular genetics at the University of Maryland, College Park.

Telomerase is an enzyme that can extend the structure of telomeres so that cells continue to maintain the ability to divide. The study supplied support for the telomere-progerin link, showing that cells that have a perpetual supply of telomerase, known as immortalized cells, produce very little progerin RNA. Most cells of this kind are cancer cells, which do not reach a normal cell cycle end point, and instead replicate out of control.

The researchers also conducted laboratory tests on normal cells from healthy individuals using biochemical markers to indicate the occurrence of progerin-generating RNA splicing in cells. The cell donors ranged in age from 10 to 92 years. Regardless of age, cells that passed through many cell cycles had progressively higher progerin production. Normal cells that produce higher concentrations of progerin also displayed shortened and dysfunctional telomeres, the tell-tale indication of many cell divisions.

In addition to their focus on progerin, the researchers conducted the first systematic analysis across the genome of alternative splicing during cellular aging, considering which other protein products are affected by jumbled instructions as RNA molecules assemble proteins through splicing. Using laboratory techniques that analyze the order of chemical units of RNA, called nucleotides, the researchers found that splicing is altered by short telomeres, affecting lamin A and a number of other genes, including those that encode proteins that play a role in the structure of the cell.

The researchers suggest that telomere fraying and loss, combined with progerin production, together induce cell aging. This finding lends insight into how progerin may participate in the normal aging process.

Tool Developed To Predict Violence And Aggression In Children And Teens

Researchers hope to combine questionnaire with ‘biomarkers of aggression’ to better predict pediatric aggression and violence

Researchers at Cincinnati Children’s Hospital Medical Center have developed a tool to rapidly assess the risk of aggressive and violent behavior by children and adolescents hospitalized on psychiatric units. Ultimately, they hope to use the questionnaire to improve treatment and prevention of aggressive behavior in schools and in the community.

A study providing preliminary validation of the Brief Rating of Aggression by Children and Adolescents (BRACHA) tool is published online in the Journal of the American Academy of Psychiatry and the Law.

“Using the BRACHA could help hospitals cut down on violence,” says Drew Barzman, MD, a child and adolescent forensic psychiatrist at Cincinnati Children’s and lead author of the study.

The study involved 418 children and teens who had been hospitalized on psychiatric units at Cincinnati Children’s. Prior to hospitalization, they were evaluated in the emergency department by psychiatric social workers who administered the BRACHA questionnaire. A total of 292 aggressive acts were committed by 120 of the hospitalized patients (or 29 percent). Fourteen of the 16 items on the survey were significantly associated with aggression by children and teens.

The researchers expect to further validate the updated 14-item BRACHA questionnaire in a larger study of about 1,000 to 1,500 patients in their database.

“The BRACHA may ultimately help doctors improve safety in hospitals, reduce the use of seclusion and restraint in the inpatient setting and focus interventions on reducing aggression-related risk,” says Dr. Barzman. “The long-term goal is to prevent kids from going down a criminal path. If we can find high risk children before they become involved with the juvenile justice system, which is why we are studying 7 to 9 year olds, we can hopefully provide more effective treatment and prevention.”

The BRACHA study was funded by grants from the National Institutes of Health and Cincinnati Children’s.

Combining Questionnaire with New Research

Dr. Barzman and fellow researchers are also now examining two dozen 7- to 9-year-old psychiatric inpatients to determine whether levels of three hormones in their saliva (testosterone, cortisol and DHEAS), considered biomarkers of pediatric aggression, can be combined with the BRACHA questionnaire to better predict aggressive behavior in the hospital and also improve treatment and prevention outside hospital walls.

“In previously published studies, investigators linked levels of these hormones with levels and types of aggression and violence,” says Dr. Barzman. “We’re hoping our current salivary study, in conjunction with the BRACHA questionnaire findings, will provide even more meaningful results.”

Drinking, Cannabis Use And Psychological Distress Increase

The latest survey of Ontario adults from the Centre for Addiction and Mental Health (CAMH) shows increasing rates of daily drinking and cannabis use and high levels of psychological distress. The results of the 2009 CAMH Monitor survey, the longest running survey tracking mental health and addiction indicators among adults in Ontario, were published today.

Alcohol

The proportion of adults reporting daily drinking increased from 5.3% in 2002 to over 9% in 2009. The average number of drinks consumed weekly among drinkers has also increased from 3 drinks to 4.6 drinks, and the proportion of adults exceeding low-risk drinking guidelines remains at elevated levels (22%). However, there were also some encouraging findings: there was a significant decline in binge drinking from 12.6% in 2006 to 7.1% in 2009, and the decline was evident especially among young adults, from 24% to 11.5%.

Although driving within an hour of consuming two or more drinks has shown a steady decline in the past years, from 13.1% in 1996 to 6.9% in 2009, there is evidence that this trend has reversed among young adults. Driving after drinking posted a significant increase among 18 to 29 year olds, from 7.7% in 2005 to 12.8% in 2009.

“The data tell us that while the number of people who drink alcohol has not changed, the way they are drinking has — people are drinking more often and may be consuming more alcohol when they do drink, although there may be fewer binge occasions,” said Dr. Robert Mann, CAMH Senior Scientist and lead investigator on the study. “We know that the more access people have to alcohol, the more people will drink, leading to more instances of drinking and driving. Measures such as Random Breath Testing and lowering legal limits to .05% can reduce drunk driving deaths. The implementation of .05% legislation in British Columbia appears to have resulted in a 50% decrease in drinking and driving deaths in that province.”

Cannabis

The prevalence of cannabis use has steadily increased, from 8.7% in 1996 to 13.3% in 2009, among both men and women and across all age groups. Within this, there was almost a 2-fold increase in cannabis use among those aged 18-29, from 18.3% to 35.8%.

“These increases are of concern to us,” stated Dr. Mann. “We know that cannabis use may increase the risk of psychosis for people who are predisposed to schizophrenia, and may worsen the symptoms of other mental illnesses.” Another noticeable change was the large increase in use of cannabis among older adults. Use by those aged 50 years and older increased more than 3-fold from 1.4% to 4.7% between 1996 and 2009 and, among past year cannabis users, the proportion of users aged 50 years and older increased from 1.9% to 13.9% during the same time period.

Tobacco

Some positive findings are that the percentage of adult Ontarians reporting smoking cigarettes declined from 19.7% in 2008 to 18.6% in 2009. Though 14% of Ontarians still report daily smoking, it is a positive sign that cigarette smoking has steadily declined since 1996, from 26.8% to 18.6% in 2009. Dr. Mann notes that the provincial government’s commitment to anti-smoking legislation through Smoke-Free Ontario probably played an important role in the decrease in smoking rates.

Mental Health

One in seven Ontario adults (14.7%), representing 1,400,000 people, reported symptoms of elevated psychological distress, and almost 6% reported that their overall mental health was poor. Those aged 30-39 were the most likely to report poor mental health, and those over age 65 reported the lowest rates of poor mental health. Mental health was strongly correlated with education. Those who had not graduated high school reported higher levels of poor mental health, and those who had graduated from university reported lower rates. “These results suggest that the social determinants of health, such as income, play as important a role in mental health as they do in physical health,” said Mann.

The use of anti-anxiety medication has remained stable over the past few years, but trend data shows that over the past 10 years, use of these medications has risen from 4.5% to nearly 7% of Ontario adults. The same pattern can also be seen in the use of antidepressant medication, which has trended upward from 3.6% in 1999 to the current rate of 6.6%. Added Dr. Mann, “Though these are marked increases, they may also be showing that more people experiencing mental health problems are seeking and receiving help, which is a positive step.”

There were some regional differences, but no strong dominant pattern. Those from Northern Ontario were the most likely to be current smokers and to smoke daily; those from Toronto were the least likely to drink alcohol; those from the South West region of the province reported the highest average number of drinks consumed per week; and driving after drinking was most likely in the South West and in the Central South regions.

Children Who Live With Pets Less Likely To Develop Allergies

A new study suggests that children who live with dogs and cats are less likely to develop allergies to those animals later in life, but only if the pet is under the same roof while the child is still an infant.

The researchers found that, compared to babies born into cat-free homes, those who grew up with cats were roughly half as likely to be allergic to them as teenagers.

They found that growing up around a dog reduced the risk of dog allergies by about the same amount for boys, but not for girls.

Being exposed to pets anytime after the first year of life appeared to have no effect on allergy risks, which indicates that timing may be everything when it comes to trying to prevent allergies.

The researchers suspect that early exposure to pet allergens and pet-related bacteria strengthens the immune system, accustoms the body to allergens, and helps the child build up a natural immunity.

“Dirt is good,” lead researcher Ganesa Wegienka, MS, PhD, of the Department of Public Health Sciences, Henry Ford Hospital, said in a statement. “Your immune system, if it’s busy with exposures early on, stays away from the allergic immune profile.”

This is not the first study to find that having a household pet may protect kids from allergies.

David Nash, M.D., clinical director of allergy and immunology at the Children’s Hospital of Pittsburgh, said in a statement that previous studies have had mixed results, so it is too early to recommend getting a dog or cat to ward off allergies in an infant.

“In the end, we’ll probably find out that there are periods of opportunity when exposure to allergens, for some people, is going to have a protective effect,” Dr. Nash, who was not involved with the new study, said in a statement. “But we’re a long way from figuring out who it’s protective for and when that optimal period is.”

The team found that teenagers who lived with a cat during their first year of life had a 48 percent lower risk of cat allergy than their peers. Also, teen boys who lived with a dog had a 50 percent lower risk of dog allergy.

The authors say infant girls may not develop the same immunity as boys because they may interact differently with dogs than infant boys.

For the study, the researchers collected information from 566 children and their parents about the children’s exposure to indoor pets and their history of allergies.

The study is published in the journal Clinical & Experimental Allergy.

Babies Who Are Breastfed Less Likely To Die Of SIDS

According to a new study, babies who are breastfed are less likely to die of sudden infant death syndrome (SIDS).

The authors write in Pediatrics that other explanations seem unlikely.

“Breastfeeding is the best method of feeding infants,” Dr. Fern Hauck, the study’s lead author from the University of Virginia School of Medicine in Charlottesville, said in a statement to Reuters Health.

SIDS is defined as a sudden and unexplained death in a baby less than a year old. According to the National Institutes of Health, it is most common in infants between two and four months old and kills about 2,500 infants in the U.S. each year.

Researchers are not sure what causes SIDS, but they do know African American and male babies are more likely to die from SIDS.

Hauck said one theory about the cause of SIDS is that it happens in babies who sleep face down or with their heads covered, who do not turn their heads or cry like most babies would, and who slowly suffocate.

The authors said breastfeeding could be linked to SIDS because it protects infants against minor infections that have also been shown to make sudden death more likely.

The World Health Organization recommends that mothers breastfeed their babies for the first six months of life.

Hauck and her colleagues analyzed data from 18 studies in which mothers of infants who had or had not died of SIDS were asked whether they had breastfed their infants.

The researchers found the rate of SIDS was 60 percent lower among infants who had any amount of breastfeeding compared to those who did not breastfeed.

However, they said more research is needed to see if the duration of breastfeeding affects the risk of SIDS.

The analysis does not definitively show that there is a cause and effect relationship between breastfeeding and SIDS risk. 

“We found a protective effect even after controlling for factors that could explain the association,” Hauck said in a statement.

She said babies who sleep in the same room as their parents and those who use a pacifier while sleeping also have a smaller risk of sudden death.

The authors said the findings underscore the importance of promoting the positive effects of breastfeeding for both moms and babies.

Parents Give Their Children 60 New Genome Mutations

Researchers have discovered that parents give their children about 60 new genome mutations.

The new value is reported in the first-ever direct study of new mutations coming from mother and father.

Researchers directly measured the number of mutations in two families, using whole genome sequences from the 1000 Genomes Project. The results also show that human genomes, like all genomes, are changed by the forces of mutation.

New mutations are the ultimate source from which new variation is drawn. 

Professor Philip Awadalla, from the University of Montreal and co-leader for the project, said in a statement: “Today, we have been able to test previous theories through new developments in experimental technologies and our analytical algorithms. This has allowed us to find these new mutations, which are like very small needles in a very large haystack.”

The team wrote that finding new mutations is extremely technically challenging because only 1 in every 100 million letters of DNA is altered each generation.
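
That per-letter rate is consistent with the figure of roughly 60 new mutations per child. Here is a back-of-the-envelope check, assuming a diploid genome of about six billion letters (an approximation, not a figure from the study):

    # Back-of-the-envelope consistency check. The diploid genome size is an
    # approximation; the study measured mutations directly.
    per_letter_rate = 1 / 100_000_000        # ~1 altered letter per 100 million
    diploid_genome_letters = 6_000_000_000   # ~6 billion letters, both copies
    expected_new_mutations = per_letter_rate * diploid_genome_letters
    print(expected_new_mutations)            # -> ~60 per generation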

“We human geneticists have theorized that mutation rates might be different between the sexes or between people,” Dr Matt Hurles, Senior Group Leader at the Wellcome Trust Sanger Institute, said in a statement. “We know now that, in some families, most mutations might arise from the mother, in others most will arise from the father.

“This is a surprise: many people expected that in all families most mutations would come from the father, due to the additional number of times that the genome needs to be copied to make a sperm, as opposed to an egg.”

The findings come from a study of two families, each consisting of both parents and one child. The researchers looked for new mutations present in the DNA from the children that were absent from the parents’ genomes. They also examined about 6,000 possible mutations in the genome sequences.

The team sorted the mutations into those that occurred during the production of sperm or eggs of the parents and those that may have occurred during the life of a child. 

According to the findings, in one family 92 percent of the mutations derived from the father, whereas only 36 percent of the mutations derived from the father in the other family.

The number of mutations passed on from a parent to a child varied between parents by as much as tenfold. 

A person with a high natural mutation rate might be at greater risk of misdiagnosis of a genetic disease because the samples used for diagnosis might contain mutations that are not present in other cells in their body.

Reference: Conrad DF et al. (2011) Variation in genome-wide mutation rates within and between human families. Nature Genetics, published online 12 June 2011. doi:10.1038/ng.856

Formaldehyde Listed As Cancer Causing Agent

The US Department of Health and Human Services (HHS) has added eight more commonly used substances to the National Toxicology Program’s (NTP) official “Report on Carcinogens,” after health officials said they may put people at an increased risk of developing cancer.

HHS on Friday added the industrial chemical formaldehyde and a botanical substance known as aristolochic acids to that list. Two other compounds, certain glass wool fibers and styrene (used in Styrofoam), were added to the list as substances “reasonably anticipated to be human carcinogens.” The list also included captafol, cobalt-tungsten carbide, o-nitrotoluene and riddelliine.

Formaldehyde had already been listed in a previous report, but has now been upgraded to a “known carcinogen,” as it has been found to cause nasal cancer in rats. Formaldehyde is found in a wide range of products including plastics, synthetic fibers and textile finishes.

In a report prepared for the Secretary of the HHS, researchers working for the National Institutes of Health (NIH) warned that people with higher exposure to formaldehyde were more at risk for nasopharyngeal cancer, myeloid leukemia and other types of cancers.

Aristolochic acid may cause cancer of the urinary tract and permanent kidney failure, the report stated. It is commonly used in traditional Chinese herbal remedies and in some weight-loss herbal supplements.

The government says that styrene is a component of tobacco smoke, and NIH says the greatest exposure to styrene comes from smoking cigarettes.

The American Chemistry Council (ACC) lashed out against the report, saying it was concerned that politics may have corrupted the scientific process.

“Today’s report by HHS made unfounded classifications of both formaldehyde and styrene and will unnecessarily alarm consumers,” Cal Dooley, president and CEO of the ACC, told Reuters in a statement.

Jennifer Sass of the Natural Resources Defense Council, a U.S. environmental group, applauded the government for releasing the report in the face of what she described as pressure from chemical companies to prevent it from being publicized.

“The chemical industry fought the truth, the science, and the public — but, in the end our government experts came through for us, giving the public accurate information about the health risks from chemicals that are commonly found in our homes, schools, and workplaces,” Sass wrote in a blog.

A warning issued by the US Food and Drug Administration a decade ago advised consumers to discontinue using botanical products containing aristolochic acids, but such products are still widely available on the Internet and abroad.

The report noted, however, that the listed substances do not always cause cancer; risk depends mainly on the length and type of exposure and on a person’s genetic makeup. The American Cancer Society estimates that only about 6 percent of cancers are related to environmental causes, most of those involving on-the-job exposure.

The report is available at ntp.niehs.nih.gov/go/roc12.

Scientist Says Global Warming Is Now Significant

Phil Jones, the United Kingdom scientist who was targeted in the “ClimateGate” affair, now says global warming since 1995 is statistically significant, a year after telling BBC News that post-1995 warming was not significant.

He noted that a year’s worth of additional data had pushed the trend past the threshold typically used to assess whether trends are in fact “real.” Jones said this shows the importance of using longer records for analysis.

Scientists generally use a minimum threshold of 95 percent to assess whether a trend is likely to be down to an underlying cause, rather than emerging by chance. If a trend meets the threshold, it generally means that the odds of it being down to chance are less than one in twenty.

Jones said last year’s analysis, which covered the period up to 2009, did not reach this threshold, but adding data for 2010 puts it over the line.

“The trend over the period 1995-2009 was significant at the 90% level, but wasn’t significant at the standard 95% level that people use,” Jones told BBC News.

“Basically what’s changed is one more year [of data],” he noted. “That period 1995-2009 was just 15 years – and because of the uncertainty in estimating trends over short periods, an extra year has made that trend significant at the 95% level which is the traditional threshold that statisticians have used for many years.”

“It just shows the difficulty of achieving significance with a short time series, and that’s why longer series – 20 or 30 years – would be a much better way of estimating trends and getting significance on a consistent basis,” he added.
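The statistics here can be illustrated with a short sketch. The Python example below fits a linear trend to synthetic annual anomalies, not the actual HadCRUT3 data, and reports the p-value for each period; the underlying slope is chosen to roughly match the 0.19C warming figure mentioned below, and exact p-values will vary with the data:

```python
# Minimal sketch of the trend-significance test Jones describes, using
# synthetic annual temperature anomalies (illustrative values only, not
# the HadCRUT3 record). scipy's linregress gives the two-sided p-value
# for the null hypothesis of zero trend; "significant at the 95% level"
# corresponds to p < 0.05.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
years = np.arange(1995, 2011)                 # 1995..2010 inclusive
anomalies = 0.012 * (years - 1995) + rng.normal(0.0, 0.08, years.size)

for end_year in (2009, 2010):
    mask = years <= end_year
    result = linregress(years[mask], anomalies[mask])
    verdict = "significant" if result.pvalue < 0.05 else "not significant"
    print(f"1995-{end_year}: trend {result.slope:.3f} C/yr, "
          f"p = {result.pvalue:.3f} ({verdict} at the 95% level)")
# With only 15 or 16 noisy points, a single extra year of data can move
# the p-value across the 0.05 threshold, exactly as Jones describes.
```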

Jones’ 2010 comment in that BBC News interview is routinely, and inaccurately, quoted as demonstrating that the Earth’s surface temperature is not rising.

HadCRUT3, one of the main global temperature records used by bodies such as the Intergovernmental Panel on Climate Change (IPCC), is compiled jointly by the Climatic Research Unit (CRU) at the University of East Anglia (UEA), where Jones is based, and the UK Met Office.

HadCRUT3, which Jones helps compile, shows warming of 0.19°C over 1995–2010, consistent with the other major records, which all use slightly different methods of analyzing the data to compensate for issues such as the dearth of measuring stations in polar regions.

Shortly before the Copenhagen Climate Summit in December 2009, Jones found himself at the center of the affair that came to be known as “ClimateGate,” which saw the release of more than 1,000 emails taken from a CRU server.

Many critics alleged that the emails showed CRU scientists attempting to undermine the usual processes of science and manipulating data in order to make global warming seem much worse. Subsequent inquiries found that the scientists did fall short of best practice in some areas, but found no evidence of manipulation.

Since then, nothing has emerged through mainstream science to challenge the IPCC’s picture of a world warming through greenhouse gas emissions. And a new initiative to develop a global temperature record, based at the University of California, Berkeley, and funded in part by “climate skeptical” organizations, has reached early conclusions that closely match the established records.


Carbon Release, Global Warming Now And In The Ancient Past

The present rate of greenhouse carbon dioxide emissions through fossil fuel burning is higher than that associated with an ancient episode of severe global warming, according to new research. The findings are published online this week by the journal Nature Geoscience.

Around 55.9 million years ago, the Earth experienced a period of intense global warming known as the Palaeocene–Eocene Thermal Maximum (PETM), which lasted for around 170,000 years. During its main phase, average annual temperatures rose by around 5°C.

Scientists believe the warming may have been triggered initially by an event such as the baking of organic-rich sediments by igneous activity, which released the potent greenhouse gas methane. This initial temperature increase warmed ocean bottom waters, allowing the breakdown of gas hydrates (clathrates) found within deep-ocean sediments; this would have greatly amplified the initial warming by releasing vast additional volumes of methane. As the methane diffused from the seawater into the atmosphere, it would have been oxidized to form carbon dioxide, another potent and longer-lived greenhouse gas.

Adam Charles and his PhD supervisor, Dr Ian Harding, both palaeoceanographers at the University of Southampton’s School of Ocean and Earth Science (SOES) based at the National Oceanography Centre, Southampton, co-authored the report. Dr Harding said: “The PETM has been seen by many as a natural test bed for understanding modern man-made global warming, despite it not being a perfect analogy.  However, the total amount of carbon released during this climatic perturbation and its rate of release have been unclear.”

To help fill this gap in knowledge, the researchers measured carbon isotope ratios of marine organic matter preserved in sediments collected in Spitsbergen. The sedimentary section is important because it records the entirety of the PETM, from its initiation through the recovery period, and as such is the most complete record of the warming event so far known from high northern latitudes.

Based on their carbon isotope measurements and computer simulations of the Earth system, the researchers estimated that the rate of carbon emissions during the PETM peaked at between 300 million and 1,700 million metric tons per year, which is much slower than the present carbon emission rate.
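The comparison with today’s emissions is simple arithmetic. In the sketch below, the PETM range comes from the study, while the modern figure, roughly 9 billion metric tons of carbon per year around 2010, is an outside approximation and not a number from this paper:

```python
# Rough comparison of the study's estimated PETM peak carbon release
# with the approximate modern fossil-fuel emission rate.

PETM_PEAK_LOW = 300e6    # metric tons of carbon per year (from the study)
PETM_PEAK_HIGH = 1700e6  # metric tons of carbon per year (from the study)
MODERN_RATE = 9e9        # metric tons of carbon per year, ~2010 (assumed)

print(f"Modern emissions are roughly {MODERN_RATE / PETM_PEAK_HIGH:.0f}x to "
      f"{MODERN_RATE / PETM_PEAK_LOW:.0f}x the PETM peak rate.")
# -> about 5x to 30x faster than even the highest PETM estimate
```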

“Our findings suggest that humankind may be causing atmospheric carbon dioxide to increase at rates never previously seen on Earth, which would suggest that current temperatures will potentially rise much faster than they did during the PETM,” concluded Dr Harding.

The authors of the report published by Nature Geoscience are Ying Cui, Lee Kump, Christopher Junium, Aaron Diefendorf, Katherine Freeman and Nathan Urban (Pennsylvania State University), Andy Ridgwell (University of Bristol), and Adam Charles and Ian Harding (SOES). This research was supported by the Worldwide Universities Network, Pennsylvania State University, and the US National Science Foundation.

Cui, Y., Kump, L. R., Ridgwell, A. J., Charles, A. J., Junium, C. K., Diefendorf, A. F., Freeman, K. H., Urban, N. M. & Harding, I. C. Slow release of fossil carbon during the Palaeocene–Eocene Thermal Maximum. Nature Geoscience, published online 5 June 2011. doi:10.1038/ngeo1179

Image Caption: Core shed in Spitsbergen


Dangerous And Under The Radar

New study from Concordia and University of Windsor examines ways to protect sex workers

Sex work is unprotected, increasingly dangerous and needs to be decriminalized, according to a new report published in the Canadian Review of Sociology. Co-authored by Concordia University and University of Windsor researchers, the study calls for sweeping changes to sex work performed on and off the streets.

“We must not only change our laws, we must also revamp our attitudes and implement policies that protect the social, physical and psychological rights of sex workers,” says first author Frances Shaver, chair and professor in Concordia’s Department of Sociology and Anthropology. “Regardless of where and how they conduct their business, sex workers are left on their own to ensure their health and safety on the job.”

Along with colleagues Jacqueline Lewis and Eleanor Maticka-Tyndale, from the University of Windsor, Shaver compiled data from over 450 interviews conducted with sex workers. The team also gathered intelligence from 40 law enforcement officials and public health advocates on the perils of the trade. “Even when victimized by others, sex workers are not afforded the rights of protection and redress that any other person in Canada can expect,” Shaver observes.

Marginalized and denied protection

In 2007, sex workers launched legal challenges in the Ontario and British Columbia Superior Courts against sections of the Canadian Criminal Code. They sued, claiming federal laws put them at higher risk, intensified their marginalization and violated the Charter of Rights and Freedoms. While a ruling in the B.C. case is pending, the Ontario court agreed the provisions in the Canadian Criminal Code deny sex workers protection and resources to ensure their well-being.

“Sex workers are out of sight and out of mind,” Shaver says, noting that assaults on them include rape, gay bashing, robbery and harassment. “They’ve been pushed into industrialized or isolated neighborhoods, where lighting, access to public places and even people are sparse.”

Shaver says the 2010 Ontario ruling brought to light issues most people never consider. “The public needs to be educated on this industry. Canadians generally don’t know much about sex workers and that’s created unwarranted fears,” she says. “What little is known comes from media reports on crises, such as underage girls forced into sex rings. The reality is only a small number are in crisis.”

The vast majority of sex workers are consenting adults who enter the field in order to pay their bills. “Most get into the business because they know someone who knows someone,” says Shaver. “It’s rare that boyfriends force girlfriends into sex work.”

Most sex work conducted off streets

By most estimates, only 10 to 20 per cent of sex workers solicit clients on the street. The majority, 80 to 90 per cent, work from home, brothels and private establishments such as escort agencies, strip clubs or massage parlors.

That’s why federal laws need to be amended. “Sharing and referring clients to each other makes the world safer for a sex worker but both involve procuring,” she says, adding home-based practice is illegal, too. “That’s considered operating a bawdy house. Indoor sex work is safer yet it involves breaking our current laws if the location is fixed or shared with others.”

New Zealand decriminalized its sex industry without negative consequences, although Shaver cautions against simply adopting that model. “You can’t just pick policy from another country and move it in,” she says. “It has to be developed as it was in New Zealand: in consultation with all stakeholders including sex workers, the ministry of health, other government organizations, police and citizens.”

As for the number of sex workers who operate in Canada, no figures have ever been put forward. “It’s hard to know just how many sex workers there are across the country, since many work under the radar,” Shaver says. “But one thing’s for certain: until new rules are in place, it will continue to be dangerous under the radar.”
