What Makes Patients Complex? Ask Their Primary Care Physicians

Physician-defined patient complexity differs from current diagnosis-based measures

As Americans live longer with multiple medical conditions, managing their care is becoming increasingly challenging. Being able to define and measure patient complexity has important implications for how care is organized, how physicians and health care systems are paid, and how resources are allocated. In an article in the Dec. 20 Annals of Internal Medicine, a team of Massachusetts General Hospital (MGH) researchers report finding that primary care physicians define patient complexity using a broader range of factors — including mental health, social factors and financial issues — than do commonly used approaches based only on diagnoses and prior costs.

“Simply counting the number of co-morbid conditions does not really capture whether a patient is complex,” explains Richard W. Grant, MD, MPH, the paper’s lead author. “All primary care physicians can point to patients of theirs with very complicated medical histories who are relatively straightforward to manage, whereas other patients can be a real challenge despite relatively few medical diagnoses. Our results emphasize the importance of social and behavioral contexts that can create important barriers to delivering high-quality primary care.”

The study enrolled 40 primary care physicians from 12 MGH-affiliated practices and community health centers. Participating physicians used a web-based tool to review a list of 120 of their own patients and indicated those who, in their view, were complex. For those complex patients, they were asked to indicate which of five domains — medical decision-making, coordinating care, mental health or substance abuse problems, health-related behaviors, and social or economic circumstances — were involved in that determination.

The authors found that primary care physicians designated about one-quarter of their patients as complex — with older, more experienced physicians and those working in community health centers reporting higher proportions of complex patients. Compared to non-complex patients, complex patients were older, more often women, and had more clinic visits to many different providers. Complex patients were also prescribed more medicines — including prescriptions for anti-psychotic medicines — were more likely to miss appointments, and were more likely to live in neighborhoods with lower income and educational levels. The authors then found that the results of physician assessment differed substantially from those of other common methods for assessing complexity.

“Managing complex patients requires greater clinician effort, increased health care resources, and substantial family and community support,” says Grant, who recently joined the Division of Research at Kaiser Permanente Northern California (http://www.dor.kaiser.org). “In order to redesign our health care systems to more effectively care for complex patients, we need a better handle on exactly who they are. By asking primary care physicians about their experiences with their own patients in a systematic and quantitative way, we were able to bring out the importance of social and behavioral factors, in addition to specific medical problems. This work may help guide efforts to redesign health care systems so that we can deliver high quality, cost-effective care tailored to individual patient needs.”

Fake-Tan Lotions Influence Women To Cut UV Exposure

According to a new study, 40 percent of women who use fake-tan lotions cut back on the time they spend exposed to the sun.
The study found that such products may be a way to convince women seeking a tan to reduce their ultraviolet (UV) radiation exposure, thus reducing their risk of skin cancer.
“The message I give is, your natural skin color is where you were born to be, but if you really want to be tan get it out of a bottle,” Suephy Chen, a dermatologist at Emory University School of Medicine in Atlanta, told Reuters. “Getting a tan out of a bottle is incredibly safe, whereas getting tanned from tanning beds and lying out is not.”
The team surveyed 415 women living in or around the Emory University campus, aged 18 to 71, with an average age of 26.
They asked the participants how often they tanned outside or in tanning beds, or used fake-tan lotions.
About half of the respondents said they had used sunless tanning lotions, gels or spray-ons in the past year, while at least 70 percent reported tanning in the sun.  A quarter of the respondents said they had recently used a tanning bed.
While women who used tanning products said they were also more likely to seek other types of tans, about 40 percent said they had decreased their sun exposure or tanning bed use.
The researchers said that the top reasons for using sunless tanning products were safety and avoiding wrinkles.
About 93 percent of the women said they believe tanned skin is more attractive than pale skin, and about 79 percent said they felt better about themselves when tan.
The World Health Organization said two years ago that tanning beds were in the top cancer risk category and are “carcinogenic to humans.”
The U.S. National Cancer Institute estimated that there were over a million new cases of non-melanoma skin cancer in the U.S. in 2010, and fewer than 1,000 deaths.  The institute also said that 2011 was expected to see about 70,000 new cases and close to 9,000 deaths.
The research was published in the December 19 online edition of the Archives of Dermatology.

New Hormone Pill Has Potential To Help Women With Menopause

Researchers say that a hormone pill may help women through menopause and give their sex lives a boost.

They said they found the first robust evidence that low doses of a hormone called DHEA can help sexual function and menopausal symptoms.

The team said that larger studies are still needed in order to help confirm their preliminary findings.

“We must bear in mind that this is a pilot study with a small sample,” Anna Fenton, co-editor of the journal Climacteric, said in a statement. “We can’t yet say that this study means that DHEA is a viable alternative to HRT, but we should be looking to do larger studies to confirm these initial results.”

DHEA is a natural steroid hormone made in the adrenal glands and has been shown to have a variety of functions in therapy.

A combination of the hormones estrogen and progesterone (HRT) is currently the approved treatment for women going through the menopause.

Sales of the drug have dropped since a study in 2002 found higher rates of ovarian cancer, breast cancer and strokes in women who took the pills.

Researchers said in January that the antidepressant Lexapro, like some other antidepressants, significantly cut the number and severity of hot flashes in menopausal women.

For the current study, a team of researchers led by Andrea Genazzani of the University of Pisa followed a group of 48 post-menopausal women with symptoms.

They had 12 women take DHEA, while 12 others took vitamin D and calcium, 12 took the standard HRT, and 12 took a synthetic steroid known as tibolone.

The women’s menopausal symptoms and sexual function, including sexual interest and activity, were measured using a standard questionnaire.

The researchers asked the women questions regarding sexual satisfaction and the frequency in which they had sex.

After 12 months, the women on hormone treatments had improvements in menopausal symptoms, but those taking vitamin D and calcium did not show any significant improvement.

All the groups at the start of the trial had similar sexual activity, but after a year the team found that those who took calcium and vitamin D scored lower on the questionnaire scale.

The researchers said that the results for the HRT group were similar, and both the HRT and DHEA groups reported a higher frequency of sexual intercourse compared with the control group.

Genazzani said that the initial results from the study show that DHEA has potential.

“But this is a small study, a proof of concept. What we need to do now is to look at a larger study, to confirm that these initial results are valid,” she said in a statement.

The study was published in the Journal of the International Menopause Society, Climacteric.

Reproductive Disorder Linked To Increased Risk Of Inflammatory Bowel Disease

Increased risk of inflammatory bowel disease in women with endometriosis: a nationwide Danish cohort study

Women with endometriosis are up to twice as likely to develop inflammatory bowel disease as those without this reproductive disorder, suggests a large study published online in Gut.

And the effect can last for up to 20 years after their diagnosis of endometriosis, a condition in which cells from the womb lining implant in other areas of the body.

Endometriosis is relatively common and is thought to affect as many as one in 10 women during their childbearing years.

The researchers tracked the long term health of more than 37,000 Danish women who had been admitted to hospital with endometriosis between 1977 and 2007.

During a monitoring period, which lasted an average of 13 years, 320 of these women developed inflammatory bowel disease: 228 ulcerative colitis and 92 Crohn’s disease. These figures equate to a 50% increased risk of developing inflammatory bowel disease compared with women in the general population.

When the analysis was restricted to women whose endometriosis was verified surgically, the risk increased further to 80%, and the effect persisted for more than 20 years after the initial diagnosis.

The average time lag between a diagnosis of endometriosis and the development of inflammatory bowel disease was around 10 years.

Both inflammatory bowel disease and endometriosis are chronic inflammatory disorders, which typically begin in young adulthood, affect the bowel, and cause abdominal pain.

The authors conclude that the explanation for the link between the two disorders may lie in common causes or the effects of treatment for endometriosis.

Some research suggests that long term use of oral contraceptives, which are frequently used to treat endometriosis, may increase the risk of developing inflammatory bowel disease.

New Insight Into Why Locusts Swarm

Protein associated with learning implicated in causing grasshoppers to swarm

New research has found that a protein associated with learning and memory plays an integral role in changing the behavior of locusts from that of harmless grasshoppers into swarming pests.

The desert locust is a species of grasshopper that has evolved a Jekyll-and-Hyde disposition to survive in its harsh environment. In their solitary phase, locusts avoid one another and occur at very low density. When the sporadic rains arrive and food is more plentiful, their numbers increase.

However, as the rains cease the locusts are driven onto dwindling patches of vegetation. This forced proximity to other locusts causes a little-understood transformation into their ‘gregarious phase’: they rapidly become very mobile, actively seek the company of other locusts, and thus form huge swarms that sweep the landscape in their search for food.

The new research, led by Dr Swidbert Ott from the University of Cambridge in collaboration with the University of Leuven, explored the role of a specific signaling protein in the locusts’ brain, known as Protein Kinase A, in this transition. They found that this protein, which is typically associated with learning in other animals, has been co-opted to control the transition from solitary to gregarious behavior in locusts.

They hypothesize that the process whereby locusts ‘remember’ the experience of crowding and modify their behavior resembles learning. The ‘learning’ protein acts as a molecular switch in a social feedback loop, because gregarious behavior ensures that crowding is maintained. The new results indicate that the biochemical mechanism that triggers locust swarming is similar to what enables humans and other animals to respond to social change.

Dr Ott added: “Learning is when you change your behavior in the light of new experience, and this is what a locust needs to do when it gets caught up in the crowd. What is amazing is that the parallels don’t just end there, they extend to the specific proteins that bring about the behavioral changes.”

Desert locusts (Schistocerca gregaria) are among the most devastating insect pests, affecting 20% of the world’s land surface through periodic swarms containing billions of locusts and stretching over many square kilometers. Different species of locust continue to inflict severe economic hardship on large parts of Africa and China. In November 2008, swarms six kilometers long plagued Australia.

The research will be published this week in the journal PNAS.

Researchers Say Sunlight May Curb Spread Of Chickenpox

According to a study published this week by scientists at the University of London, children who soak up more rays from the sun may actually be less likely to spread the virus associated with chickenpox.

The results of the study were published in the scientific journal Virology and indicated that the UV rays in sunlight may serve to ‘deactivate’ viruses on the surface of the skin, thus making it more difficult for the virus to spread to new hosts.

Researchers compared various geographic regions and found that those with higher levels of sunlight also had a lower incidence of the virus, which predominantly affects children between the ages of 4 and 10.

Scientists have insisted, however, that the role of sunlight in curbing the spread of the virus is still largely speculative, as other factors such as relative temperature and humidity could also play a currently unknown role in the virus’ epidemiology.

The varicella-zoster virus, which causes chickenpox, is airborne and can easily spread through coughing or sneezing, particularly during the first 2-3 days of infection. The most common source of infection, however, is contact with the red blisters and spots that are most commonly associated with the illness. An infected individual is already contagious one to two days before the rash appears and will typically remain infectious until all of the red blisters have formed a crust.

Researchers have long been aware that UV light has a repressive effect on a variety of different virus types, and the University of London’s Dr. Phil Rice suspects that this is the key to understanding the lower rate of chickenpox infections in tropical countries. Dr. Rice also points out that his theory is further corroborated by the increased incidence of the virus during winter months, when people spend less time outside and tend to wear more clothing.

Rice’s research team collected and examined data from some 25 years’ worth of chickenpox studies conducted all around the world. After plotting the data in various graphs and looking for patterns, the scientists say their attention was drawn to a very obvious correlation between the prevalence of the varicella-zoster virus and levels of UV exposure.

The connection between sunlight and infection levels had been essentially sitting right under researchers’ noses for years, said Rice.

“No-one had considered UV as a factor before, but when I looked at the epidemiological studies they showed a good correlation between global latitude and the presence of the virus.”

Still, Professor Judy Breuer of nearby University College London remains cautious, noting that while UV may indeed play a significant role, it is likely that there are a number of other factors which figure into the disparate infection rates seen in tropical regions compared to their northern neighbors.

“Lots of things aside from UV could affect it,” Breuer told BBC News, such as “heat, humidity and social factors such as overcrowding. It’s quite possible that UV is having an effect, but we don’t have any firm evidence showing the extent this is happening.”

Regardless of the extent of the role played by sunlight in the virus’ epidemiology, Rice’s team has provided researchers in the field with a focal point for years to come.

What Makes A Drunk Become Aggressive?

Drinking enough alcohol to become intoxicated increases aggression significantly in people who have one particular personality trait, according to new research.
But people without that trait don’t get any more aggressive when drunk than they would when they’re sober.
That trait is the ability to consider the future consequences of current actions.
“People who focus on the here and now, without thinking about the impact on the future, are more aggressive than others when they are sober, but the effect is magnified greatly when they’re drunk,” said Brad Bushman, lead author of the study and professor of communication and psychology at Ohio State University.
“If you carefully consider the consequences of your actions, it is unlikely getting drunk is going to make you any more aggressive than you usually are.”
Peter Giancola, professor of psychology at the University of Kentucky, co-authored the paper with Bushman and led the experiments used in the study.  Other co-authors were Dominic Parrott, associate professor of psychology at Georgia State University, and Robert Roth, associate professor of psychiatry at Dartmouth Medical School.  Their results appear online in the Journal of Experimental Social Psychology and will be published in a future print edition.
Bushman said it makes sense that alcohol would make present-focused people more aggressive.
“Alcohol has a myopic effect — it narrows your attention to what is important to you right now.  That may be dangerous to someone who already has that tendency to ignore the future consequences of their actions and who is placed in a hostile situation.”
The study involved 495 adults, with an average age of 23, all social drinkers.  Before taking part, participants were screened for any past or present drug, alcohol or psychiatric problems.  Women were tested to ensure they weren’t pregnant.
All participants completed the “Consideration of Future Consequences scale.”  They indicated how much they agreed with statements like “I only act to satisfy immediate concerns, figuring the future will take care of itself.”  Scores on this measure determined how much participants were present-focused or future-focused.
Half the participants were put in the alcohol group, where they received alcohol mixed with orange juice at a 1:5 ratio.  The other half were given orange juice with just a tiny bit of alcohol.  The rims of the glasses were also sprayed with alcohol so that they thought they were consuming a full alcoholic beverage.
Participants in the alcohol group had a mean blood alcohol level of 0.095 just before aggression was measured and 0.105 following, meaning they were legally drunk and that their alcohol levels were rising during the measurement of their aggressive behavior.
Those in the placebo group had mean blood alcohol levels that didn’t exceed 0.015, meaning they had very little alcohol in their systems and were well below standards of intoxication.
The aggression measure used in this study was developed in 1967 to test aggressiveness through the use of harmless but somewhat painful electric shocks.  Before the experiment began, the researchers measured each participant’s pain threshold for the shocks to ensure that no one received a shock exceeding what they could tolerate.
Each of the participants was told that he or she was competing with a same-sex opponent in a computer-based speed reaction test, with the winner delivering an electrical shock to the loser.  The winner determined the intensity and the length of the shock delivered to the loser.
In actuality, there was no opponent.  There were 34 trials, and the participant “won” half of them (randomly determined).  Each time they “lost,” the participants received electric shocks that increased in length and intensity over the course of the trials, and the researchers measured if they retaliated in kind.
“The participants were led to believe they were dealing with a real jerk who got more and more nasty as the experiment continued,” Bushman said.  “We tried to mimic what happens in real life, in that the aggression escalated as time went on.”
Results were clear, Bushman said.
“The less people thought about the future, the more likely they were to retaliate, but especially when they were drunk.  People who were present-focused and drunk shocked their opponents longer and harder than anyone else in the study,” he said.
“Alcohol didn´t have much effect on the aggressiveness of people who were future-focused.”
Men were more aggressive than women overall, but the effects of alcohol and personality were similar in both sexes.  In other words, women who were present-focused were still much more aggressive when drunk than were women who were future-focused, just like men.
Bushman said the results should serve as a warning to people who live only in the moment without thinking too much about the future.
“If you’re that kind of person, you really should watch your drinking.  Combining alcohol with a focus on the present can be a recipe for disaster.”
The study was supported by grants from the National Institute on Alcohol Abuse and Alcoholism and from the National Center for Research Resources.

Scientists Call For New Penicillin Dosing Guidelines For Children

Scientists and clinicians are saying that there needs to be a review of penicillin dosing guidelines for children.

Current guidelines for penicillin use in children have remained unchanged for nearly 50 years.

A new study published in the British Medical Journal found that some children may not be receiving effective doses, which could potentially lead to failed treatment and contribute to antibiotic resistance.

Oral penicillins account for about 4.5 million of the roughly 6 million prescriptions for antibiotics written each year to treat childhood bacterial infections.

Current UK dosing guidelines for penicillin are provided by the British National Formulary for Children (BNFC) and are mainly based on ages.

The doses recommended have not changed in almost 50 years, and the guidelines do not take into account the increase in the average weight of children over time.

Experts say reviewing these guidelines is essential to ensure that all children who require penicillin receive effective doses.

Researchers found that the age guidelines set in 1963 were accompanied by average weights, and that doses were based on fractions of the widely used adult doses.

The dose bands are not based on current weights, which are up 20 percent compared with those of 1963.  Under-dosing a child could lead to sub-therapeutic drug concentrations.

Researchers also said that adult penicillin recommendations have been re-evaluated to take modern weights into consideration.

“We were surprised at the lack of evidence to support the current oral penicillins dosing recommendations for children, as it is such a commonly used drug,” Dr Paul Long, Senior Lecturer in Pharmacognosy at King’s College London, said in a press release.

“Children’s average size and weight are slowly but significantly changing, so what may have been adequate doses of penicillin 50 years ago are potentially not enough today.”

Endophenotype Strategies For The Study Of Neuropsychiatric Disorders

The identification of genes that contribute to susceptibility to complex neuropsychiatric disorders such as schizophrenia, major depression and bipolar disorder has not been very successful using conventional genetic approaches. Several problems are associated with the conventional approach, including the heterogeneity of neuropsychiatric disorders and the fact that gene carriers cannot be identified in the absence of manifest symptoms. A direction that appears encouraging is the identification of neurobiological or neurobehavioral characteristics associated with these disorders, known as endophenotypes, that may be more closely linked to gene expression. An endophenotype is a biomarker associated with both the genetic components and the clinical symptoms of a neuropsychiatric disorder, and it plays an important role in bridging the gap between the microscopic and macroscopic levels of these disorders. The identification of endophenotypes, along with advanced genetic analyses such as genome-wide association studies, is crucial to the identification of genes that predispose individuals to neuropsychiatric disorders. The study of endophenotypes is therefore particularly useful for understanding the mechanisms underlying the illness process of neuropsychiatric disorders, aiding clinicians in making accurate diagnoses and in early detection.

With support from the National Basic Research Programme of China (973 Programme) (2007CB512302), the Key Laboratory of Mental Health of the Institute of Psychology, Chinese Academy of Sciences organized a strategic symposium on endophenotypes, titled “Endophenotype strategy for psychotic disorders and summit meeting of Key Laboratory of Mental Health, Institute of Psychology, CAS”. Internationally renowned scholars in the field presented their latest findings on endophenotypes. Professor Irving Gottesman, the founder of the endophenotype concept for neuropsychiatric disorders, also gave a video lecture to all the participants. Their work was published in a special issue of Chinese Science Bulletin, 2011, Vol. 56(32).

Cognitive deficits have been widely recognized as core features of schizophrenia and as major contributors to the clinical outcome of the disorder. They are also widely studied as endophenotypes, reflecting a growing consensus that schizophrenia is a broader, more multidimensional illness than its formal diagnostic criteria suggest. Stone and Hsi adopted this evolving view of cognition, which underlies its use in recent initiatives for intervention and assessment in schizophrenia, and illustrated it with the MATRICS Consensus Cognitive Battery, a standardized battery of neuropsychological tests developed to assess the effectiveness of cognitive-enhancing treatments in schizophrenia. They then provided evidence for the use of neuropsychological deficits in the identification, validation and remediation of a liability syndrome for schizophrenia (‘schizotaxia’). Taken together, the inclusion of cognition in broader consortium and other collaborative efforts to assess interrelationships across multiple dimensions of function will provide an important catalyst for progress in each individual dimension. This use of cognition underscores its functional importance in the clinical outcome of schizophrenia, and helps to illuminate indicators of liability for schizophrenia that might be amenable to remediation.

On the other hand, Prof. Pak Sham and colleagues have addressed the statistical issues and approaches in endophenotype research. In this paper, they argued that in reality, a putative endophenotype is unlikely to be a perfect representation of the genetic component of disease liability. The magnitude of the correlation between a putative endophenotype and the genetic component of disease liability can be estimated by fitting multivariate genetic models to twin data. A number of statistical methods have been developed for incorporating endophenotypes in genetic linkage and association analyses with the aim of improving statistical power. The most recent of such methods can handle multiple endophenotypes simultaneously for the greatest increase in power. In addition to increasing statistical power, endophenotype research plays an important role in helping to understand the mechanisms which connect the associated genetic variants with disease occurrence. The causal pathways are likely to be very complicated and involve endophenotypes at multiple levels: from RNA expression profiles and patterns of protein expression, through neuronal and synaptic properties, to neurophysiological and neurocognitive function. Novel statistical approaches may be required for the analysis of the complex relationships between endophenotypes at different levels and how they converge to cause the occurrence of disease.

In the paper “Consortium for the Human Information and Neurocognitive Endophenotype (CHINE) in mainland China: An example from neurological soft signs for neuropsychiatric disorders”, Chan presented a strategic argument for establishing a central consortium for neuropsychiatric disorders research in mainland China, namely CHINE. CHINE is intended to pave the road map for neuropsychiatric disorders research: it would not only identify biosignatures for neuropsychiatric disorders but also serve as a central databank for examining the etiologies of major complex neuropsychiatric disorders, and as the main basis for corresponding treatment development. CHINE emphasizes two main features: supposedly universal basic cognitive functions, such as attention, and supposedly culturally specific social cognitive functions, such as emotion perception and expression. The consortium’s data collection spans the genetic level (susceptibility genes associated with major neuropsychiatric disorders), the neuroanatomical level (structural and functional imaging data), and the behavioral level (neurocognitive performance, social cognitive functioning, and neurological and clinical manifestations). Target groups include clinically diagnosed patients with neuropsychiatric disorders (mainly schizophrenia and bipolar disorder at present, to be extended to other clinical groups later), non-psychotic first-degree relatives of the patients, and healthy controls. Chan illustrated the steps for building the consortium with a promising endophenotype for schizophrenia, neurological soft signs. Notably, neurological soft signs also have potential translational use as a quick, quantifiable, sensitive and user-friendly bedside tool for early detection and screening in clinical practice.

It Could Be Dangerous Living In Ambridge

A series of unfortunate events? Morbidity and mortality in a Borsetshire village

With a risk of traumatic death far higher than the national average, rural life may not be so idyllic in Ambridge, the fictitious village in the BBC radio series, The Archers, finds research in the Christmas issue published on bmj.com today.

However, the study also shows that the overall mortality rate in Ambridge is slightly lower than the country as a whole, so characters should not worry unduly. Those who do not perish in a hideous accident have a good chance of living for a long time.

The author, Rob Stepney, wanted to investigate whether The Archers was any more true to life (and death) than TV soap operas. Established research has concluded that characters in Coronation Street and EastEnders have a higher risk of death than bomb disposal experts and racing drivers.

Could life in a radio soap opera be safer, Stepney wondered? He reviewed deaths, childbirths and episodes of serious illness in the series over a 20-year period (1991 to September 2011).

Of the 15 deaths recorded in Ambridge over the 20 years, nine were of male characters and six of female characters. This equates to a mortality rate of 7.8 per 1,000 population per year for men compared with 8.5 per 1,000 in England and Wales mid-way through the study period. For women in Ambridge, the mortality rate was 5.2 deaths per 1,000 compared with 5.8 per 1,000 nationally.

However, the chance of accidental death or suicide in Ambridge is worryingly high. During the study period there was a fatal road traffic accident, a death when a tractor overturned, Nigel Pargetter's fatal fall from a roof, and a suicide by a self-inflicted gunshot wound.

These figures translate into 27% of the total mortality in Ambridge. But in real life, in the year 2000, accidents accounted for only 4% of deaths in men.

To compensate for the 15 deaths in the past 20 years, 13 children have been born to the 115 characters. The annual live birth rate in Ambridge in 1992-2011 was 5.6 per 1,000 compared with 11.4 per 1,000 in England and Wales.
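The article's per-1,000 figures follow from simple person-year arithmetic. A minimal Python sketch (the roughly even sex split of Ambridge's 115 characters is an assumption made here for illustration; the article does not give the exact male population):

```python
def rate_per_1000(events, population, years):
    """Annual rate per 1,000 population: events divided by person-years, times 1,000."""
    return events / (population * years) * 1000

# 9 male deaths over 20 years; ~58 male characters is an assumed split
print(round(rate_per_1000(9, 58, 20), 1))    # 7.8 deaths per 1,000 men per year

# 13 births among all 115 characters over 20 years
print(round(rate_per_1000(13, 115, 20), 2))  # 5.65, roughly the article's 5.6 per 1,000
```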

Access to healthcare appears to be good in Ambridge, however. One character, Elizabeth Archer, who was born with a heart defect, recently had a life-saving implant operation. She was lucky as nationally fewer than 100 such implants were carried out in 2009.

In conclusion, Stepney says that, if characters in The Archers steer clear of accidents, Ambridge appears to be a safe place to live.

A Black Hole’s Dinner is Fast Approaching (Part 2)

Astronomers using ESO’s Very Large Telescope have discovered a gas cloud with several times the mass of the Earth accelerating towards the black hole at the centre of the Milky Way. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed. This Video News Release shows the new results and includes spectacular simulations of how the cloud will break up over the next few years.  credit: ESO

Behavioral Interventions Can Increase Condom Use, Reduce Sexually Transmitted Infections

Behavioral interventions aimed at reducing sexual risk behaviors, such as unprotected sex, are effective at both promoting condom use and reducing sexually transmitted infections (STIs) long after the initial intervention, according to a new report in the December 15 issue of the Journal of Acquired Immune Deficiency Syndromes.

Lead author Lori A. J. Scott-Sheldon, Ph.D., of The Miriam Hospital’s Centers for Behavioral and Preventive Medicine, and colleagues at the University of Connecticut conducted a meta-analysis of 42 studies evaluating the effectiveness of HIV-related behavioral interventions. The studies included in the meta-analysis assessed behavior changes, including increased condom use, and biological outcomes, such as a subsequent STI or HIV diagnosis. The studies were conducted worldwide and included a number of at-risk populations.

Researchers found that behavioral interventions — which included HIV education, motivation and skills-based training aimed at negotiating safer sex behaviors — were successful at improving condom use and reducing incident STIs, including HIV, for up to four years. This meta-analysis is believed to be the first to examine the incidence of HIV in a wide range of at-risk populations.

Scott-Sheldon says that while it may seem intuitive that behavioral changes, such as increased condom use, will result in fewer STIs, previous studies have been unable to support that assertion.

“The association between behavioral and biological outcomes is complex, since transmission of STIs depends on a number of factors, including partner type, characteristics, and perceptions of partner safety,” she says. “Examining both outcomes, and factors associated with sexual risk behaviors, should be important in determining the efficacy of behavioral interventions.”

The meta-analysis evaluated the findings of 67 behavioral interventions in 42 studies. While most studies were conducted in North America, 17 percent took place in Asia, 14 percent in Africa, 5 percent in Europe and 2 percent in South America. In most cases, participants were randomized to receive either an HIV-related behavioral intervention or a control condition. Interventions were provided in both group and individual settings.

Interventions were found to be more successful at improving condom use when social, cultural and economic barriers were addressed. Researchers also observed that, contrary to expectations, self-management training targeting risky sexual behavior did not significantly impact condom use one year after the initial intervention.

They also noted that participants were less likely to acquire STIs following the behavioral intervention if they were diagnosed with an STI or HIV at the time they entered the study. In addition, interventions were more successful at reducing the incidence of HIV when their samples included more Latino participants. The authors note that, globally, Latinos are disproportionately affected by HIV and interventions targeting this group are urgently needed to prevent HIV infection in this population.

“HIV infections cost the United States billions of dollars annually,” Scott-Sheldon says. “In the absence of an effective HIV vaccine, safer sexual practices and expanded prevention efforts are required to prevent new infections and reduce the burden of HIV. Translation and widespread dissemination of effective behavioral interventions within a wide range of population groups should be a high priority.”

Co-authors include Michael P. Carey, Ph.D., director of The Miriam Hospital’s Centers for Behavioral and Preventive Medicine; and Tania B. Huedo-Medina, Ph.D., Michelle R. Warren and Blair T. Johnson, Ph.D., of the University of Connecticut.

The principal affiliation of Lori A. J. Scott-Sheldon, Ph.D., is The Miriam Hospital (a member hospital of the Lifespan health system in Rhode Island). Dr. Scott-Sheldon also is an assistant professor of psychiatry & human behavior (research) at The Warren Alpert Medical School of Brown University.

The Beginning Of The End For Comet Lovejoy

The SOHO spaceborne solar observatory today captured comet Lovejoy in its field of view for the first time, indicating that the icy body is on its final destructive plunge towards the Sun.

Announced on 2 December, the newly discovered comet Lovejoy is on a near-collision course with the Sun and is expected to plunge to its fiery fate late on 15 December.

At its closest approach, it will pass just 140,000 km above the solar surface. At that distance, the icy comet is not expected to survive the Sun’s fierce heat.

Indeed, the comet is such a tenuous collection of ice and rock that it could disintegrate at any moment.

If the comet does stay the course, we will not see its demise because its closest approach will take place on the far side of the Sun.

The ESA/NASA SOHO spacecraft is an exceptional discoverer of comets, having spotted 2,110 since its launch in 1995.

However, comet C/2011 W3 was discovered from the ground by the Australian astronomer Terry Lovejoy, hence it now carries his name.

Terry was an early pioneer of using SOHO data over the Internet to discover comets. He can now claim to be the first person to discover a Sun-grazer from both ground and space telescopes.

Comet Lovejoy is from the “Kreutz group” — believed to be a fragment of a previous comet that broke up centuries ago.

Other fragments of that great comet have become some of the brightest in history: comet Ikeya-Seki became so bright in 1965 that it was visible even in the daytime sky.

Unfortunately, comet Lovejoy is not expected to become as bright as Ikeya-Seki.

“On average, new Kreutz-group comets are discovered every few days by SOHO, but from the ground they are much rarer to see or discover,” says Karl Battams, Naval Research Laboratory, who curates the Sun-grazing comets webpage.

“This is the first ground-based discovery of a Kreutz-group comet in 40 years, so we really can’t be sure just how bright it will get. However, I do think that it will be the brightest Kreutz-group comet SOHO has ever seen.”

Comet Lovejoy’s spectacular progress can be monitored via the web at SOHO’s LASCO instrument page.

Image Caption: Comet Lovejoy is the bright streak at the bottom of this image, taken by SOHO’s Large Angle Spectrometric Coronagraph (LASCO) C3 instrument. SOHO’s LASCO instrument is a coronagraph. It blocks out the light from the Sun’s disc, creating an artificial eclipse. With the central glare removed, fainter objects closer to the Sun can be seen clearly by the instrument. Credits: SOHO/LASCO (ESA/NASA)

Russia Blames HAARP Transmitter For Phobos-Grunt Failure

After abandoning efforts to save its Martian moon probe, Phobos-Grunt, which has been stranded in Earth’s orbit since early November, Russia is now focusing on where the blame lies for the expensive mishap. The accused: Alaska’s High-frequency Active Auroral Research Program (HAARP) transmitter.

Phobos-Grunt, now considered a 7.5 ton heap of space debris, is expected to plunge back to Earth around January 9, two months after it became stranded, according to the Russian space agency, Roscosmos.

Russian president Dmitry Medvedev has called for criminal prosecution of those responsible for failing to fulfill the country’s dream of finally launching a successful mission to Mars.

Shortly after Medvedev’s statement, a former Russian general found a more convenient scapegoat, blaming the often controversial radio facility outside Gakona for derailing Russia’s mission to the Red Planet.

Lt. Gen. Nikolay Rodionov, a retired commander of Russia’s ballistic missile early warning system, said US technology could have been the cause of the malfunction.

In a November 24 interview with the Russian news agency Interfax, Rodionov said “powerful American radars” in Alaska “could have influenced the control systems of our interplanetary rover.”

Rodionov was quoted as saying the US wants to use the ionosphere as part of its missile defense, although he did not elaborate. India’s newspaper The Hindu reported that Rodionov was likely referring to the US HAARP observatory, established in 1993.

However, analysis of the timing and physics involved shows that there is little basis for this claim.

The HAARP observatory sits on an Air Force-owned site in Gakona, Alaska. It is used periodically by scientists to run experiments in the ionosphere, usually a couple times per year. It was last operated on September 3 and was not turned on when the Phobos-Grunt probe had its malfunction, according to program director Craig Selcher, with the Air Force Research Laboratory, at Kirtland Air Force Base, New Mexico.

Even if HAARP had been turned on, a full-power blast would have hit the Russian probe with no more than 1.03 milliwatts of radio energy per square centimeter — about the same as pointing a 60-watt light bulb at it from 69 feet away, according to Selcher.

There are similar radar facilities operating in Norway, Russia, Peru and other locations, but HAARP is one of the most powerful. It is far more likely that the solar weather that constantly bombards the ionosphere played a role in disabling Phobos-Grunt than any man-made facility on the ground.

Image Caption: Color image of Phobos, imaged by the Mars Reconnaissance Orbiter in 2008. Credit: NASA

Keep Arms And Legs Hairy To Keep Bed Bugs Away

Looking to avoid confrontations with bed bugs? You will be better off not shaving your legs, according to researchers in the UK.
Twenty-nine volunteers bravely tested a theory put forward by Michael Siva-Jothy of Sheffield University’s Department of Animal and Plant Sciences, BBC News reports. Siva-Jothy found that more layers of both long and short body hair near the skin’s surface appeared to work as a deterrent to the blood-sucking insects, with the finer hairs acting as an early warning system.
Siva-Jothy explains, “Our findings show that more body hairs mean better detection of parasites – the hairs have nerves attached to them and provide us with the ability to detect displacement.”
“The results have implications for understanding why we look the way we do, what selective forces might have driven us to look the way we do, and may even provide insight for better understanding of how to reduce biting insects’ impact on humans.”
However, do not assume that men, who are generally hairier than women, will be bothered less by the insects: research shows that men do not appear to be bitten less often, The Daily Mail reports.
Siva-Jothy added that extreme hairiness might also be more of a disadvantage than an advantage, despite the logic of the theory: “If you have a heavy coat of long thick hairs it is easier for parasites to hide, even if you can detect them.”
“Our proposal is that we retain the fine covering because it aids detection and if we lost all hair, even the relatively invisible fine hair, our detection ability goes right down.”
Mark Pagel, an evolutionary biologist and professor at the University of Reading, said that biting parasites remain a major cause of disease and death worldwide, making them a potentially enormous evolutionary pressure on early man.
“This vellus hair is certainly no use for anything else, so it is a reasonable hypothesis that it developed in response to a strong selective pressure in our past. Mammals are unique in developing this wonderful fur, and humans are the only mammals to jettison it, so there must have been a very good reason to do so.”
The researchers are investigating the biology, reproduction and immunity of blood-sucking insects with the aim to find more effective controls for parasitic insects and the diseases they spread.
Results of the research are published in the Royal Society journal Biology Letters.

Gas Cloud En Route To Milky Way’s Black Hole

The massive black hole at the center of our Milky Way Galaxy is destined to be invaded by a gas cloud, creating a violent encounter, according to astronomers.

The supermassive black hole in the Milky Way is close enough for astronomers to study in detail, so the encounter could provide a unique chance for scientists to observe what until now has only been theorized.

The astronomers plan to find out how a black hole eats up gas, dust and stars as it grows even bigger.

“When we look at the black holes in the centers of other galaxies, we see them get bright and then fade, but we never know what is actually happening,” Eliot Quataert, a theoretical astrophysicist and University of California, Berkeley professor of astronomy, said in a press release.

“This is an unprecedented opportunity to obtain unique observations and insight into the processes that go on as gas falls into a black hole, heats up and emits light. It’s a neat window onto a black hole that’s actually capturing gas as it spirals in.”

Reinhard Genzel, professor of physics at both UC Berkeley and the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, said that he is interested in finding out what the front-row spectacle at the supermassive black hole will bring.

“The next two years will be very interesting and should provide us with extremely valuable information on the behavior of matter around such massive objects, and its ultimate fate,” Genzel said in a press release.

Genzel and colleagues have seen the gas cloud about three times the mass of Earth speeding up and falling deeper into the gravitational whirlpool of the black hole since 2008.

They said that the edges of the gas cloud are beginning to fray.

“It is not going to survive the experience,” first author Stefan Gillessen of the MPE said in a press release.

Gillessen built the infrared detector on the European Southern Observatory’s Very Large Telescope in Chile, which is used to observe the movement of stars and gas in the center of the Milky Way Galaxy.

Scientists say that by 2013, outbursts of X-rays and radio waves will be emitted as the cloud gets hotter and is obliterated by the black hole.  The cloud is mostly made up of hydrogen and helium gas.

The Chandra X-ray satellite has already scheduled its largest single chunk of observation time in 2012 near the Milky Way’s central black hole.

Since MPE astronomers began observing the black hole in 1992, they have only seen two stars as close as this gas cloud to the black hole.

The difference between those stars and the gas cloud is that those stars “passed unharmed through their closest approach, (while) the gas cloud will be completely ripped apart by the tidal forces around the black hole,” Gillessen said.

The cloud may have formed when gas pushed by stellar winds from two nearby stars collided.  The cloud is glowing under the strong ultraviolet radiation from surrounding hot stars.

As the cloud falls towards the black hole at a velocity of 1,460 miles per second, it will interact with the hot gas present in the accretion flow around the black hole and become disrupted by turbulent interaction.

The scientists were able to simulate the time evolution of the cloud, and predict that the temperature of the gas cloud should increase rapidly to several million Kelvin (about 12.6 million degrees Fahrenheit) near the black hole. The cloud is currently at 550 Kelvin, or about 530 degrees Fahrenheit.
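The temperature figures can be checked with the standard Kelvin-to-Fahrenheit conversion, F = (K − 273.15) × 9/5 + 32. A minimal Python sketch; the 7,000,000 K peak used below is not stated in the article and is inferred here from its Fahrenheit figure of roughly 12.6 million degrees:

```python
def kelvin_to_fahrenheit(kelvin):
    """Standard Kelvin-to-Fahrenheit conversion."""
    return (kelvin - 273.15) * 9 / 5 + 32

print(round(kelvin_to_fahrenheit(550)))        # 530: the cloud's current temperature in deg F
print(round(kelvin_to_fahrenheit(7_000_000)))  # 12599540: about 12.6 million deg F near the black hole
```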

The research will be published in the journal Nature.

Image 1: This view shows a simulation of how a gas cloud that has been observed approaching the supermassive black hole at the center of the galaxy may break apart over the next few years. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed and it is expected to break up completely during 2013. The remains of the gas cloud are shown in red and yellow, with the cloud’s orbit marked in red. The stars orbiting the black hole are also shown along with blue lines marking their orbits. This view simulates the expected positions of the stars and gas cloud in the year 2021. Credit: ESO/MPE/Marc Schartmann

Image 2: This simulated view shows a gas cloud (just above center, with its orbit shown in red) that has been observed approaching the supermassive black hole at the center of the Milky Way. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed and it is expected to break up completely during 2013. The stars orbiting the black hole are also shown along with blue lines marking their orbits. The stars and the cloud are shown in their actual positions in 2011. Credit: ESO

Image 3: These images taken over the last decade using the NACO instrument on ESO´s Very Large Telescope show the motion of a cloud of gas that is falling towards the supermassive black hole at the center of the Milky Way. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed and it is expected to break up completely during 2013. Credit: ESO/MPE

Alzheimer’s Drug Candidate Could Prevent Disease Progression

Salk scientists develop new drug that improves memory and prevents brain damage in mice
A new drug candidate may be the first capable of halting the devastating mental decline of Alzheimer’s disease, based on the findings of a study published Dec. 14 in PLoS ONE.
When given to mice with Alzheimer’s, the drug, known as J147, improved memory and prevented brain damage caused by the disease. The new compound, developed by scientists at the Salk Institute for Biological Studies, could be tested for treatment of the disease in humans in the near future.
“J147 enhances memory in both normal and Alzheimer’s mice and also protects the brain from the loss of synaptic connections,” says David Schubert, the head of Salk’s Cellular Neurobiology Laboratory, whose team developed the new drug. “No drugs on the market for Alzheimer’s have both of these properties.”
Although it is not yet known whether the compound will prove safe and effective in humans, the Salk researchers say their results suggest the drug may hold potential for the treatment of people with Alzheimer’s.
As many as 5.4 million Americans suffer from Alzheimer’s, according to the National Institutes of Health. More than 16 million will have the disease by 2050, according to Alzheimer’s Association estimates, resulting in medical costs of over $1 trillion per year.
The disease causes a steady, irreversible decline in brain function, erasing a person’s memory and ability to think clearly until they are unable to perform simple tasks such as eating and talking, and it is ultimately fatal. Alzheimer’s is linked to aging and typically appears after age 60, although a small percentage of families carry a genetic risk for earlier onset. Among the top ten causes of death, Alzheimer’s is the only one without a way to prevent, cure or slow disease progression.
Scientists are not sure what causes Alzheimer’s, which appears to emerge from a complex mix of genetic, environmental and lifestyle factors. So far, the drugs developed to treat the disease, such as Aricept, Razadyne and Exelon, only produce fleeting memory improvements and do nothing to slow the overall course of the disease.
To find a new type of drug, Schubert and his colleagues bucked the trend within the pharmaceutical industry of focusing exclusively on the biological pathways involved in the formation of amyloid plaques, the dense deposits of protein that characterize the disease. To date, Schubert says, all amyloid-based drugs have failed in clinical trials.
Instead, the Salk team developed methods for using living neurons grown in laboratory dishes to test whether or not new synthetic compounds were effective at protecting the brain cells against several pathologies associated with brain aging. Based on the test results from each chemical iteration of the lead compound, which was originally developed for treatment of stroke and traumatic brain injury, they were able to alter its chemical structure to make a much more potent Alzheimer’s drug.
“Alzheimer’s is a complex disease, but most drug development in the pharmaceutical world has focused on a single aspect of the disease: the amyloid pathway,” says Marguerite Prior, a research associate in Schubert’s lab, who led the project along with Qi Chen, a former Salk postdoctoral researcher. “In contrast, by testing these compounds in living cell cultures, we can determine what they do against a range of age-related problems and select the best candidate that addresses multiple aspects of the disease, not just one.”
With a promising compound in hand, the researchers shifted to testing J147 as an oral medication in mice. Working with Amanda Roberts, a professor of molecular neurosciences at The Scripps Research Institute, they conducted a range of behavioral tests that showed that the drug improved memory in normal rodents.
The Salk researchers went on to show that it prevented cognitive decline in animals with Alzheimer’s and that mice and rats treated with the drug produced more of a protein called brain-derived neurotrophic factor (BDNF), a molecule that protects neurons from toxic insults, helps new neurons grow and connect with other brain cells, and is involved in memory formation.
Because of the broad ability of J147 to protect nerve cells, the researchers believe that it may also be effective for treating other neurological disorders, such as Parkinson’s disease, Huntington’s disease and amyotrophic lateral sclerosis (ALS), as well as stroke.
The research was funded by the Fritz B. Burns Foundation, the National Institutes of Health, the Bundy Foundation and the Alzheimer’s Association.

Image Caption: Salk scientists develop J147, a synthetic drug shown to improve memory and prevent brain damage in mice with Alzheimer’s disease. Image: Courtesy of Salk Institute for Biological Studies

Simulation of Gas Cloud Approaching Black Hole

This simulation shows the future behaviour of a gas cloud that has been observed approaching the supermassive black hole at the centre of the Milky Way. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed and it is expected to break up completely during 2013.  credit:  ESO

FBI Claims More Municipal Systems Under Attack By Hackers

The Federal Bureau of Investigation (FBI) announced recently that key infrastructure systems of three US cities had been accessed by hackers. Such systems – commonly known as Supervisory Control and Data Acquisition (SCADA) – are increasingly being targeted by hackers following reports that they rely on weak security, BBC News reports.
Theoretically, the cyber break-ins could have resulted in sewage being dumped into a lake or power being shut off at a nearby mall, said Michael Welch, deputy assistant director of the FBI’s cyber division, at a recent cyber security conference.
Welch did not elaborate or name the cities where these break-ins occurred.
“We just had a circumstance where we had three cities, one of them a major city within the US, where you had several hackers that had made their way into SCADA systems within the city,” Welch told delegates at the Flemings Cyber Security conference.
“Essentially it was an ego trip for the hacker because he had control of that city’s system and he could dump raw sewage into the lake, he could shut down the power plant at the mall – a wide array of things,” he added.
Welch’s announcement follows two alleged break-ins to city water supplies. The first, in Springfield, Illinois, was later dismissed when the FBI could find no evidence of cyber-intrusion.
In the city of South Houston, Texas, a hacker named pr0f claimed to have broken into a control system that supplied water to the town. Pr0f claimed the system had only been protected by a three-character password which “required almost no skill” to get around, reports Chester Wisniewski, writing for the blog Naked Security.
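The "almost no skill" claim is easy to quantify: the keyspace of a three-character password is tiny. A minimal Python sketch; the 62-symbol letters-and-digits alphabet is an assumption made here for illustration, since the article does not say which characters the password allowed:

```python
import string

# Assumed alphabet: upper/lowercase letters plus digits (62 symbols)
alphabet_size = len(string.ascii_letters + string.digits)

weak = alphabet_size ** 3    # 238,328 combinations: exhaustible almost instantly
strong = alphabet_size ** 12 # ~3.2e21 combinations: infeasible to enumerate
print(alphabet_size, weak, strong)
```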
Security experts, such as Graham Cluley, senior security consultant at Sophos, predict a rise in such attacks: “Such systems have become a target partly because of all the chatter about the lack of security. Hackers are doing it out of curiosity to see how poorly they are protected.”
Cluley went on to express concern about the use of easily cracked default passwords, noting that information about some of these passwords was “available for download online”.
However, the firms that run SCADA systems, such as Siemens, often advise against changing default passwords, arguing that the disruption caused by changing them is a bigger problem than the threat from malware. “Not changing passwords is obviously slightly crazy. Proper security needs to be in place otherwise it is laughable,” said Cluley.

Elephant Seal Travels 18,000 Miles: WCS

The Wildlife Conservation Society tracked a southern elephant seal for an astonishing 18,000 miles — the equivalent of New York to Sydney and back again.

WCS tracked the male seal from December 2010 to November 2011. The animal — nicknamed Jackson — was tagged on the beach in Admiralty Sound in Tierra del Fuego in southern Chile. WCS conservationists fitted Jackson with a small satellite transmitter that recorded his exact location when he surfaced to breathe.

Jackson swam 1,000 miles north, 400 miles west, and 100 miles south from the original tagging location, meandering through fjords and venturing past the continental shelf as he foraged for fish and squid.

During this tracking, the WCS team analyzed the data to better understand elephant seal migratory routes.

Elephant seals are potential indicators of the health of marine ecosystems and may show how climate change influences the distribution of prey species that serve as the basis of Patagonia’s rich marine ecosystem. To protect this vast region, conservationists need to know how wildlife uses it throughout the year.

“Jackson’s travels provide a roadmap of how elephant seals use the Patagonian Coast and its associated seas,” said Caleb McClennen, WCS Director for Global Marine Programs. “This information is vital to improving ocean management in the region, helping establish protected areas in the right places, and ensuring fisheries are managed sustainably without harming vulnerable marine species like the southern elephant seal.”

The information WCS gathers will serve as a foundation for a new model of private-public, terrestrial-marine conservation of the Admiralty Sound, Karukinka Natural Park (a WCS private protected area), and Alberto de Agostini National Park. It will help build a broader vision for bolstering conservation efforts across the Patagonian Sea and coast.

“The Wildlife Conservation Society has a long history of working in the spectacular Patagonia region to establish protected areas and advance conservation of its rich wildlife,” said Julie Kunen, WCS Director of Latin America and Caribbean. “Individual stories like Jackson’s are awe-inspiring, and also inform the science that will ultimately help protect this region.”

WCS reports that Jackson has returned to Admiralty Sound, the site of the original tagging. Each year, elephant seals haul ashore in colonies to molt and find mates. The satellite transmitter is expected to work until early next year, when it will eventually fall off.

WCS has tracked more than 60 southern elephant seals via satellite on the Atlantic side of the Southern Cone since the early 1990s. Jackson represented the first southern elephant seal tagged from the Pacific side of the Southern Cone.

Elephant seals are among the largest pinnipeds in the world, reaching weights of up to 7,500 pounds and lengths of 20 feet.

Image Caption: WCS tracked the epic journey of “Jackson,” a young male elephant seal. Elephant seals are potential indicators of marine ecosystem health and may show how climate change influences the distribution of prey species in Patagonia’s oceans. © Wildlife Conservation Society

Insect Cuticle Inspires Low-Cost Material With Exceptional Strength, Toughness

“Shrilk” could one day replace plastic in consumer products, be used to suture wounds, and serve as scaffolding for tissue regeneration

Researchers at the Wyss Institute for Biologically Inspired Engineering at Harvard University have developed a new material that replicates the exceptional strength, toughness, and versatility of one of nature’s more extraordinary substances — insect cuticle. Also low-cost, biodegradable, and biocompatible, the new material, called “Shrilk,” could one day replace plastics in consumer products and be used safely in a variety of medical applications.

The research findings appeared Dec. 13 in the online issue of Advanced Materials. The work was conducted by Wyss Institute postdoctoral fellow, Javier G. Fernandez, Ph.D., with Wyss Institute Founding Director Donald Ingber, M.D., Ph.D. Ingber is the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Children’s Hospital Boston and is a Professor of Bioengineering at the Harvard School of Engineering and Applied Sciences.

Natural insect cuticle, such as that found in the rigid exoskeleton of a housefly or grasshopper, is uniquely suited to the challenge of providing protection without adding weight or bulk. As such, it can deflect external chemical and physical strains without damaging the insect’s internal components, while providing structure for the insect’s muscles and wings. It is so light that it doesn’t inhibit flight and so thin that it allows flexibility. Also remarkable is its ability to vary its properties, from rigid along the insect’s body segments and wings to elastic along its limb joints.

Insect cuticle is a composite material consisting of layers of chitin, a polysaccharide polymer, and protein organized in a laminar, plywood-like structure. Mechanical and chemical interactions between these materials provide the cuticle with its unique mechanical and chemical properties. By studying these complex interactions and recreating this unique chemistry and laminar design in the lab, Fernandez and Ingber were able to engineer a thin, clear film that has the same composition and structure as insect cuticle. The material is called Shrilk because it is composed of fibroin protein from silk and from chitin, which is commonly extracted from discarded shrimp shells.

Shrilk is similar in strength and toughness to an aluminum alloy, but it is only half the weight. It is biodegradable and can be produced at a very low cost, since chitin is readily available as a shrimp waste product. It is also easily molded into complex shapes, such as tubes. By controlling the water content in the fabrication process, the researchers were even able to reproduce the wide variations in stiffness, from elasticity to rigidity.

These attributes could have multiple applications. As a cheap, environmentally safe alternative to plastic, Shrilk could be used to make trash bags, packaging, and diapers that degrade quickly. As an exceptionally strong, biocompatible material, it could be used to suture wounds that bear high loads, such as in hernia repair, or as a scaffold for tissue regeneration.

“When we talk about the Wyss Institute’s mission to create bioinspired materials and products, Shrilk is an example of what we have in mind,” said Ingber. “It has the potential to be both a solution to some of today’s most critical environmental problems and a stepping stone toward significant medical advances.”

Image Caption: Shrilk is similar in strength and toughness to an aluminum alloy, but it is only half the weight. Shown here is a replica of an insect wing, which was made with the new material.

ADHD Drugs Do Not Up Risk For Heart Problems

Researchers, funded by the U.S. Food and Drug Administration and the Agency for Healthcare Research and Quality, recently studied the cardiovascular risk of ADHD medications in adults.

The study found little evidence that the medications increased the risk of heart events, such as heart attack, sudden cardiac death, and stroke. The concerns stem from the fact that some of the medications can slightly increase heart rate and blood pressure.

According to the study, “Placebo-controlled studies in children and adults indicate that stimulants and atomoxetine [a medication used to treat ADHD] elevate systolic blood pressure levels by approximately 2 to 5 mm Hg and diastolic blood pressure levels by 1 to 3 mm Hg and also lead to increases in heart rate.” But the researchers warn that, “[although] these effects would be expected to slightly increase risk for myocardial infarction, sudden cardiac death, and stroke, clinical trials have not been large enough to assess risk of these events.”

The study was conducted using computerized records from four study sites, starting in 1986 and ending in 2005. Participants were adults 25 through 64 years of age who were given prescriptions for methylphenidate, amphetamine, or atomoxetine.

During follow-up, the researchers found 1,357 heart attacks, 296 cases of sudden cardiac death, and 575 cases of stroke. When the data were analyzed, they found that current or new use of ADHD medications, compared with nonuse or remote use, was not associated with an increased risk of cardiovascular events.

Laurel Habel, PhD, of the Kaiser Permanente Northern California Division of Research and lead author of the study, notes, “It’s important to note that this is an observational study and not a randomized clinical trial.”

She notes that most of the drug use in the study occurred for less than a year, so the results may not apply to long-term ADHD drug users. The study also did not follow patients over the age of 65.

The study is published in the current issue of the Journal of the American Medical Association (JAMA).

Exercise/Memory Research For Parkinson’s

Study to see if walking and/or memory training may prevent memory problems in people with Parkinson’s disease

Researchers from the University of Maryland School of Medicine and the Baltimore VA Medical Center have launched a study of exercise and computerized memory training to see if those activities may help people with Parkinson’s disease prevent memory changes. The type of memory that will be examined is known as “executive function”; it allows people to take in information and use it in a new way. Many Parkinson’s patients develop problems with executive function, which can prevent them from working and may eventually require a caregiver to take over more of the complex cognitive tasks of daily living.

“Studies of normal aging show that memory and executive function can be improved with exercise, such as walking several days a week,” explains Karen Anderson, M.D., principal investigator and an assistant professor of neurology and psychiatry at the University of Maryland School of Medicine. Dr. Anderson is also a neuro-psychiatrist at the Maryland Parkinson’s Disease and Movement Disorders Center at the University of Maryland Medical Center and a clinician in mental health at the Baltimore VA Medical Center.

She adds, “We want to see if exercise can slow or reverse some of these memory changes in Parkinson’s patients. We will also investigate whether a computer game designed to improve executive function may make a difference as well. The other question is, what happens when you put the two interventions together — if there is memory improvement, will it be even better than with one of the interventions? Or is it more efficient to do just one or the other? We really do not know.”

The researchers, who received funding through a VA Merit Award, plan to enroll about 90 patients who will be divided randomly into three groups: exercisers walking on a treadmill, memory game players, and those doing both exercise and memory games. Participants in each group will receive a memory assessment at the beginning of the study. They will come in three times a week for their training for three months and will then be tested again. Three months after that, the researchers will test the participants again to see if there may be longer-term benefits to the training.

With both the treadmill walking and the memory game, the exercise or video game will become more challenging as the participant improves. The memory training works like a video game with players advancing to a higher level of difficulty. For the exercisers, trainers may increase the speed or slope of the treadmill to make it more aerobically challenging.

“This new study builds on our experience from a previous study of exercise for gait and mobility in Parkinson’s disease. Since both motor function and cognitive function are important for mobility and performance of daily activities, this new study will investigate the individual and combined effects of treadmill training and cognitive training,” explains Lisa Shulman, M.D., co-investigator and professor of neurology at the University of Maryland School of Medicine.

“Parkinson’s patients are eager to know if there is anything they can do to give them greater control over their condition. Mobility and memory are the two key components to preserve independence. If these treatment strategies are found to be effective, we will learn important new approaches to delaying disability,” says Dr. Shulman, who is co-director of the Maryland Parkinson’s Disease and Movement Disorders Center.

The treadmill training will take place at the Baltimore VA Medical Center in the Maryland Exercise and Robotics Center of Excellence, a gym facility with specialized equipment for people with physical limitations or balance issues. Participants will wear a safety harness while walking on the treadmill, and experienced exercise physiologists will supervise each training session.

The computerized memory training game will take place both at the VA and University of Maryland School of Medicine.

“This study shows the commitment of our University of Maryland faculty to exploring new approaches, such as exercise and memory training, to help patients around the world with illnesses such as Parkinson’s disease,” says E. Albert Reece, M.D., Ph.D., M.B.A., vice president for medical affairs, University of Maryland, and dean, University of Maryland School of Medicine.

The Maryland researchers expanded the exercise studies to Parkinson’s patients after first finding success with treadmill training for stroke patients. This research, also conducted at the University of Maryland School of Medicine and the VA Maryland Health Care System, found that regular exercise on a treadmill can improve stroke patients’ walking ability even years after they’ve had a stroke.

Co-investigator Richard Macko, M.D., says, “With stroke patients, we have seen that the consistent, repetitive motion of walking may help the brain to develop new connections to compensate for the damaged ones. This new Parkinson’s study takes the concept of exercise training for neurology patients in a new direction. We will be interested to see if this consistent training will produce benefits to memory.” Dr. Macko is director of the Maryland Exercise and Robotics Center of Excellence at the VA Maryland Health Care System and professor of neurology at the University of Maryland School of Medicine.

Hydroxyurea Does Not Cause Genetic Damage In Children With Sickle Cell Anemia

Young infants and toddlers with sickle cell anemia who received the drug hydroxyurea were no more likely to have cellular genetic damage than those who received a placebo (an inactive medicine), said researchers from Baylor College of Medicine in Houston and St. Jude Children’s Research Hospital in Memphis, Tenn., yesterday at the American Society of Hematology’s annual meeting in San Diego, Calif.

These findings provide further evidence of the safety of hydroxyurea for children with sickle cell anemia, said the study’s lead author Dr. Patrick T. McGann, a clinical fellow of pediatric hematology and oncology at BCM and the Texas Children’s Hematology Center. There has been some concern that hydroxyurea therapy could lead to genetic damage at the cellular level and increase the risk of developing cancer.

Inherited blood disorder

Sickle cell anemia is an inherited blood disorder caused by a mutation in the beta-globin gene. The red blood cells of individuals affected with the disorder are shaped like a sickle or crescent, unlike normal red blood cells, which are disc-shaped. Normal, healthy red blood cells move through blood vessels easily, carrying hemoglobin, the critical protein necessary for the delivery of oxygen throughout the body. The stiff, sickle-shaped red blood cells do not flow as easily and often get stuck in small blood vessels, decreasing oxygen delivery to legs, arms and other organs and resulting in severe pain, increased risk of infection and chronic organ damage.

Hydroxyurea was approved by the U.S. Food and Drug Administration as a treatment for adults with sickle cell disease in 1998, but it has not yet been approved for use in children. Hydroxyurea reduces the amount of sickle hemoglobin by stimulating the production of fetal hemoglobin, which results in healthier red blood cells and a reduction in the frequency and severity of the complications associated with the disorder.

“It is important to demonstrate the low genotoxic risks associated with hydroxyurea for children with sickle cell anemia given that hydroxyurea was historically used as a chemotherapeutic treatment for cancer,” said McGann, working under the mentorship of Dr. Russell E. Ware, professor of pediatrics at BCM and director of the Texas Children’s Center for Global Health and of the Texas Children’s Hematology Center.

The research team used data collected from the Pediatric Hydroxyurea Phase III Clinical Trial (BABY HUG), a multi-center study whose primary goal was to investigate the ability of hydroxyurea to prevent chronic organ damage in very young patients with sickle cell anemia.

A total of 193 infants (average age, 13.6 months) were assigned at random to receive hydroxyurea or placebo for two years. After studying the effect of the drug, the researchers found no evidence of increased genetic damage in the children who received hydroxyurea compared with those who received placebo.

Effective treatment

“We found that young children treated with hydroxyurea for two years did not demonstrate evidence of increased DNA damage when compared to children receiving placebo,” said McGann. “These data represent another important piece of evidence that this very effective treatment is safe for young children.”

Other investigators at BCM who contributed to this study include Ware; Dr. Jonathan Flanagan, assistant professor of pediatrics at BCM; and Thad Howard.

Other investigators include Dr. Stephen Dertinger of Litron Laboratories in Rochester, New York; Dr. Jin He of St. Jude Children’s Research Hospital in Memphis, Tenn.; Dr. Anita Kulharya of the Georgia Health Sciences University in Augusta; and Dr. Bruce Thompson of Clinical Trials and Surveys Corp. in Owings Mills, Md.

The Brain On Trial

Three experts discuss the rising influence of neuroscience in the courtroom, how advances in neuroscience are posing new challenges for the judicial system, and the use of therapeutic solutions for reforming criminals.

Increasingly, what we know about the brain is affecting what happens during and after criminal trials. Recently, for instance, the U.S. Supreme Court decided that adolescents could not be eligible for the death penalty, based in part on neuroscience evidence indicating that the teen brain is not fully mature. Defense attorneys have also used brain scans to suggest their clients’ actions were influenced by brain damage or abnormalities, as part of an argument for modified sentences. And the recent view of drug addiction as a brain disorder has raised questions about how drug addicts should be held accountable for their criminal behavior.

The rising influence of neuroscience in the courtroom was the focus of the Fred Kavli Public Symposium, held in November at the Society for Neuroscience’s “Neuroscience 2011.” Titled “The Brain on Trial: Neuroscience and the Law,” the symposium was chaired by Alan Leshner, Chief Executive Officer of the American Association for the Advancement of Science and former head of the National Institute on Drug Abuse. One of the key issues this symposium explored was how advances in neuroscience are posing serious challenges for the judicial system, as well as possible solutions for the treatment of criminals.

The Kavli Foundation held a teleconference to discuss this topic. Along with Dr. Leshner, the participants included cognitive neuroscientist and neuroethics expert Martha Farah, director of the Center for Neuroscience and Society, University of Pennsylvania. Also joining the dialogue was neuropsychiatrist Jay Giedd, MD, an expert in adolescent brain development at the National Institute of Mental Health and chief of NIMH’s Unit on Brain Imaging in the Child Psychiatry Branch.

Together, they discussed what role neuroscience should have in determining legal policies and judgments, innovative brain-based treatments for certain pathological behavior, and how we are easily fooled by colorful brain scans. They also shared opinions on whether “painting a picture of the neural processes that give rise to criminal behavior,” as Dr. Farah put it, can excuse that behavior or mandate lighter sentencing. Below is the edited transcript of that discussion.

THE KAVLI FOUNDATION: Dr. Leshner, to begin, in the context of neuroscience and the law, what does your field reveal about personal responsibility that you feel is not well understood?

ALAN LESHNER: There’s been a lot of controversy around the increased understanding that addiction is a brain disease and the implications of that for personal responsibility or law. The fact that you have a brain disease means that biologically, change in your brain has led to compulsive repeated drug use. But it doesn’t mean that you have no responsibility for any of your behavior. Are you less responsible? In a way you are, and that should be taken into account in sentencing; but more to the point, it means we should require treatment while we have drug addicts under criminal justice control because that would decrease significantly the probability they would commit crimes again. Although people may not want treatment initially, they become more involved in it as they are required to participate in drug treatment programs and the treatment is ultimately effective. So I believe drug addicts should be required to go into treatment if they commit an act against society. More criminal justice institutions are implementing treatment while people are under criminal justice control.

TKF: Dr. Giedd, as an expert in adolescent brain development, what is the proper use of your insights into the brain for determining in a courtroom whether a person is responsible for his or her actions?

JAY GIEDD: What the neuroscientist is often asked is, “Was it the person, or was it his or her brain, that made them commit a crime?” To me this is a false dichotomy because the person and the brain are inseparable–everything we think, feel or do, all of our motivations, urges, and our ability to act or not act upon those urges is ultimately a product of our brain activity. This means that, even if we can trace the person’s actions back to the brain, for the most part that doesn’t change the personal responsibility for the crime, which is more of a social, moral, or philosophical issue. There are some rare but remarkable exceptions where someone might have brain damage or a brain tumor that dramatically affects their capacity to make decisions. But most often, it’s not a matter of the brain being qualitatively different from that of other people, but more a matter of the brain reflecting a lifelong accumulation of experiences in which genes interact with the environment in very complex ways. In general, people overvalue the ability of brain imaging or neuroscience to cut through that complexity.

TKF: What about the sentencing of a convicted criminal. Does neuroscience have a role in determining or tempering the legal consequences of a crime, particularly among teenagers?

JAY GIEDD: There currently is debate about whether adolescent brain maturity should be considered in sentencing for a crime or help determine what would be the proper deterrent for future adverse behavior. On average, the 15-year-old brain is different than the 25-year-old brain. What’s difficult is applying this to individuals, as there are so many exceptions to the rule — there are many mature teens, and likewise immature people in their twenties and thirties. So the real challenge is going from group averages to individual prediction or characterization. One pretty strong group phenomenon is that teenagers don’t really think about the future the same way that older people do, so long-term consequences are less of a deterrent to their behavior. Teens are less likely to weigh long-term consequences over the immediate consequences. The other strong adolescent phenomenon is heightened sensitivity to peer pressure. The judicial system could focus on both of these adolescent phenomena when determining deterrents and interventions for adolescent criminal behavior.

TKF: Dr. Farah, what are your thoughts on this topic?

MARTHA FARAH: I agree with Jay and Alan. The mere fact that brain processes give rise to the behavior isn’t enough to excuse it. The fact that your behavior had causes doesn’t diminish your responsibility for it in the eyes of the law. But the law does recognize some psychological conditions that diminish responsibility, and if neuroscience knows something about the neural processes underlying these conditions, it can aid in their diagnoses. If the kind of neuroimaging research that Jay does continues for a few more decades, we may well get to the point where we can point to a brain signature in the brain scans of people who have not yet developed an ability to think about the future. Then it would be really reasonable for an attorney to bring in that kind of a brain scan and say, “This young person was developmentally incapable of thinking about the consequences of what he just did.” That would be a case of neuroscience appropriately helping to excuse someone, but it wouldn’t be doing it just by showing there’s a brain basis to the behavior. It would be doing it by showing that a certain psychological ability wasn’t there.

TKF: So is it ever a valid defense to simply say, “My brain made me do it”?

JAY GIEDD: The Charles Whitman case of 1966 comes to mind–he was the Texas tower shooter who killed 16 people and wounded 32 others. Brain scans revealed he had a brain tumor, which may have led to these aggressive, uncontrolled actions. Those cases, in which there is blatant damage to the brain that is obviously linked to the behavior that occurred, are rare, but they certainly can happen. What’s more difficult to discern are the much more subtle brain differences that may not necessarily be linked to the behavior. In these cases, the colorful brain scans introduced in courts can be almost irrationally persuasive. One study showed that if you put an image into a scientific paper that had nothing to do with the content, other scientists will rate the paper as more logical, better written, and more persuasive.

TKF: A picture is worth a thousand words.

JAY GIEDD: Yes, but not necessarily true words. It’s very seductive to have an aesthetically beautiful brain scan that brings a sense of certainty to something that really isn’t certain. Human behavior is much more nuanced and complex than what some of these images are showing. There are several reasons why brain scans are often misinterpreted and can’t be counted on. One is that you’d really have to be scanning the person at the time of the crime to accurately capture what their brain was like then, because brain states change second-by-second. Another reason is that there’s always the danger of making reverse inferences. An example of a reverse inference is “All dogs have four legs, a chair has four legs, therefore a chair is a dog.” People will look at a brain scan and say “Aha, there’s a bright color indicating heightened activity in this area of the brain that is involved in impulsivity,” even though there are thousands of other activities besides impulsivity that would also cause that same bright color in that spot of the brain. People notoriously overestimate our ability to go from the pictures, aesthetically pleasing as they are, to predicting behavior.
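Giedd’s point about reverse inference can be restated in terms of base rates: because many mental states activate the same region, observing activation is weak evidence for any one of them. The sketch below applies Bayes’ rule with entirely made-up, illustrative numbers (the probabilities are assumptions, not figures from any study).

```python
# Reverse inference: P(impulsive | region active) is NOT P(region active | impulsive).
# All numbers below are hypothetical, chosen only to illustrate the fallacy.

p_active_given_impulsive = 0.90  # region lights up in 90% of impulsive moments
p_active_given_other = 0.30      # ...but also in 30% of all other mental activity
p_impulsive = 0.05               # impulsive moments are 5% of moments scanned

# Total probability that the region is active at a randomly chosen moment.
p_active = (p_active_given_impulsive * p_impulsive
            + p_active_given_other * (1 - p_impulsive))

# Bayes' rule: how likely impulsivity is, given only that the region was active.
p_impulsive_given_active = p_active_given_impulsive * p_impulsive / p_active

print(round(p_impulsive_given_active, 2))  # prints 0.14
```

Even with a region that is highly sensitive to impulsivity, the activation by itself implies only about a 14% chance the person was being impulsive, because the competing explanations are so much more common.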

TKF: What has been your experience in how neuroimaging plays out in the courtroom?

JAY GIEDD: In the court situations I’ve been involved in, attorneys present me with case histories of incredibly impulsive behavior–of young children hopping into police cars and driving them off without thinking about the consequences–and ask me to do brain scans to see if these kids are impulsive, but that’s a backwards way to approach this. Everything they’ve just said clearly demonstrates that indeed these kids are very impulsive, and if their brain scans show heightened activity in an area that’s involved in impulsivity, okay, but if they didn’t show that, I just wouldn’t believe the images. We can use imaging to tell if someone can do long division just by showing them a problem while they’re in the scanner for about $600, or we can get the same information for three cents just by pulling out a pencil and paper and asking them what’s the answer to a long division problem. It’s a fundamental misconception that the imaging can confirm certain behavioral realities.

TKF: It seems like you are being asked to make a distinction between normal and abnormal behavior based on neuroimaging or other neuroscience evidence. The determination that drug addiction is a disease also suggests we have clear diagnostic endpoints that really distinguish normal from abnormal behavior. But is the evidence ever clear cut in certain cases? Is it really obvious when someone is drug addicted?

ALAN LESHNER: It’s easily determined whether someone is addicted with current diagnostic tools, such as the addiction severity index. But all this knowledge of brain involvement in addiction helps you put the issue in the right box. Traditionally we have thought of addiction only as a criminal justice issue and therefore everything we did related to it had to do with a criminal justice approach. But once you understand that it’s a brain disease and therefore a health issue, you realize the only effective policies in modern society are those that recognize both the health aspect of drug addiction–that drug addicts commit crimes like burglary in order to secure drugs–and that society feels they ought to be held responsible for those crimes. The issue for me is not so much what’s the appropriate punitive approach, but what will work best for society. If you don’t deal with the illness or the brain part of drug addiction, you have much less chance of actually reducing the behavior you don’t like, whether it’s drug use or committing crimes. By combining the health approach with the criminal justice approach, I think you have a better chance of having effective and acceptable public policy.

TKF: Do prosecutors ever argue that a brain scan proves a person is hopelessly addicted or has some other problem that suggests they shouldn’t be allowed to join society?

ALAN LESHNER: It does happen, but when that happens it’s a misrepresentation. In fact, we have quite effective treatments for drug addiction that are as effective as those for other chronic relapsing disorders. It’s not the same as taking penicillin for your strep throat, but there are very effective treatments that can help bring the addiction under control.

MARTHA FARAH: Where the strategy of presenting brain scans as evidence in a trial can backfire is for psychopaths. Psychopaths are likely to commit crimes and, unlike drug addicts, there are no effective treatments for them. So the defense could show a brain scan of a psychopath and say, “Look; this guy’s brain doesn’t work like a normal person’s brain–he’s wired up wrong and couldn’t have shown empathy and moral judgment,” with the aim of getting a lighter sentence. But the judge or jury can hear this and decide it makes him more dangerous and incorrigible so they had better lock him up and throw away the key.

TKF: What role do you think science should have in helping society or the courtroom determine when and how much someone should be held responsible for his or her actions?

ALAN LESHNER: Policy, including criminal justice policy, is always made on the basis of both scientific facts and societal values and we’re never in a situation where you can expect that science will drive the development of policy all by itself. Having said that, I would want to be assured that the science has been fully considered, and that we’re not denying it by diluting or distorting it in the process of making public policy, including criminal justice policy.

TKF: You’re saying that science can provide the facts, but society still has to put a value judgment on top of it?

ALAN LESHNER: Absolutely. It doesn’t have to, it will. You need to recognize that that’s a part of the way the world works.

JAY GIEDD: There are several ways neuroscience can help shape judicial or other social policies, including influencing determinations of appropriate punishments for adolescent crimes or legal age minimums for certain behaviors or political offices. One of the things in the neuroscience of adolescence that is getting more accepted is the importance of learning by example, by modeling. The brain is well suited to learn by example, by modeling, and from experience. So incarcerating people in the second decade of their lives will mean their learning will come from other criminals and the judicial system. We have to be aware of the lifelong consequences of having people locked up at this time of life when their brains are specializing based on their experience. The danger is we might have them specialize in being criminals. Also, around the world people are struggling with what should be the appropriate age minimums for voting, driving, becoming a Congressman or president, or having other privileges in society. I think the neuroscience could help with those kinds of policy decisions or laws. Neuroscience findings on adolescents, for example, led to many states adopting graduated driving licenses, which have saved a lot of lives. So neuroscience can be helpful in creating policies related to keeping teens safe and healthy, more so than determining punishment and retribution after they’ve committed crimes. Neuroscience may have a place at the table for helping us understand these ideas of competency in major life decisions.

TKF: It sounds like you’re more comfortable making scientific judgments about adolescents as a group as opposed to determining from an individual adolescent’s brain scan whether he or she is prone to abnormal behavior.

JAY GIEDD: Exactly, and that’s really been our greatest challenge, not just for the judicial system, but clinically as well. We can say the average schizophrenic brain is different than that of the average person without schizophrenia, but we can’t put an individual in a scanner and tell if they have schizophrenia, or depression, ADHD, autism, etc. There’s a lot of group effort trying to improve our ability to go from the group average to the individual level. The amount of progress we make on that will affect how much neuroimaging findings should influence judicial proceedings.

MARTHA FARAH: Some of the most interesting implications of the neuroscience findings are for how criminals should be punished, as opposed to the question of how responsible they should be for their behavior. We punish for a variety of reasons: for retribution, to incentivize good behavior, and in some cases, to improve the offender. This last one is sometimes called “therapeutic justice”–we send someone to anger management therapy or parenting classes. A lot of good could come from getting society more strongly behind the idea of therapeutic justice. If people’s brains are causing them to do bad things and if we can fix their brains or foster healthier development of their brains, we’ll all be safer. The individual offender is better off and society is better off. So, by putting crime within that public health framework, it makes the idea of therapeutic justice more appealing. As clinical neuroscience advances, it’s actually going to give us ways of accomplishing therapeutic justice.

TKF: So along with being punished, someone’s sentence might include drug therapy to alter how they think and behave?

MARTHA FARAH: The therapy could actually be the sole punishment, or be part of someone’s sentence. We already sentence sex offenders to anti-androgen therapy–chemical castration. And that treatment works on the brain in addition to other parts of the body. It is a central nervous system intervention that’s supposed to help curb their antisocial urges and keep people safer. We may have more such therapies available as neuroscience goes forward.

TKF: So in the future, people might be incarcerated less and instead treated for behavioral offenses.

MARTHA FARAH: Yes, although it does raise the specter of Brave New World and state control of brain processes. The science can tell us what we can do, but society’s values will tell us whether we should do it.

ALAN LESHNER: I agree. And I do believe that people should be required to go into treatment, whether they want it or not, if they are addicted and commit an act against society. You’re not totally controlling the individual with that treatment–it’s not like they are turned into zombies. They go through treatment programs that don’t do away with their free will. It’s important to have a perspective on the degree of mind control that’s being exerted here–it’s not very great. So to second what Martha said, both from a societal standpoint and an individual standpoint, they’re better off getting that treatment.

TKF: What would a therapeutic punishment be for an adolescent whose brain is just not mature enough? There's no treatment that can speed up that maturity.

JAY GIEDD: The adolescent brain is very plastic and there are things you can do to improve it, from modeling more healthy behaviors to working on impulse control. Brain-based interventions do not have to be deterministic. It's not like one morning you wake up and your brain's mature–it's a very protracted process. But a lot can be done to help adolescents make better choices and have different peer systems. I completely agree with Martha and Alan that it could be a very good thing for that adolescent if they are mandated to get these services that they might not otherwise get because of finances or peer pressures. If we really want to make society safer and better, it would be a very good investment. You can't move the clock to fast forward, in terms of brain development. But just because someone is an adolescent doesn't mean you can't improve their decision-making.


Studies Assess Hydroxyurea Therapy And Pre-Operative Transfusions For Patients With Sickle Cell Disease

Research assessing the safety and efficacy of hydroxyurea therapy in pediatric patients with sickle cell disease (SCD) and the use of pre-operative transfusions for patients with SCD who undergo low- and moderate-risk elective surgery will be presented today at the 53rd Annual Meeting of the American Society of Hematology.

An estimated 90,000 to 100,000 Americans are affected by SCD, a serious disorder that causes normal red blood cells to become rigid and take on a crescent “sickle” shape. The abnormal shape of these cells causes them to clump together and become embedded in the blood vessels of organs, causing pain, infection, potential organ damage, and stroke. Even with advancements in drug therapies and prevention methods, safe and effective treatment options remain limited, especially for children and for patients facing surgery, who have an increased risk for complications.

“The studies presented today underscore the need to assess the quality and effectiveness of therapy for sickle cell disease, particularly in the pediatric population, and investigate the strategic use of pre-operative transfusion for sickle cell patients,” said Susan B. Shurin, MD, moderator of the press conference and Acting Director of the National Heart, Lung, and Blood Institute (NHLBI) in Bethesda, Md.

Dr. Shurin, who has been instrumental in the U.S. Department of Health and Human Services' (HHS) recent initiative to promote research advances in SCD, stated, “I am hopeful that discoveries like these, combined with continuing efforts to develop new treatments, share public health data, and provide evidence-based guidelines, will soon lead to significant improvement in the lives of patients with SCD.”

This press conference will take place on Sunday, December 11, at 10:00 a.m. PST.

Hydroxyurea Treatment of Young Children with Sickle Cell Anemia: Safety and Efficacy of Continued Treatment — the BABY HUG Follow-up Study [Abstract 7]

Hydroxyurea is the only federally approved therapy to prevent sickle cell complications in adults with sickle cell anemia (SCA). However, based on positive results from previous trials assessing clinical benefits for use in children, specialists are increasingly considering the use of hydroxyurea in their pediatric patients. Results from the BABY HUG Follow-up Study I suggest that continued use of hydroxyurea is both safe and effective in infants with SCA.

The Pediatric Hydroxyurea Phase III Clinical Trial (BABY HUG) was a multicenter, randomized clinical trial that assessed the clinical benefits of hydroxyurea in very young patients with SCA. Results from the study demonstrated that hydroxyurea administered to infants with SCA provided substantial clinical benefit over placebo.

Researchers launched the BABY HUG Follow-Up Study I in 2008 to assess the safety and efficacy of continued treatment with hydroxyurea in infants with SCA. The Follow-Up Study I included 163 children between the ages of 28 and 44 months who had participated in the BABY HUG trial and who had completed at least 18 months of randomized treatment with either hydroxyurea or placebo. Investigators collected clinical and laboratory data every six months from patient medical records, including use and dosage of hydroxyurea, blood counts, clinical imaging, and frequency of sickle cell-related complications.

At Follow-Up Study entry, families enrolling their children did not know their child's randomized study treatment assignment in BABY HUG; 82 percent initially chose clinical prescription of open-label hydroxyurea, demonstrating high acceptance of the drug. Through the 36 months of follow-up, acceptance remained high, with 68 to 75 percent of the participating families reporting that their children continued to take hydroxyurea.

Follow-Up Study I data indicate that children who continue to take hydroxyurea have statistically lower rates of pain crises requiring emergency room visits, episodic transfusions, and hospital admissions for any reason when compared to those taking placebo. These clinical benefits are similar to those demonstrated by the drug in BABY HUG, the results of which were published earlier this year and are consistent with previously published trials that detail the therapy´s benefits in older children and adults.

“Our study data reveal not only that the clinical benefits of hydroxyurea continue with ongoing administration, but also the wide acceptance of the treatment by the families of our patients, demonstrated by the high percentage of families that continued their children on hydroxyurea after the randomized trial ended,” said lead author Zora R. Rogers, MD, Professor of Pediatrics at UT Southwestern Medical Center Dallas and Clinical Director of the Bone Marrow Failure and General Hematology Program at Children's Medical Center Dallas. “Analysis of growth and development assessments obtained in the Follow-Up Study along with these clinical results will further enhance our understanding of the benefits of starting hydroxyurea in children with sickle cell disease at a very young age.”

The BABY HUG Follow-Up Study I is funded by the National Heart, Lung, and Blood Institute (NHLBI) and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD).

Dr. Rogers will present this study in an oral presentation on Sunday, December 11, at 4:30 p.m. PST at the Manchester Grand Hyatt in the Elizabeth Ballroom AB.

Genotoxicity Associated with Hydroxyurea Exposure in Infants with Sickle Cell Anemia: Results From the BABY-HUG Phase III Clinical Trial [Abstract 8]

Using data from the largest clinical trial to date to assess the use of hydroxyurea in pediatric patients with sickle cell anemia (SCA), researchers have provided further evidence that the therapy likely does not cause long-term genetic damage (known as genotoxicity) in young patients with SCA.

To assess whether hydroxyurea potentially causes genotoxic effects in infants with SCA, researchers analyzed patient data from the Pediatric Hydroxyurea Phase III Clinical Trial (BABY HUG). This multicenter, randomized clinical trial assessed the clinical benefits of hydroxyurea in infants with SCA.

In this study, 193 infants between the ages of nine and 18 months were randomized to receive either hydroxyurea or placebo over the course of two years. An important secondary objective of the study was the in vivo measurement of acquired genotoxic effects of hydroxyurea, which was obtained by tracking the frequency of several laboratory-based measures of DNA damage. These markers include breaks in chromosomes and chromatids, abnormal recombination of DNA in cells of the immune system, and the formation of micronucleated reticulocytes (abnormal young red blood cells).

At the conclusion of the study, children receiving hydroxyurea did not show any significant differences in the numbers of any of the damage markers when compared with children on placebo. These data suggest that conventional doses of hydroxyurea do not appear to cause DNA damage as suggested by laboratory-based experiments that have used higher concentrations of the drug, providing reassurance that hydroxyurea presents low genotoxic risk to children.

“Although the clinical benefits of hydroxyurea for children with sickle cell disease are well recognized, treatment in young patients is limited due to concerns about potential long-term genotoxic effects,” said lead author Patrick T. McGann, MD, Fellow in the Department of Pediatrics in the Hematology-Oncology Section at Baylor College of Medicine in Houston. “Results from this study contribute to a growing body of evidence suggesting that in vivo genotoxicity of hydroxyurea in sickle cell anemia appears to be low. Taken together with the clinical benefits demonstrated in the BABY HUG study, hydroxyurea may be considered as a potential therapeutic option for even very young, asymptomatic children with sickle cell anemia.”

Dr. McGann will present this study in an oral presentation on Sunday, December 11, at 4:45 p.m. PST at the Manchester Grand Hyatt in the Elizabeth Ballroom AB.

Pre-Operative Transfusion Reduces Serious Adverse Events in Patients with Sickle Cell Disease (SCD): Results From the Transfusion Alternatives Preoperatively in Sickle Cell Disease (TAPS) Randomised Controlled Multicentre Clinical Trial [Abstract 9]

New research suggests that patients with sickle cell disease (SCD) who are undergoing surgeries, such as abdominal surgery or tonsillectomy, should receive pre-operative transfusions to reduce their risk of post-operative complications.

Pre-operative blood transfusions have been used to reduce complications after surgery for SCD patients, who are at higher risk for these complications than the general population. However, some experts have claimed that the procedure is not necessary and can be safely omitted prior to surgery.

To assess the clinical benefit of pre-operative transfusion to alleviate perioperative complications in patients with SCD undergoing low- or medium-risk surgery, researchers embarked on the multicenter, randomized controlled Transfusion Alternatives Pre-Operatively in Sickle Cell Disease (TAPS) Trial. The trial was carried out between November 2007 and March 2011 at 22 clinical sites in the United Kingdom, the Netherlands, and Canada.

Patients with HbSS and HbSβ0 thalassemia, the most severe forms of SCD, were randomized to two treatment arms: patients in Arm A received no pre-operative transfusion, while patients in Arm B received a pre-operative blood transfusion. The primary outcome of the TAPS trial was the proportion of patients experiencing any significant complication between the time of randomization into the study and 30 days post-surgery.

Of the 333 patients screened for the trial, 70 had been randomized into treatment arms at the time of study termination. The trial was closed early because of an excess of serious adverse events (SAEs), a subset of the reported complications, in the untransfused arm (Arm A). Thirty-three patients in Arm A and 34 patients in Arm B were included in the final analysis. Researchers found that 39 percent of patients who did not receive pre-operative transfusions experienced a perioperative complication, compared to 15 percent of patients in the transfusion arm. There was also a marked difference in the rate of SAEs between the two arms: 30 percent of patients who did not receive a transfusion experienced an SAE, compared to only 3 percent of patients who received one prior to surgery. The majority of SAEs reported were acute chest syndrome, a life-threatening complication of SCD.

“Currently, many patients with SCD are not receiving pre-operative transfusions, which may be putting them at risk of serious complications. The results from our trial demonstrate a striking increase in the total number of complications, both common and life-threatening, in patients who did not receive a blood transfusion before surgery,” said lead author Jo Howard, MD, Consultant Hematologist at Guy's and St. Thomas' Hospital in London. “Moving forward, we recommend that clinicians take these results into consideration when deciding on pre-operative transfusion and suggest that patients with HbSS and HbSβ0 thalassaemia receive a blood transfusion before surgery.”

The TAPS trial was sponsored and funded by NHS Blood and Transplant in the United Kingdom.

Dr. Howard will present this study in an oral presentation on Sunday, December 11, at 5:00 p.m. PST at the Manchester Grand Hyatt in the Elizabeth Ballroom AB.

1. Sickle Cell Disease: Data & Statistics. http://www.cdc.gov/ncbddd/sicklecell/data.html. Accessed October 18, 2011.

American Society of Hematology 53rd Annual Meeting

The study authors and press program moderator will be available for interviews after the press conference or by telephone. Additional press briefings will take place throughout the meeting on new treatment techniques for patients with bleeding and clotting disorders, targeted therapies for acute and chronic leukemia, improving recovery and outcomes in transplantation, and emerging treatments for lymphoma and myeloma. For the complete annual meeting program and abstracts, visit www.hematology.org/2011abstracts. Get up-to-date information about the annual meeting by following ASH on Twitter @ASH_hematology.

The American Society of Hematology is the world's largest professional society concerned with the causes and treatment of blood disorders. Its mission is to further the understanding, diagnosis, treatment, and prevention of disorders affecting blood, bone marrow, and the immunologic, hemostatic, and vascular systems by promoting research, clinical care, education, training, and advocacy in hematology. The official journal of ASH is Blood, the most cited peer-reviewed publication in the field, which is available weekly in print and online.

[7] Hydroxyurea Treatment of Young Children with Sickle Cell Anemia: Safety and Efficacy of Continued Treatment — the BABY HUG Follow-up Study

Zora R. Rogers, MD1, Billie Fish, CCRP2*, Zhaoyu Luo, PhD2*, Rathi V. Iyer, MD3, Courtney D. Thornburg, MD, MS4, Sharada A. Sarnaik, MD5, Sohail R. Rana, MD6*, Lori Luchtman-Jones, MD7, Sherron M. Jackson, MD8*, Thomas H. Howard, MD9, James F. Casella, MD10, R. Clark Brown, MD, PhD11, Ofelia A. Alvarez, MD12, Jonathan C. Goldsmith, MD13, Scott T. Miller, MD14 and Winfred C. Wang, MD15

1Pediatrics, University of Texas Southwestern Medical Center, Dallas, TX
2Clinical Trials & Surveys Corporation, Owings Mills, MD
3University of Mississippi, Jackson, MS
4Duke University Medical Center, Durham, NC
5Children’s Hosp. of Michigan, Detroit, MI
6Howard University, Washington, DC
7Children’s National Medical Center, Washington, DC
8Medical University of South Carolina, Charleston, SC
9University of Alabama at Birmingham, Birmingham, AL
10The Johns Hopkins University School of Medicine, Baltimore, MD
11Emory University / Children’s Healthcare of Atlanta, Atlanta, GA
12University of Miami School of Medicine, Miami, FL
13National Heart Lung and Blood Institute, NIH, Bethesda, MD
14SUNY – Brooklyn, Brooklyn, NY
15St. Jude Children’s Research Hospital, Memphis, TN

BABY HUG [Clinical Trials #NCT00006400], an NIH-NICHD sponsored randomized placebo-controlled trial, showed that hydroxyurea (HU) administered to 9- to 18-month-old children with sickle cell anemia (SCA) provides substantial clinical benefit. Benefits include a decrease in pain crises, acute chest syndrome events, need for transfusion, and hospital admission; hematologic improvements include higher total and fetal hemoglobin concentration, larger red cell size, and lower WBC counts, with toxicity limited to transient reduction in absolute neutrophil count (ANC) [Lancet 2011; 377:1663-72]. The parents or guardians of all 176 children who completed at least 18 months of randomized treatment were offered participation in an initial observational BABY HUG Follow-Up Study, and 163 (93%) consented to participate. Clinical and laboratory data were collected every 6 months by structured abstraction of the medical record regarding use of clinically prescribed HU (dose escalation recommended), blood counts, clinical imaging, and sickle cell-related events. At the time of enrollment the family did not know their child’s randomized study treatment assignment; 133 (82%) initially chose clinical prescription of open-label HU. Acceptance of HU has remained high through 36 months of follow-up; during each 6-month data collection period 68-75% of participants reported having taken HU.

Only 2 patients have left the study (due to relocation) and more than 93% of expected data have been collected. Preliminary analyses as of May 2011, including 417 patient years (pt-yrs) of follow up, demonstrate that in comparison to participants not taking HU, children who continue to take HU have statistically lower rates of pain crises requiring emergency department (ED) visits, episodic transfusions, and hospital admissions for any reason, including acute chest syndrome or febrile illness (see table). The substantial decrease in acute chest syndrome episodes is similar to the effect demonstrated with HU use in the randomized BABY HUG trial in younger infants and consistent with published trials detailing the benefit of HU therapy in older children and adults. The decrease in the rate of admission for febrile events in HU-treated patients is also comparable to that in the randomized trial, but the reason for this benefit is uncertain. There was no difference in hospitalization rates for painful events including dactylitis. Two patients in the non-HU group had a stroke. There were no differences between groups in the frequency of a palpable spleen or rate of acute splenic sequestration crises. Through 36 months of follow up children taking HU had persistently higher hemoglobin and MCV, and lower WBC and ANC than those not taking HU.

Results of these analyses, including growth and development assessments, will enhance our understanding of the impact of HU use in children with SCA starting at a very young age. The accruing data from the BABY HUG Follow-Up Study demonstrate a continuation of the substantial benefits of early HU therapy with no discernible additional toxicities. Ongoing follow-up of this cohort is essential to fully define these benefits as children grow, and to observe for late toxicity.

Disclosures: Off Label Use: Hydroxyurea is not indicated for treatment of children with sickle cell disease. Use of this medication was for clinical indications and not mandated by this observational study.

[8] Genotoxicity Associated with Hydroxyurea Exposure in Infants with Sickle Cell Anemia: Results From the BABY-HUG Phase III Clinical Trial

Patrick T. McGann, MD1, Jonathan M Flanagan, PhD2, Thad A Howard, MS2*, Stephen D Dertinger, PhD3*, Jin He, MD4*, Anita S Kulharya, PhD5*, Bruce W Thompson, PhD6* and Russell E. Ware, MD, PhD2

1Texas Children’s Hospital Hematology Center, Baylor College of Medicine, Houston, TX
2Baylor College of Medicine, Houston, TX
3Litron Laboratories, Rochester, NY
4St. Jude Children’s Research Hospital, Memphis, TN
5Pathology, Georgia Health Sciences University, Augusta, GA
6Clinical Trials and Surveys Corp., Owings Mills, MD

The laboratory and clinical benefits of hydroxyurea therapy for children with sickle cell anemia (SCA) are well recognized, but treatment in young patients is limited in part by concerns about long-term genotoxicity, and specifically possible carcinogenicity. To date, few prospective data have been available to assess the mutagenic and carcinogenic potential of hydroxyurea in young patients with SCA. The Pediatric Hydroxyurea Phase III Clinical Trial (BABY HUG) was a multi-center double-blinded placebo-controlled randomized clinical trial (NCT00006400) testing whether hydroxyurea could prevent chronic organ damage in very young patients with SCA. BABY HUG was conducted across 14 centers and was approved by the local institutional review boards of all participating centers. A total of 193 infants (mean age 13.6 months) with SCA (HbSS or HbS/β0-thalassemia) were randomized to receive hydroxyurea (fixed dose of 20 mg/kg/day) or placebo for two years. An important secondary objective of the study was the in vivo measurement of acquired genotoxicity using three laboratory assays: chromosomal karyotype, including quantitation of chromosomal breaks, chromatid breaks, and fusion events; illegitimate VDJ recombination events, representing inversion events on chromosome 7 with juxtaposition of the T-cell receptor Vγ and Jβ gene loci; and micronucleated reticulocyte formation, signifying aberrant erythroid production. Subjects in both the hydroxyurea and placebo groups had significantly increased numbers of total chromosome breaks and similar numbers of chromatid breaks at study exit compared to study entry. However, at study exit, subjects with hydroxyurea exposure had similar numbers of chromosome and chromatid breaks as subjects receiving placebo (0.5 ± 1.4 chromosome breaks per 100 metaphases vs. 0.4 ± 2.5, p=NS; 0.6 ± 1.1 chromatid breaks per 100 metaphases vs. 0.8 ± 3.4, p=NS).
There were no changes in the number of illegitimate VDJ recombination events observed, comparing study entry and exit samples either in the hydroxyurea or the placebo treatment group. At study exit, subjects treated with hydroxyurea had similar numbers of illegitimate VDJ recombination events as subjects receiving placebo (0.7 ± 0.5 events per µg of DNA versus 0.7 ± 0.4 events, p=NS). Subjects treated with hydroxyurea had a similar number of early reticulocytes containing micronuclei at study exit compared to subjects receiving placebo (0.3 ± 0.2% versus 0.3 ± 0.2%, p=NS). Together, these data indicate that hydroxyurea treatment in very young patients with SCA was not associated with any significant increases in genotoxicity compared to placebo treatment. These data provide evidence of cytogenetic stability in this susceptible population of young children and contribute to a growing body of evidence to suggest that in vivo genotoxicity of hydroxyurea in patients with SCA appears to be low.

Disclosures: Dertinger: Litron Laboratories: Employment.

[9] Pre-Operative Transfusion Reduces Serious Adverse Events in Patients with Sickle Cell Disease (SCD): Results From the Transfusion Alternatives Preoperatively in Sickle Cell Disease (TAPS) Randomised Controlled Multicentre Clinical Trial

Jo Howard, MB, BChir, MRCP, FRCPath1*, Moira Malfroy, RN2*, Llewelyn Charlotte, PhD2*, Louise Choo, PhD3*, David Rees, FRCPath4*, Isabeau Walker, FRCA5*, Tony Johnson, PhD3*, Louise Tillyer, FRCPath6*, Karin Fijnvandraat, MD, PhD7, Melanie Kirby-Allen, MD8*, Renate Hodge, MSc2*, Shilpi Purohit2*, Sally C. Davies, FRCP, FMedSci9 and Lorna M Williamson, FRCPath2*

1Haematology, Guy’s and St Thomas’ NHS Foundation Trust, London, United Kingdom
2NHSBT/MRC Clinical Studies Unit, NHS Blood and Transplant
3Clinical Trials Unit, Medical Research Council
4Haematology, King’s College Hospital NHS Foundation Trust
5Great Ormond Street Hospital NHS Trust
6Haematology, Royal Brompton and Harefield NHS Foundation Trust
7Academic Medical Center (AMC), Amsterdam, Netherlands
8Hospital for Sick Children Toronto
9Department of Health, London

Introduction: The rate of complications after surgery is increased in patients with Sickle Cell Disease (SCD), and pre-operative blood transfusion has historically been used to decrease this risk. Observational studies and one limited Randomised Controlled Trial (RCT) have suggested that in some patients, transfusion can safely be omitted. Since transfusion is associated with complications including alloimmunisation and increased post-operative infections, we performed an RCT to address whether overall peri-operative complications in SCD are reduced by pre-operative transfusion.

Methods: TAPS was a Phase III multicentre, pragmatic, randomised controlled trial with a parallel group sequential superiority design, carried out between November 2007 and March 2011 at 22 sites in the UK, Netherlands and Canada. Eligible patients had HbSS or HbSβ0thal, were aged one year or more and were having low risk (eg adenoidectomy, dental surgery) or medium risk (eg joint replacement, cholecystectomy, tonsillectomy) elective surgery. Patients were excluded if they had a haemoglobin (Hb) <6.5g/dl, had received a blood transfusion within the last 3 months or had severe SCD. Patients were randomly assigned to Arm A, which received no pre-operative transfusion, or Arm B, which received a top-up transfusion if Hb <9g/dl or a partial exchange if Hb ≥9g/dl. Sites followed their own standards for all other aspects of peri-operative care, although guidance was provided. The primary outcome was all significant complications between randomisation and 30 days post surgery as defined in the protocol. These were sent blinded to the End-Point Review Panel for final classification. Complications which were life-threatening or resulted in death or persistent or significant incapacity/disability and other important medical events were also recorded as Serious Adverse Events (SAEs) and were reviewed by an Independent Data Monitoring Committee (IDMC). Due to a major imbalance in the number of SAEs between treatment groups, the trial was terminated early following an IDMC recommendation.

Results: 333 patients were screened for the trial and 70 patients were randomised at the time the trial was terminated. Thirty-three completed 30-day follow-up in Arm A and 34 in Arm B. Both groups were comparable with respect to age, gender, severity of SCD, type of surgery and baseline Hb. Only 13 patients had low risk surgery. The pre-operative (post-transfusion) Hb was higher in Arm B (9.7g/dl vs 7.7g/dl) and 5 patients in Arm B received partial exchange transfusion with a mean pre-operative HbS% of 47.2%. There were no differences in peri-operative management, including fluid support and oxygen therapy, between the two groups.

There were 11 SAEs (33%) in patients who did not receive a pre-operative transfusion, compared to only 1 SAE (3%) in patients who did receive a top-up transfusion or partial exchange. Eleven of the SAEs were Acute Chest Syndrome (ACS). Patients in the no pre-operative transfusion group also had more significant complications (13/33, 39%), which included SAEs, as compared to patients in the top-up/exchange group (5/34, 15%).

Type of surgery: 58% of patients underwent abdominal or ENT surgery. Four of 13 patients (31%) who had abdominal surgery in Arm A had ACS events, compared to none of 10 patients in Arm B. Of the 9 patients who had tonsillectomy in Arm A, 3 had ACS events (33%), compared to none of 7 patients in Arm B.

Discussion: This RCT has shown a large increase in SAEs in un-transfused patients with HbSS and HbSβ0thal having low and moderate risk surgery. In particular there was a striking increase in ACS, a potentially life-threatening complication. We therefore recommend that pre-operative transfusion should be strongly considered for patients with HbSS and HbSβ0thal undergoing moderate risk surgery, in particular abdominal surgery and tonsillectomy. There was no evidence of increased benefit of exchange transfusion over top-up, although numbers were small, and exchange transfusions should be reserved for patients with a Hb >9g/dl. There is insufficient evidence to reach a conclusion on the role of pre-operative transfusion in other types of surgery or in patients with other sickle genotypes. Pre-operative transfusion in these patients should be decided on a case-by-case basis.

Acknowledgement: submitted on behalf of the TAPS Trial Investigators.

Disclosures: No relevant conflicts of interest to declare.


Gene Inheritance Patterns Influence Age Of Diagnosis In BRCA Families

Women who inherit the cancer genes BRCA1 or BRCA2 from their paternal lineage may get a diagnosis a decade earlier than those women who carry the cancer genes from their mother and her ancestors, according to a new study by researchers at the North Shore-LIJ Health System’s Monter Cancer Center in Lake Success, NY. The findings were reported on Thursday, Dec. 8, at the San Antonio Breast Cancer Symposium.

Iuliana Shapira, MD, director of cancer genetics at North Shore-LIJ, and her colleagues conducted a retrospective review of 130 breast or ovarian cancer patients with BRCA1 or BRCA2 mutations. They chose only those patients for whom the parent of origin was known; in other words, the researchers could follow the family tree to see where the breast cancer gene originated. Some families had their own genetic tests done; for others, it was a matter of following the family pedigree.

As expected, a person had a 50-50 chance of inheriting a mutant BRCA gene from whichever parent's branch carried the mutation; it is an autosomal dominant mutation. Looking at the family maps revealed some surprising findings. Contrary to the notion that BRCA mutations are associated most commonly with Ashkenazi Jews, the scientists found that the mutations were also present in families of Irish and Jamaican descent. “No one had ever conducted a study to look at the parent-of-origin effects,” said Dr. Shapira. “Genetic diseases may display parent-of-origin effects. In such cases, the risk depends on the specific parent-of-origin allele. Cancer penetrance in mutation carriers may be determined by the parent of origin of the BRCA mutation.”

They analyzed 1,889 consecutive breast cancer (BrCa) or ovarian cancer (OvCa) patients (1,753 breast + 136 ovarian) presenting for treatment at the Monter Cancer Center between 2007 and 2010. In 130 patients with BRCA1 or BRCA2 mutations, the parent of origin for the mutation was known. Of these 130 patients, two had paternally inherited mutations in both BRCA1 and BRCA2 and were excluded from the analysis. Of the breast cancer patients, 28 had paternal and 29 had maternal BRCA1 mutations, and 24 had paternal and 21 had maternal BRCA2 mutations. Of the ovarian cancer patients, six had paternal and 10 had maternal BRCA1 mutations; seven had paternal and three had maternal BRCA2 mutations.

In carriers of BRCA mutations, the mean age at diagnosis was 51 for ovarian cancer (range 21-70) and 43 for breast cancer (range 24-78). But when the researchers compared the mean age at diagnosis for maternal versus paternal inheritance, they were surprised to find that among breast cancer patients with maternal BRCA1 inheritance, the average age at diagnosis was around 45, whereas women with paternal BRCA1 inheritance were diagnosed around age 38. For breast cancer with maternal BRCA2 inheritance, the average age at diagnosis was 50, compared to 41 for those with paternal BRCA2 inheritance.

There was no significant difference between paternal and maternal age of ovarian cancer diagnosis of BRCA1 or BRCA2 mutations.

“If this observation is duplicated in larger cohorts, the results will have important implications for recommendations on surgical risk reduction in BRCA mutation carriers,” said Dr. Shapira. “That would mean that doctors might consider watching and waiting in young women with BRCA mutations inherited from the mother's family, and being more aggressive in young women who inherited the mutation from their father's side.”


Two Species Per Day Discovered In Mekong Jungle

The diversity of life in the Mekong River region of Southeast Asia, which includes portions of China, Myanmar, Laos, Thailand, Cambodia and Vietnam, is so astonishing that a new species is found every two days, according to various media reports.
Two hundred and eight new species were discovered during the last year alone, including a multi-colored gecko and a black-and-white snub-nosed monkey with an “Elvis” hairdo.
The region is also home to some of the world´s most endangered species, including tigers, Asian elephants, Mekong dolphins and Mekong giant catfish, explains the environment-defending World Wildlife Fund (WWF).
“This is a region of extraordinary richness in terms of biodiversity but also one that is extremely fragile,” Sarah Bladen, communications director for WWF Greater Mekong, told the Associated Press (AP). “It´s losing biodiversity at a tragic rate.”
Among the new finds of the last year are a lizard that reproduces via cloning without the need for males, a fish that resembles a gherkin, and five species of carnivorous pitcher plant, some of which lure in and consume animals as large as rats and birds.
“Mekong governments have to stop thinking about biodiversity protection as a cost and recognize it as an investment to ensure long-term stability,” Stuart Chapman, Conservation Director of WWF Greater Mekong, said in a recent press release.
“The region´s treasure trove of biodiversity will be lost if governments fail to invest in the conservation and maintenance of biodiversity, which is so fundamental to ensuring long-term sustainability in the face of global environmental change.”
The extinction of the Javan rhino in Vietnam, recently confirmed by WWF, is one tragic indicator of the decline of biodiversity in the region. The Mekong’s wild places and wildlife are under extreme pressure from rapid, unsustainable development and climate change.
Despite restrictions, trade in wildlife remains an active threat to a range of endangered animals in the region, with some hunted because their body parts (rhinoceros horn being one example) are coveted ingredients in traditional Asian medicine, reports Elaine Lies for Reuters.
Others, such as Mekong dolphins, face threats from fishing gear such as gill nets and illegal fishing methods, prompting the WWF in August to warn that one dolphin population in the river was at high risk of extinction.
The WWF is calling on the six leaders from the Greater Mekong Sub-region (GMS) meeting next week in Myanmar to put the benefits of biodiversity, and the costs of losing it, at the center of decision-making and regional cooperation.


WISE Images Supernova’s Rose

About 3,700 years ago, people on Earth would have seen a brand-new bright star in the sky. It slowly dimmed out of sight and was eventually forgotten, until modern astronomers later found its remains, called Puppis A. In this new image from NASA’s Wide-field Infrared Survey Explorer (WISE), Puppis A looks less like the remains of a supernova explosion and more like a red rose.
Puppis A (pronounced PUP-pis) was formed when a massive star ended its life in a supernova, the most brilliant and powerful form of explosion in the known universe. The expanding shock waves from that explosion are heating up the dust and gas clouds surrounding the supernova, causing them to glow and appear red in this infrared view. While much of the material from that original star was violently thrown out into space, some of it remained in an incredibly dense object called a neutron star. This particular neutron star (too faint to be seen in this image) is moving inexplicably fast: over 3 million miles per hour. Astronomers are perplexed by its extraordinary speed and have nicknamed the object the “Cosmic Cannonball.”
Some of the green-colored gas and dust in the image is from yet another ancient supernova — the Vela supernova remnant. That explosion happened around 12,000 years ago and was four times closer to us than Puppis A.
The colors in this image represent different wavelengths of infrared light that humans can’t see with their eyes.
JPL manages and operates the Wide-field Infrared Survey Explorer for NASA’s Science Mission Directorate, Washington. The principal investigator, Edward Wright, is at UCLA. The mission was competitively selected under NASA’s Explorers Program managed by the Goddard Space Flight Center, Greenbelt, Md. The science instrument was built by the Space Dynamics Laboratory, Logan, Utah, and the spacecraft was built by Ball Aerospace & Technologies Corp., Boulder, Colo. Science operations and data processing take place at the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.

Image Caption: About 3,700 years ago people on Earth would have seen a brand-new bright star in the sky. As it slowly dimmed out of sight, it was eventually forgotten, until modern astronomers found its remains — called Puppis A.


Increased Ice Loss Resulted In Greater Greenland Bedrock Lifting

An unusually intense 2010 melting season accelerated ice loss in southern Greenland, causing sizable portions of the island’s bedrock to rise as much as a quarter of an inch more than usual, an Ohio State University (OSU) researcher said on Friday.

According to an OSU press release, Michael Bevis, Ohio Eminent Scholar in Geodynamics and professor in the OSU School of Earth Sciences, said that 50 GPS stations spread across the coast of Greenland normally “detect uplift of 15 mm (0.59 inches) or more, year after year. But a temperature spike in 2010 lifted the bedrock a detectably higher amount over a short five-month period — as high as 20 mm (0.79 inches) in some locations.”

Those comments came during a presentation by Bevis, who serves as the principal investigator for the Greenland GPS Network (GNET), at the American Geophysical Union (AGU) meeting in San Francisco.

He also addressed what implications the findings could have in relation to climate change, saying that “pulses of extra melting and uplift imply that we’ll experience pulses of extra sea level rise… The process is not really a steady process.”

In a December 9 article, UPI also reported that Bevis believes the uplift was the result of accelerated ice loss in the region, noting that the southern part of Greenland lost an extra 100 billion tons of ice due to the above-average conditions.

“Really, there is no other explanation. The uplift anomaly correlates with maps of the 2010 melting day anomaly,” he told the news organization. “In locations where there were many extra days of melting in 2010, the uplift anomaly is highest.”

Bevis’ colleagues in the research were Abel K. Brown, Eric C. Kendrick, Jason E. Box, Dana John Caccamise, Hao Zhou, Jian Wang, and Terry J. Wilson, all from the OSU School of Earth Sciences, as well as John M. Wahr of the University of Colorado; Shfaqat Abbas Khan, Finn Bo Madsen, and Per Knudsen of the Danish Technical University; Michael J Willis of Cornell University; Tonie M. van Dam and Olivier Francis of the University of Luxembourg; Bjorn Johns, Thomas Nylen, and Seth White of UNAVCO, Inc, in Boulder, Colorado; Robin Abbott of CH2M HILL Polar Services; and Rene Forsberg of the Space Institute in Denmark, the university said in its press release.

GNET is a project sponsored by the National Science Foundation (NSF). The AGU 2011 Fall Meeting was held from December 5 through December 9 in San Francisco, California.

Image 2: The 2010 Uplift Anomaly (green arrows), superimposed on a map showing the 2010 Melting Day Anomaly (shaded in red), which was produced by R. Simmon of the NASA Earth Observatory using data provided by M. Tedesco. Courtesy of Ohio State University.


Working Nights is Linked to Type 2 Diabetes

(Ivanhoe Newswire) — Women who work one night on, one night off, or a day shift and then a night shift, a pattern known as rotating night shift work, could be putting themselves at risk for type 2 diabetes. Furthermore, the longer a woman works such shifts, the more likely she is to gain weight. The new study sheds light on a potential public health risk, since a large portion of the working population is involved in some kind of permanent night or rotating night shift work.

The authors used data from the Nurses’ Health Study I (NHS I, established in 1976, which included 121,704 women) and the Nurses’ Health Study II (NHS II, established in 1989, which included 116,677 women). In NHS I, 6,165 women developed type 2 diabetes, and in NHS II, 3,961 did. Using statistical models, the authors found that the duration of rotating night shift work was strongly associated with an increased risk of type 2 diabetes in both cohorts, and that the risk increased with the number of years spent working rotating shifts. However, these associations were slightly weaker after the authors took other factors into consideration.
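As a rough sanity check on the scale of these cohorts, the crude cumulative incidence implied by the counts reported above can be computed directly. This is only raw arithmetic on the reported numbers, not the study’s adjusted statistical models, which account for follow-up time and other factors:

```python
# Back-of-envelope check, not the study's adjusted models: crude cumulative
# incidence of type 2 diabetes implied by the cohort counts reported above.

cohorts = {
    "NHS I":  (6165, 121704),   # cases, women enrolled (established 1976)
    "NHS II": (3961, 116677),   # cases, women enrolled (established 1989)
}

for label, (cases, women) in cohorts.items():
    pct = 100 * cases / women
    print(f"{label}: {pct:.1f}% developed type 2 diabetes")
```

This works out to roughly 5.1% for NHS I and 3.4% for NHS II; the study’s actual risk estimates compare shift-work durations within each cohort rather than the cohorts against each other.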

Although these findings need to be confirmed in men and in other ethnic groups, they suggest that additional preventive strategies for rotating night shift workers should be considered.

“Recognizing that rotating night shift workers are at a higher risk of type 2 diabetes should prompt additional research into preventive strategies in this group,” the authors were quoted as saying.

“Some modifications to shift work itself might also be feasible. Rotating shift work comprises a range of alternative schedule patterns, such as backward- and forward-rotating shift systems, and the proportion of night and early morning shifts varies. Future studies should address these variations and identify patterns that minimize [type 2 diabetes] risk, ideally through large-scale randomized trials that would provide insights into causality,” the authors concluded.

SOURCE: PLoS Medicine, published online December 2011.

Women May Be Able To Smell STDs On Men, Study Claims

A new study appearing in the December 6 edition of the Journal of Sexual Medicine suggests that women might be able to tell whether or not a man has a sexually transmitted disease (STD) based on his smell.

According to JoNel Aleccia of MSNBC.com’s Vitals column, the study — led by Mikhail Moshkin, professor at the Institute of Cytology and Genetics in Novosibirsk, Russia — found that gonorrhea-infected men smelled “putrid” to the young women asked to participate in the study.

“The off-putting scent may be subtle, more a chemical warning than a blast of body odor, but it definitely has an effect, according to the experiment conducted by Moshkin and his colleagues,” she added.

MyHealthNewsDaily Staff Writer Rachael Rettner says that Moshkin’s team collected armpit sweat from 34 Russian men between the ages of 17 and 25. Of the subjects, 13 had gonorrhea, 5 had suffered from the STD at one time but had since recovered, and the other 16 were healthy.

Each man wore a T-shirt with cotton pads in the armpits for one hour. The scientists then placed the pads in glass vials and had 18 healthy women smell the vials, rate the pleasantness of the odor on a scale of one to 10 (with 10 being the most pleasing smell), and choose an adjective from a list — including “putrid,” “floral,” “minty” and “fruity” — to describe the smell.

“The women rated the infected men’s sweat as less than half as pleasant as the healthy men’s sweat,” Rettner wrote. “And the women said about 50 percent of men who had gonorrhea had sweat that smelled ‘putrid,’ whereas only 32 percent of the healthy men were described as putrid. And while 26 percent of the healthy men smelled ‘floral,’ just 10 percent of those with gonorrhea were described that way.”

“The researchers speculated that the men’s immune systems might be involved because they found a link between the concentration of disease-fighting proteins called antibodies in the men’s saliva and how pleasant their sweat smelled to women: the higher the antibody concentration, the lower the score,” she continued, adding that the researchers said that those suffering from STDs shouldn’t worry, because the body odor caused by the condition “can be improved by deodorants.”


CDC Links Raw Flour to 2009 E. Coli Outbreak

A 2009 E. coli outbreak that affected 77 people across 30 states may have been caused by raw flour used as an ingredient in ready-to-cook cookie dough, according to a new report published in the journal Clinical Infectious Diseases.

According to Jeannine Stein of the Los Angeles Times, the investigation, which was conducted by the Centers for Disease Control and Prevention (CDC), involved analyzing records and interviewing patients who were infected by the disease-causing agent.

The researchers matched each of the patients with control subjects suffering from intestinal illnesses unrelated to E. coli, and both groups were asked to fill out questionnaires about the types of food they had eaten before falling ill, Stein added. Among the most frequently eaten foods was a specific brand of cookie dough, samples of which had tested positive for the bacteria.

Which specific ingredient was to blame? According to Stein, the researchers — including the CDC’s Dr. Karen Neil and other experts from both the federal disease control and prevention organization as well as from state health departments — still don’t know.

“After ruling out the likelihood of being caused by factors such as food handling, safety violations or intentional contamination, the study authors considered ingredients like molasses, unpasteurized eggs, sugar, margarine, chocolate chips and baking soda. But each of those was also ruled out,” she wrote on Friday.

“That pretty much left flour, which, the authors noted, is a raw product,” Stein added. “Although no conclusive evidence was found to pin the illnesses on flour, they made the case that flour is purchased in large quantities and could have been distributed to a number of lots. Also, it’s not processed to kill pathogens.”

In a December 9 press release discussing the investigation, the Infectious Diseases Society of America said that Neil and colleagues came to two main conclusions. First, they recommended that cookie dough manufacturers try to reformulate their product to make it safe to eat before cooking. Second, they called for increased consumer education about the dangers of eating unbaked goods out of the package.

Neil and her associates also said that foods containing raw flour “should be considered as possible vehicles of infection of future outbreaks of STEC,” and suggested that manufacturers switch to heat-treated or pasteurized flour in all products labeled ready to cook or ready to bake, so that the contents will be safe to eat out of the package even if consumers ignore label warnings against such practices.

“Eating uncooked cookie dough appears to be a popular practice, especially among adolescent girls, the study authors note, with several patients reporting that they bought the product with no intention of actually baking cookies,” the press release added. “Since educating consumers about the health risks may not completely halt the habit of snacking on cookie dough, making the snacks safer may be the best outcome possible.”


Law Enforcement Key To Great Ape Survival

A recent study shows that, over the last two decades, areas with the greatest decrease in African great ape populations are those with no active protection from poaching by forest guards.

Recent studies show that the populations of African great apes are rapidly decreasing. Many areas where apes occur are scarcely managed and weakly protected. Researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, have carried out an international collaborative project together with field researchers and park managers. The project aim was to evaluate how the lack of conservation effort influences the extinction risk of African great apes. Records were collected over the last 20 years from 109 resource management areas. The researchers found that the long-term presence of local and international non-governmental organization support and of law enforcement guards are the most crucial factors affecting ape survival, and that they have a clear measurable impact. Conversely, national development, often cited as a driver of conservation success, and high human population density had a negative impact on the likelihood of ape survival.

For the protection of natural resources, and in particular to fight wildlife population decline, it is fundamental to implement measures that are truly effective. How well a particular conservation activity reduces extinction risk of a species has rarely been quantified in comparison to other types of conservation efforts over a long-term and large spatial scale. Such a quantitative comparison is important in helping to direct conservation strategies that will mitigate and reverse the recent decline in many African great ape populations.

This study provides a continent-wide assessment of the relative significance of four different types of conservation efforts: law enforcement guards, tourism, research and non-governmental organization support. Their effects on the survival of chimpanzees, bonobos and gorillas were assessed over twenty years (1990-2009) in 109 African resource management areas located in 16 countries in East, West and Central Africa. Along with these data, environmental and anthropogenic variables were included, as well as recent records of ape status.

The study confirmed unequivocally that prolonged conservation efforts lead to a measurable decrease in the probability of apes going extinct, and the longer they last, the lower the probability. “The results confirm and prove quantitatively that the most influential risk for ape disappearance is the lack of law enforcement guards, rather than the absence of tourism and research, which nevertheless remain activities with a measurable positive impact,” says researcher Sandra Tranquilli of the MPI for Evolutionary Anthropology. “Furthermore, ape persistence is positively influenced by the presence of non-governmental organization support.”

“Remaining wilderness areas are disappearing at a rate which is unimaginable for most people. If we want to preserve some of these places for the future, we need many more studies of this type” says conservation group leader Hjalmar Kuehl of the MPI for Evolutionary Anthropology. “These studies help to better understand which conservation measures are more efficient and into which conservation activities the limited available resources should best be invested. This information will help to increase the effectiveness of conservation measures by maximizing the return on invested resources”.

“This is an excellent example of evidence-based conservation research, where conservation activities and strategies are evaluated quantitatively,” says researcher Fiona Maisels of the Wildlife Conservation Society. “Our findings will ensure the best use of the limited human and financial resources available, particularly in terms of effective law enforcement on the ground”.

“The survival of African Great Apes depends on law enforcement,” says DR Congolese researcher Fidèle Amsini of Frankfurt Zoological Society. “These efforts also depend on the availability of funds from donors and the support of country agencies.”

This study measures for the first time on a continental scale the relative impact of conservation efforts and shows how crucial their long-term presence is for African great ape existence. In addition to the application of an evidence-based approach, the authors recommend a continuous monitoring program of population trends and threats to ensure the long-term persistence of ape populations.

This work was supported by Arcus Foundation, the Max Planck Society and US Fish and Wildlife Service, and the data was housed in the IUCN official A.P.E.S. database. It was conducted in collaboration with members of the following NGOs, universities and national parks: AGRECO, African Wildlife Foundation, Fauna and Flora International, Federal University of Technology (Akure, Nigeria), Garamba National Park (Democratic Republic of Congo), Ghana PADP II-LTS International, Ghana Wildlife Society, Great Ape Trust, Institut National pour l’Environment et la Conservation de la Nature (Burundi), IUCN/SSC Primate Specialist Group, Kalinzu Forest Project, Support for Conservation of Bonobos, Tshuapa-Lomami-Lualaba Project, University of Amsterdam (Amsterdam, The Netherlands), University of California (Davis, USA), University College London (London, UK), University of Ghana (Legon, Ghana), University of Stirling (Stirling, UK), University of Kyoto (Kyoto, Japan), University of Melbourne (Melbourne, Australia), West African Primate Conservation Action, West Chester University (Pennsylvania, USA), Wild Chimpanzee Foundation, Wildlife Conservation Society, World Wide Fund for Nature, Zoological Society of London.

Reference: Lack of conservation effort rapidly increases African great ape extinction risk. Conservation Letters, December 8, 2011; doi: 10.1111/j.1755-263X.2011.00211.x

Image Caption: A park guard measures gorilla dung in Kahuzi-Biega National Park. © A. Plumptre/Wildlife Conservation Society


Is Generosity The Key To A Healthy Marriage?

Is there an answer to the question, “What makes a happy marriage?” The answer may lie in how generous spouses are to each other. Do you make your spouse a cup of coffee, order flowers or give a backrub? Then you may find yourself in a long-lasting, stable relationship.

A new study by the National Marriage Project at the University of Virginia revealed couples who reported a high amount of generosity in their relationship were five times more likely to say their marriage was “very happy,” compared with those who reported a low amount of generosity, reports Rachael Rettner for MSNBC.

When a person is generous to his or her spouse, “The underlying message is, you’re valuable, you’re important,” said Dr. Anthony Castro, an assistant professor of psychiatry at the University of Miami Miller School of Medicine, who was not involved in the study.

The best odds of successfully combining marriage and parenthood come from a mix of newer and more “institutional” marriage values, the report explains. The newer, “soul-mate” model of marriage includes shared housework, good sex, marital generosity, date nights and having a college degree.

The “institutional” model of marriage includes shared religious faith, commitment, the support of friends and family, a sound economic foundation provided by a good job, and quality family time.

Researcher W. Bradford Wilcox, an associate professor of sociology at the University of Virginia, explains that generosity works best if you give your spouse something he or she likes: “[It’s] signaling to your spouse that you know them, and are trying to do things for them that are consistent with your understanding of them.”

But if, for example, your spouse delights in almond mochas, and you get her black coffee instead, it might not be very helpful, Wilcox said.

Based on the responses, the researchers compiled a list of the top five predictors of a very happy marriage. For men and women, sexual satisfaction ranked first, followed by level of commitment (a sense of “we-ness”), generosity and a positive attitude toward raising children, reports the University of Virginia Today.

For women, the fifth factor was above-average social support from friends and family, and for men, the fifth factor was spirituality within a marriage.

Compare these values to the 1960s and 1970s, when many couples engaged in a more individualistic approach to marriage, Wilcox said. “But that didn’t work out so well, as illustrated by the divorce revolution. By contrast, this report finds that in today’s marriages both wives and husbands benefit when they embrace an ethic of marital generosity,” he said.

Co-author Marquardt said, “One of the striking findings of this report is that equality in shared housework has emerged as a predictor of marital success for today’s young married parents, even as most married mothers would prefer to work part-time and most married fathers would prefer to work full-time.”

“Every individual situation is different,” Castro emphasizes. A couple may find themselves falling into the 14 percent of couples who are very happy without a high level of generosity. “Each specific relationship needs to be thought about individually, depending on both individual and partners’ needs.”


Robot Aircraft Teach Themselves Which Way Is Up


Australian vision scientists today unveiled a novel way to help pilotless aircraft accurately determine their heading and orientation to the ground – by imitating how insects do it.

The technology can improve the navigation, flight characteristics and safety of civil and military aircraft, as well as pilotless drones, says Mr Richard Moore, a researcher at The Vision Centre and The Queensland Brain Institute at the University of Queensland.

“UAVs (unmanned aerial vehicles or pilotless aircraft) are used in crop dusting, bushfire monitoring, tracking algal blooms or crop growth and infrastructure inspection as well as defense roles,” he says. “Some of these tasks require the aircraft to fly close to the ground and amongst obstacles, so it is crucial that the aircraft knows its heading direction and roll and pitch angles accurately.”

While there are other sensors such as magnetometers, gyroscopes, and accelerometers that can help the aircraft determine its heading and orientation, they suffer from problems such as noise-induced drift, and can be adversely affected by the motions of the aircraft or materials in the environment surrounding the sensors, Mr Moore explains.

“This means that UAVs can’t perform significant maneuvers without losing their sense of direction for a while.”

To provide real-time guidance for UAVs, the researchers have designed a vision-based system that provides aircraft with the same advantage that insects have — a fixed image of the sky and the horizon.

“If you watch a flying insect, you will see that their heads are upright when they turn their bodies,” Mr Moore says. “Keeping their heads still allows them to have a stabilized image of the horizon and the sky, which is crucial in determining their heading.”

In the new system, the aircraft uses two back-to-back fisheye lenses to capture a wide-angle view of the environment. It then divides the image into the sky and ground regions using information such as the brightness or color combinations. The orientation of the sky and ground regions allows the aircraft to determine its roll and pitch angles with respect to the horizon.

“Using its estimated orientation, the aircraft can then generate a panoramic image of the horizon, and use it as a reference,” Mr Moore explains. “The aircraft can then determine its heading direction continuously throughout the flight by producing an instantaneous horizon panorama and comparing it with the reference image.”
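The pipeline described above (threshold-based sky/ground segmentation, then matching an instantaneous horizon panorama against a stored reference) can be sketched in a few lines. This is a hedged toy illustration, not the researchers' implementation: it works on a hypothetical one-dimensional, 360-column "panorama" of brightness values and recovers heading as the circular shift that best aligns the current sky/ground labeling with the reference.

```python
# Toy sketch of the described pipeline, not the authors' code: classify
# panorama columns as sky/ground by brightness, then recover heading as the
# circular shift that best aligns the current labeling with a stored reference.

import numpy as np

def sky_mask(brightness, threshold=0.5):
    """Label columns brighter than the threshold as sky (True)."""
    return brightness > threshold

def heading_offset(reference, current):
    """Return the circular shift (in columns) that best aligns current
    with reference, scored by the number of matching sky/ground labels."""
    best_shift, best_score = 0, -np.inf
    for s in range(len(reference)):
        score = np.sum(reference == np.roll(current, s))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# Hypothetical panorama: 360 columns, one per degree; sky is a contiguous arc.
ref = np.zeros(360, dtype=bool)
ref[40:220] = True                      # reference sky region
rotated = np.roll(ref, -25)             # aircraft has yawed 25 degrees
print(heading_offset(ref, rotated))     # prints 25: the yaw is recovered
```

A real system would of course segment two-dimensional fisheye images, fit roll and pitch from the horizon line, and update the reference continuously as the article describes; the brute-force shift search here stands in for that matching step.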

Although a similar vision-based approach has been proposed previously, this new system improves visual performance by enabling the aircraft to learn directions by itself.

“This system doesn’t need any programming before take-off, unlike earlier ones that required lots of offline training: researchers had to manually compute the differences between the sky and the ground, then feed them into the system.

“With the new system, we only have to tell the aircraft that it’s in the upright position when it starts flying. It will then use that as a starting point to work out which is sky and which is ground, and train itself to recognize the differences.

“This is important because if the aircraft relies solely on the prior training, it will be in trouble once it’s in an unfamiliar environment. The self-learning ability allows the system to keep a record of what it ‘sees’, update its reference base continuously and be adaptive.”

The group performed a closed-loop flight test of the new system in which the aircraft was commanded to perform a series of 90-degree turns for four minutes.

“The tests indicate that the aircraft can estimate its heading much more accurately with a visual compass, compared to other navigation systems like magnetic compasses and gyroscopes,” says Mr Moore.

“The ability to estimate the precise roll and pitch angles and the heading direction instantaneously is crucial for UAVs, as small errors can lead to misalignments and crashes.”

Mr Moore presented the paper “A method for the visual estimation and control of 3-DOF Attitude for UAVs” today at the Australasian Conference on Robotics and Automation 2011.

The Vision Centre is funded by the Australian Research Council as the ARC Centre of Excellence in Vision Science.


Bee Behavior Mimics Brain Neuron Function

A new study of bees has come to the conclusion that bee swarm communication works similarly to that of neurons in the human brain.
The study, published in the December 9 issue of Science, found that bees use inhibitory “stop” signals to prohibit the scout bees from completing a waggle dance that helps bees learn the directions of competing sites for new hives. This behavior helps to ensure that the best homesite is found for the hive.
Thomas Seeley, a biologist from Cornell University, said this behavior is “analogous to how the nervous system works in complex brains. The brain has similar cross inhibitory signaling between neurons in decision-making circuits.”
To study this behavior, the researchers set up swarms, one at a time, on an island off the coast of Maine that was devoid of natural nesting cavities. After setting out two identical nesting boxes, they labeled the scout bees from each site with a different paint color. They then videotaped the scout bees performing the waggle dance, using microphones and the video footage to determine when marked dancers received stop signals and which bees delivered them.
The team observed that the stop signals came from scouts that were marked at the other site.
Co-author P. Kirk Visscher of the University of California, Riverside, said, “The message the sender scout is conveying to the dancer appears to be that the dancer should curb her enthusiasm, because there is another nest site worthy of consideration. Such an inhibitory signal is not hostile. It’s simply saying, ‘Wait a minute, here’s something else to consider, so let’s not be hasty in recruiting every bee to a site that may not be the best one for the swarm.’ All the bees have a common interest in choosing the best available site.”
According to the press release, once the bees decide to swarm and move to a new nesting site, the message of the stop signal changes. Visscher says, “Apparently at this point, the message of the stop signal changes, and can be thought of as, ‘Stop dancing, it is time to get ready for the swarm to fly.’ It is important for the scouts to be with the swarm when it takes off, because they are responsible for guiding the flight to the nest site.”


Binge Drinking Linked To Sexual Assault Risk For College Freshmen

A new study, to be published in the January issue of the Journal of Studies on Alcohol and Drugs, has linked binge drinking with the risk of sexual assault among women in their first year of college.

The study, led by scientists at the University at Buffalo, followed 437 young women from their high school graduation through their freshman year at a college or university.

According to a press release from the school, the researchers discovered that of women who did not drink heavily during high school, nearly half of them said that they had engaged in “heavy episodic drinking — commonly called binge drinking — at least once by the end of their first college semester.”

Those who were already binge drinkers in high school continued drinking at a similar rate during their freshman year of university, and of those young women who engaged in binge drinking of at least four to six drinks, 25% of them said that they had been sexually victimized during the fall semester — which could entail anything from unwanted sexual contact to rape, they added.

“The more alcohol those binges involved, the greater the likelihood of sexual assault,” they added. “Of women who’d ever consumed 10 or more drinks in a sitting since starting college, 59 percent were sexually victimized by the end of their first semester.”

“Though young women are not to blame for being victimized — that fault lies squarely with the perpetrator — if colleges can make more headway in reducing heavy drinking, they may be able to prevent more sexual assaults in the process,” they added.

“This suggests that drinking-prevention efforts should begin before college,” lead researcher Maria Testa, a senior scientist at the University at Buffalo's Research Institute on Addictions, said in a statement.

Testa emphasized that parents should talk to their kids about drinking before they leave for college, regardless of whether they believe those teens were active drinkers in high school, and should continue to emphasize responsible alcohol consumption even after their children go to college.


America’s Health Rankings Released

The United Health Foundation's annual America's Health Rankings report was released Tuesday, ranking Vermont the healthiest US state for the fifth straight year, thanks in part to its high rate of education and low incidence of infectious disease.
Rounding out the top five healthiest states are New Hampshire, Connecticut, Hawaii and Massachusetts, respectively. New Jersey just missed the top 10, ranking 11th healthiest, and New York came in at number 18. New Jersey showed the most substantial improvements over the last year, jumping up six spots on the list.
Despite the good news for those states, others did not fare so well.
Idaho and Alaska both showed the biggest downward movement over the past year, the United Health Foundation noted in its report. The least healthy state, according to the report, was Mississippi, followed by Louisiana, Oklahoma, Arkansas and Alabama.
Overall, the country saw no improvement in health from 2010 to 2011, despite three consecutive years of gains before that. Past rankings have shown an average improvement of 0.5 percent per year from 2000 to 2010, and an annual improvement of 1.6 percent through the 1990s.
The 2011 report showed such a “dramatic” increase in obesity and diabetes that it canceled out other improvements, including gains from people quitting smoking. 2011 was also the first year in which every state reported that at least 20 percent of its population was obese.
Experts estimate by 2030, more than half of Americans will be obese, CBS News reported.
“It is very easy to get access to a $1 cheeseburger, 24 hours a day, 7 days a week,” Dr. Reed Tuckson, chief of medical affairs for UnitedHealth Group, told Fox News. “It's much more difficult to get access to a $1 tomato.”
The report also showed an increase in the obesity-related disease diabetes, up from 8.3 percent in 2010 to 8.7 percent in 2011.
But not all the findings were bleak. Fewer Americans died from heart disease in 2011. And fewer Americans smoke now. The number of preventable hospitalizations also dropped across the country.
“While this year's Rankings shows some important improvements, we also see some very alarming trends – particularly diabetes and obesity – that, left unchecked, will put further strain on our country's already strained health care resources,” Dr. Tuckson said in a statement. “At a time when the nation, states and individual families are grappling with tightening budgets and growing health care expenses, this year's Rankings sends a loud wakeup call that the burden of preventable chronic disease will continue to get worse unless we take urgent action.”
The report, published online, has tracked US health for the past two decades by evaluating 23 factors, including smoking, binge drinking, diabetes, high school graduation, immunization, prenatal care and obesity.
The report is published by the United Health Foundation, American Public Health Association and Partnership for Prevention. It pulls data from sources including the Census Bureau, Centers for Disease Control and Prevention and the US Department of Education.
Read more at: http://www.americashealthrankings.org/

More Human Intelligence Comes With A Price

When researchers asked why we are not more intelligent, given the adaptive pressures of evolution, they concluded that too much intelligence would create problems of its own.
Thomas Hills of the University of Warwick and Ralph Hertwig of the University of Basel looked at a wide range of studies including drug studies of Ritalin patients, studies of people with autism, debilitating synaesthesia and neural disorders linked with enhanced brain growth.
People with attention disorders take Ritalin to improve their ability to pay attention, whereas people who take the drug without needing it often end up worse off than if they had taken nothing. This suggests there may be a limit to how much attention is useful.
Hills says, “This makes sense if you think about a focused task like driving, where you have to pay attention, but to the right things – which may be changing all the time. If your attention is focused on a shiny billboard or changing the channel on the radio, you're going to have problems.”
Enhanced memory also has its problems. Medicalxpress.com notes that post-traumatic stress disorder is an example in which people can't stop remembering an awful episode. According to Hills, “if something bad happens, you want to be able to forget it, to move on.”
Even an above-average IQ could turn out to be troublesome. The researchers studied a population of Ashkenazi Jews whose IQs are higher than those of the general European population. It turns out that this population also has a high occurrence of Tay-Sachs disease, which affects the nervous system; an increase in brain function may thus come with an increase in disease.
According to Hills, a 'supermind' may never exist outside the pages of science fiction. The research suggests that if the mind evolves beyond its limits, the increased cognitive ability has consequences for other parts of the brain. Hills says, “If you have a specific task that requires more memory, more speed or more accuracy or whatever, then you could potentially take an enhancer that increases your capacity for that task. But it would be wrong to think that this is going to improve your abilities all across the board.”
The study, titled “Why Aren't We Smarter Already: Evolutionary Trade-Offs and Cognitive Enhancements,” can be found in Current Directions in Psychological Science.


The Physics of Rainbows

Somewhere over the rainbow there are laws of physics that were poorly understood—until now. A team of computer scientists at the Jacobs School of Engineering at UC San Diego has recently found a new way to simulate rainbows using computer graphics. In the process, they also elucidated how some types of rainbows form.
Image Credit: UCSD Jacobs

Importance Of Echocardiography To Evaluate Cardio Toxicity In Cancer Patients

One study presented at the meeting, which is being held in Budapest, Hungary, 7 to 10 December, reports on an initiative using echocardiography to document early warning signs of adverse effects from trastuzumab (Herceptin®)¹, while the other uses echocardiography to evaluate the protective role of ACE inhibitors and statins on the hearts of cancer patients².

“These studies open the way for the early identification of myocardial damage at the subclinical level, thereby allowing clinicians to identify patients who might benefit from either changes in cancer therapy or the delivery of protective treatments,” says EAE president Dr Luigi Badano, from the University of Padua, Italy.

Echocardiography is already widely used to evaluate cardiotoxicity, but the most commonly used parameter, left ventricular ejection fraction (LVEF), only identifies myocardial damage that has already occurred and fails to detect the early, subtle alterations in left ventricular function that predict future functional decline.

Newer cancer therapies have improved the survival of patients with cancer and, in some cases, turned cancer into a chronic disease. The result is that patients are now surviving long enough for the adverse cardiovascular effects of some cancer therapies to become apparent. The anthracyclines and related compounds are the most frequently implicated agents, but other treatments such as 5-Fluorouracil, its prodrug capecitabine, and trastuzumab, a tumor-specific antibody, have also been associated with cardiotoxicity. Currently it is estimated that 17% of patients have to stop cancer therapy due to adverse effects on their hearts.

The cardiotoxic effects of cancer treatments encompass a heterogeneous group of disorders, says Dr Helder Dores, from Santa Cruz Hospital/São Francisco Xavier Hospital, Lisbon, Portugal. “They range from relatively benign arrhythmias and hypertension to potentially lethal conditions such as thromboembolism, myocardial infarction and cardiomyopathy with symptomatic heart failure.”

Cardiotoxicity can be acute, appearing in the first 10 days of treatment, or late, experienced 15 to 20 years afterwards, as sometimes found in survivors of childhood cancers. While the damage is well documented, the mechanisms are incompletely understood. “They appear to be multifactorial, with the production of oxygen free radicals considered the main cause of morphological alterations,” explains Liliana Radulescu, from Municipal Hospital, Cluj-Napoca, Romania.

In October 2011 the European Association of Echocardiography (EAE) announced that it is working with the American Society of Echocardiography (ASE) and the American Society of Clinical Oncology (ASCO) to issue joint recommendations on the usefulness of echocardiographic evaluations in cancer patients, expected to be published in 2012. “The document should lay down guidance for the frequency of assessment with different chemotherapy agents, and also identify when patients should stop treatment or be prescribed protective treatments,” says Badano.

Study reveals early signs of myocardial damage

In the first study, Helder Dores and colleagues set out to identify early warning signs of adverse cardiac effects in women treated with trastuzumab for breast cancer. In the study, 51 consecutive women, enrolled for treatment between May and September 2010, were assessed at baseline with echocardiography and then again at three months.

The investigators found that within the first three months no patients presented with overt signs of heart failure or significant deterioration of left ventricular systolic function, although almost one-fifth developed impaired ventricular relaxation. Impaired ventricular relaxation occurs when pressure reduction in the left ventricle does not happen as fast as normal, leading to abnormalities in the heart’s ability to fill properly.

“Patients with impaired ventricular relaxation are known to be at higher risk for progression to advanced stages of cardiac dysfunction (both systolic and diastolic), making it important for these patients to be subject to more frequent evaluations both during and after therapy,” says Dores.

Further studies are now needed, he says, to assess whether impaired ventricular relaxation occurs in larger populations of patients prescribed trastuzumab. “We need studies identifying the women who go on to develop overt cardiac dysfunction to see whether we can more accurately determine predictors of these adverse events at an earlier stage of treatment,” says Dores.

ACE inhibitors and statins deliver cardioprotection

In the second study, Liliana Radulescu and colleagues used Doppler echocardiography to investigate whether the ACE inhibitor lisinopril and the statin rosuvastatin might confer a cardioprotective effect on patients treated with anthracyclines for a range of malignancies.

“While the exact mechanism of anthracycline-related cardiotoxicity is not fully understood, animal studies have pointed to oxidative stress and inflammation. Both ACE inhibitors and statins are known to play an important role in reducing oxidative stress and inflammation at the level of the heart muscle cells,” says Dr Andreea Parv.

In the prospective study, left ventricular ejection fraction and LV diastolic function were compared between a study group of 26 patients treated with the anthracycline epirubicin who also received the cardioprotective treatments lisinopril 10 mg and rosuvastatin 10 mg, and a control group of 31 gender- and age-matched patients who received epirubicin without accompanying cardioprotective treatment.

Results show that, in comparison with patients receiving cardioprotection, the patients who received no protection showed further deterioration of LV diastolic function, calculated as the ratio of early diastolic filling velocity (E) to filling velocity after atrial contraction (A), the E/A ratio (p<0.02).

“This is the first human prospective study documenting the cardioprotective effect of lisinopril and rosuvastatin in anthracycline-induced cardiotoxicity,” says Radulescu. Further studies, she adds, are now needed in larger numbers of patients, exploring a range of different types of malignancies.


Video Game Players Advancing Genetic Research

Users of game designed by McGill researchers contributing to analysis of DNA sequences
Thousands of video game players have helped significantly advance our understanding of the genetic basis of diseases such as Alzheimer's, diabetes and cancer over the past year. They are the users of a web-based video game developed by Dr. Jérôme Waldispuhl of the McGill School of Computer Science and collaborator Mathieu Blanchette. Phylo is designed to allow casual game players to contribute to scientific research by arranging multiple sequences of colored blocks that represent human DNA. By looking at the similarities and differences between these DNA sequences, scientists are able to gain new insight into a variety of genetically-based diseases.
The researchers are releasing the results computed from the solutions collected over the last year today, together with an improved version of Phylo for tablets.
Over the past year, Phylo's 17,000 registered users have been able to simply play the game for fun or choose to help decode a particular genetic disease. “A lot of people said they enjoyed playing a game which could help to trace the origin of a specific disease like epilepsy,” said Waldispuhl. “There's a lot of excitement in the idea of playing a game and contributing to science at the same time,” Blanchette agreed. “It's guilt-free playing; now you can tell yourself it's not just wasted time.”
Waldispuhl and his students came up with the idea of using a video game to solve the problem of DNA multiple sequence alignment because it is a task that is difficult for computers to do well. “There are some calculations that the human brain does more efficiently than any computer can. Recognizing and sorting visual patterns fall in that category,” explained Waldispuhl. “Computers are best at handling large amounts of messy data, but where we require high accuracy, we need humans. In this case, the genomes we're analyzing have already been pre-aligned by computers, but there are parts of it that are misaligned. Our goal is to identify these parts and transform the task of aligning them into a puzzle people will want to sort out.”
So far, it has been working very well. Since the game was launched in November 2010, the researchers have received more than 350,000 solutions to alignment sequence problems. “Phylo has contributed to improving our understanding of the regulation of 521 genes involved in a variety of diseases. It also confirms that difficult computational problems can be embedded in a casual game that can easily be played by people without any scientific training,” Waldispuhl said. “What we're doing here is different from classical citizen science approaches. We aren't substituting humans for computers or asking them to compete with the machines. They are working together. It's a synergy of humans and machines that helps to solve one of the most fundamental biological problems.”
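The alignment puzzle Phylo gamifies can be illustrated with a toy "sum-of-pairs" scorer: aligned columns earn points for matching letters and pay penalties for mismatches and gaps, so sliding blocks to line up shared motifs raises the score. The weights and function below are illustrative assumptions for this sketch, not the game's actual scoring rules.

```python
# Toy sum-of-pairs scoring for a multiple sequence alignment.
# Weights are illustrative only (not Phylo's real parameters).
MATCH, MISMATCH, GAP = 1, -1, -2

def alignment_score(rows):
    """Score equal-length aligned sequences column by column,
    summing over every pair of rows in each column."""
    score = 0
    for col in zip(*rows):
        for i in range(len(col)):
            for j in range(i + 1, len(col)):
                a, b = col[i], col[j]
                if a == "-" and b == "-":
                    continue  # two gaps in one column cost nothing here
                elif a == "-" or b == "-":
                    score += GAP
                elif a == b:
                    score += MATCH
                else:
                    score += MISMATCH
    return score

# Shifting a block so shared motifs line up improves the score:
poor = alignment_score(["ACGT--", "--ACGT"])   # motifs offset
good = alignment_score(["ACGT", "ACGT"])       # motifs aligned
```

A player's moves amount to searching for the gap placement that maximizes such a score, which is exactly the part computers find hard to optimize globally.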
With the new games and platforms, the researchers are hoping to encourage even more gamers to join the fun and contribute to a better understanding of genetically-based diseases at the same time.


SETI Back On Track After US Military Funding

The Search for Extraterrestrial Intelligence (SETI) is back on track after receiving funding from the U.S. Air Force Space Command.

The U.S. Air Force has given SETI the funds it needs to restart its search for extraterrestrial life.

SETI plans to check out the potentially habitable exoplanets recently discovered by NASA’s Kepler space telescope to determine if they might be home to an alien civilization.

“This is a superb opportunity for SETI observations,” said Jill Tarter, the Director of the Center for SETI Research, in a statement issued yesterday. “For the first time, we can point our telescopes at stars, and know that those stars actually host planetary systems – including at least one that begins to approximate an Earth analog in the habitable zone around its host star. That’s the type of world that might be home to a civilization capable of building radio transmitters.”

The U.S. Air Force Space Command helped fund SETI because it said it is interested in using the organization’s detection instruments for “space situational awareness.”

NASA announced that its Kepler spacecraft has discovered many exoplanets orbiting other stars. It found one planet, known as Kepler-22b, that it described as Earth’s “twin”.

Kepler-22b orbits a Sun-like G-type star about 600 light-years away, at a distance that could allow liquid water to exist on its surface.

SETI said that work at the Allen Telescope Array (ATA) has been made possible thanks to the interest and generosity of the public and the U.S. Air Force.

The Air Force said SETI’s ATA could be handy in picking up transmissions from spacecraft, which could help the existing military Space Surveillance Network keep an eye on where they are.

The ATA is a set of 42 radio dishes located about 300 miles northeast of San Francisco.  It began scanning the skies for “technosignatures,” which are electromagnetic signals that could hint at the presence of an intelligent alien civilization.

Image Caption: The Allen Telescope Array against a rising Milky Way. Credit: SETI


Researchers Discover Patterns Of Genes Associated With Timing Of Breast Cancer Recurrences

Knowing biology of early or late recurring tumors could help extend survival by identifying interventions to delay or prevent recurrences after tamoxifen
An international research team led by Georgetown Lombardi Comprehensive Cancer Center has found biological differences in hormone-receptor positive breast cancer that are linked to the timing of recurrence despite endocrine therapy.
They say their findings, presented at the 2011 CTRC-AACR San Antonio Breast Cancer Symposium, may help oncologists find ways to individualize systemic therapy to delay or prevent recurrences, and to avoid excessive treatment of patients who will never recur.
“We found that, at the time of diagnosis, there are clear biological differences within the supposedly uniform group of hormone receptor positive breast cancers, and these differences distinguish subtypes relative to the time at which they recur,” says Minetta Liu, M.D., director of translational breast cancer research at Georgetown Lombardi Comprehensive Cancer Center.
“We need to exploit these differences and use our data to figure out what drives a tumor to never metastasize. Then we will try to manipulate the cancers that are programmed to recur to act like that of the non-recurrences,” she says.
Tamoxifen is credited with saving the lives of thousands of women with estrogen receptor-positive (ER+) breast cancer, which accounts for two-thirds of all diagnoses of invasive breast cancer in the United States. As the world’s leading breast cancer treatment and prevention drug, tamoxifen can stave off cancer recurrences for more than 10 years in some patients, but for others, the cancer returns much earlier.
To determine why some ER+ cancers treated with tamoxifen recur earlier rather than later, if at all, Liu and her Georgetown team collaborated with researchers at the University of Edinburgh and with engineers at Virginia Tech.
The Scottish collaborators shared high quality tumor biopsies collected from patients with different stages of breast cancer before they had started tamoxifen therapy. Critical clinical information was available to determine whether or not patients developed metastatic disease, and when the recurrence (if any) was found. The samples were processed and analyzed at Georgetown. Then scientists at Virginia Tech examined the gene expression patterns generated from the tumor biopsies relative to the known clinical outcomes to develop a predictive model of early, late or no disease recurrence.
The final analysis revealed distinct patterns in cancers that recurred early (up to three years from diagnosis) or late (more than ten years from diagnosis). Liu says that some of the genes that were identified were “expected and reassuring,” but others were “unexpected and novel.” Work is ongoing to validate selected genes as biological drivers of metastasis.
“Endocrine therapy and chemotherapy are not without toxicity,” Liu says. “The ability to predict which patients will recur early in their treatment course can lead to more appropriate recommendations for adjuvant chemotherapy. It might also identify those women who would benefit most from studies using investigational agents to enhance the effects of tamoxifen or aromatase inhibitors.”


Molecular Differences May Be Used To Predict Breast Cancer Recurrence In Early Vs. Late Hormone Receptor-Positive Breast Cancer

Researchers may have discovered a series of genes that will help predict whether or not a woman with hormone receptor-positive invasive breast cancer will experience early, late or no recurrence of her disease.

Minetta C. Liu, M.D., associate professor of medicine and oncology and director of translational breast cancer research at Georgetown Lombardi Comprehensive Cancer Center, presented the findings at the 2011 CTRC-AACR San Antonio Breast Cancer Symposium, held Dec. 6-10, 2011.

“There are clear biological differences within the supposedly unified group of hormone receptor (HR)-positive breast cancers, and these differences distinguish subtypes relative to the time at which they recur,” Liu said. “Understanding what drives these distinctions will allow us to tailor treatment and improve patient outcomes.”

Women with HR-positive breast cancer are frequently treated with tamoxifen, which is credited with saving the lives of hundreds of thousands of women. Although tamoxifen prevents or delays cancer recurrence in many women, some will recur 10 years or more from their original diagnosis. Until now, the molecular basis for this recurrence pattern was unknown.

Liu and colleagues, in collaboration with investigators from the University of Edinburgh, evaluated high-quality frozen tumor samples obtained at the time of breast cancer diagnosis. These tissue samples were linked to data on treatment and clinical outcomes, allowing researchers to analyze gene expression patterns present before the initiation of any systemic therapy.

Together with engineers at Virginia Polytechnic Institute, Liu and colleagues identified significant gene expression patterns among the tumor samples. These patterns correlated strongly with the development of distant metastatic disease.

“We confirmed what many have already suspected,” said Liu. “There are biological drivers that define – at the time of tumor development – whether or not breast cancer will recur early, late or not at all. Now we need to validate these findings and take our knowledge to the next step.”

Liu hopes that this research can be used to help personalize treatment in day-to-day clinical practice. “Endocrine therapy and chemotherapy are not without toxicity,” she said. “The ability to predict which patients will recur early in their treatment course can lead to more appropriate recommendations for adjuvant chemotherapy. It might also identify those women who would benefit most from studies using investigational agents to enhance the effects of tamoxifen or aromatase inhibitors.”

She added: “At the other extreme are those patients with HR-positive tumors who recur long after completing five years of endocrine therapy. These are the patients for whom extended endocrine therapy and its related side effects are really worth it.”

The team’s next step is to validate their predictive model for the timing of recurrences on tamoxifen so that physicians and patients can make more informed decisions about the potential added benefits of adjuvant chemotherapy, extended endocrine therapy and involvement in clinical trials. They will also investigate combinations of molecular targets with the ultimate goal of delaying or preventing the development of metastatic breast cancer, Liu said.


Whole Genome Testing – The Power To Help, Hurt And Confuse

The era of widely available next generation personal genomic testing has arrived and with it the ability to quickly and relatively affordably learn the sequence of your entire genome. This would include what is referred to as the “exome,” your complete set of protein-coding sequences.

But as University of North Carolina at Chapel Hill medical geneticists point out, this avalanche of information also includes the totality of one’s genetic mutations and as such arrives with both promise and threats associated with its use.

James P. Evans, MD, PhD is the Bryson Distinguished Professor of Genetics and Medicine at UNC and is a member of the Lineberger Comprehensive Cancer Center. He is also editor-in-chief of Genetics in Medicine, the journal of the American College of Medical Genetics. “What you’re now dealing with is a real medical test, one that has the power to help, hurt and to confuse. I believe we need to think carefully about how to best use it and how that use should be regulated in order to maximize benefit and minimize harm,” he said.

In a commentary published in JAMA on Wednesday, Dec. 7, 2011, Evans and UNC co-author Jonathan S. Berg, MD, PhD, Lineberger member and assistant professor of genetics and medicine, argue that whole genome and whole exome sequencing technology “will routinely uncover both trivial and important medical results, both welcome and unwelcome … and presents the medical community with new challenges.”

“What we have had up until this point with direct-to-consumer genetic testing, despite all the hoopla, was arguably rather trivial from the standpoint of either benefits or threats. It was a fairly worthless technology because it really didn’t give people medically significant findings,” Evans said.

“Now we are entering an entirely different era due to the advent of robust sequencing technology. We have now the potential to tell people very real and important things about their genomes. Some of those things can be very useful and very welcome if acted upon in the right way, but some of that information may not be very welcome to people: being at high risk for an untreatable disease such as dementia, for example.”

As to regulation, Evans and Berg suggest that it need not be draconian but must be nuanced. “Basically, what we call for is that this new generation of medical testing be treated like other medical tests that involve complex medical information — and that there should be a reasonable expectation that an individual who gets it done has some relationship with a qualified care provider.”

That person doesn’t need to be a physician, Evans adds. “There are genetic counselors capable of dealing with this. But it must be a person not employed by the company or laboratory doing the testing since that invites egregious conflict of interest.”

As physicians pledged to avoid causing harm, the authors acknowledge the inevitable tension between paternalism and the reasonable protection of people. They point to three compelling arbiters of whether the acquisition of medical information should require a relationship with a healthcare professional: the information’s complexity, its ability to mislead and its potential for harm.

“The advent of next generation sequencing technology marks a threshold at which genomic testing easily meets these bars,” they state.

Image Caption: Two UNC experts write in a JAMA commentary that whole genome and whole exome sequencing technology “will routinely uncover both trivial and important medical results, both welcome and unwelcome … and presents the medical community with new challenges.” Credit: National Institute of General Medical Sciences


Researchers Assess Radioactivity Released From The Fukushima Dai-Ichi Nuclear Power Facility

With news this week of additional radioactive leaks from Fukushima nuclear power plants, the impact on the ocean of releases of radioactivity from the plants remains unclear. But a new study by U.S. and Japanese researchers analyzes the levels of radioactivity discharged from the facility in the first four months after the accident and draws some basic conclusions about the history of contaminant releases to the ocean.

The study, conducted by Woods Hole Oceanographic Institution chemist Ken Buesseler and two Japanese colleagues, Michio Aoyama of the Meteorological Research Institute and Masao Fukasawa of the Japan Agency for Marine-Earth Science and Technology, reports that discharges from the Fukushima Dai-Ichi nuclear power plants peaked one month after the March 11 earthquake and tsunami that precipitated the nuclear accident, and continued through at least July. Their study finds that the levels of radioactivity, while quite elevated, are not a direct exposure threat to humans or marine life, but cautions that the impact of accumulated radionuclides in marine sediments is poorly known.

The release of radioactivity from Fukushima–both as atmospheric fallout and direct discharges to the ocean–represents the largest accidental release of radiation to the ocean in history. Concentrations of cesium-137, an isotope with a 30-year half-life, peaked at the plants’ discharge point to the ocean at more than 50 million times normal levels, and concentrations 18 miles off shore were much higher than those measured in the ocean after the Chernobyl accident 25 years ago. This is largely because the Fukushima nuclear power plants are located along the coast, whereas Chernobyl was several hundred miles from the nearest salt water basins, the Baltic and Black Seas. However, due to ocean mixing processes, the levels are rapidly diluted off the northeast coast of Japan.

The study used publicly available data on the concentrations of cesium-137, cesium-134, and iodine-131 as a basis to compare the levels of radionuclides released into the ocean with known levels in the sea surrounding Japan prior to the accident. “Impacts of the Fukushima Nuclear Power Plants on Marine Radioactivity” is published in the latest issue of Environmental Science & Technology and is available on the journal’s website. Buesseler received funding support for this work from the Gordon and Betty Moore Foundation and the National Science Foundation’s Chemical Oceanography program.

The investigators compiled and analyzed data on concentrations of cesium and iodine in ocean water near the plants’ discharge point made public by TEPCO, the electric utility that owns the plants, and the Japanese Ministry of Culture, Sports, Science and Technology (MEXT). The team found that releases to the ocean peaked in April, a fact they attribute to “the complicated pattern of discharge of seawater and fresh water used to cool the reactors and spent fuel rods, interactions with groundwater, and intentional and unintentional releases of mixed radioactive material from the reactor facility.” They also found that the releases decreased in May by a factor of 1000, “a consequence of ocean mixing and a primary radionuclide source that has dramatically abated,” they report.

While concentrations of some radionuclides continued to decrease, by July they were still 10,000 times higher than levels measured in 2010 off the coast of Japan. This indicates the plants “remain a significant source of contamination to the coastal waters off Japan,” they report. “There is currently no data that allow us to distinguish between several possible sources of continued releases, but these most likely include some combination of direct releases from the reactors or storage tanks, or indirect releases from groundwater beneath the reactors or coastal sediments, both of which are likely contaminated from the period of maximum releases.”

Buesseler says that, at the levels indicated by these data, the releases are not likely to pose a direct threat to humans or marine biota in the surrounding ocean waters, but cautions that there could be concern if the source remains high and radiation accumulates in marine sediments. “We don’t know how this might impact benthic marine life, and with a half-life of 30 years, any cesium-137 accumulating in sediments or groundwater could be a concern for decades to come,” he said.
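The “decades to come” concern follows directly from cesium-137’s 30-year half-life. As a rough illustration (a generic exponential-decay calculation, not an analysis from the study itself), the fraction of any deposited cesium-137 still present after a given number of years can be sketched as:

```python
# Illustrative back-of-the-envelope calculation: fraction of cesium-137
# remaining after t years, given its roughly 30-year half-life.
HALF_LIFE_YEARS = 30.0

def fraction_remaining(t_years: float) -> float:
    """Exponential decay: N(t)/N0 = 0.5 ** (t / half-life)."""
    return 0.5 ** (t_years / HALF_LIFE_YEARS)

for years in (10, 30, 60, 90):
    print(f"After {years:3d} years: {fraction_remaining(years):.1%} remains")
# After 30 years half remains; even after 90 years, 12.5% is still present.
```

The slow decay means that even three half-lives from now, a measurable fraction of any cesium-137 trapped in sediments or groundwater would persist.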

In June, Buesseler led the first international, multidisciplinary assessment of the levels and dispersion of radioactive substances in the Pacific Ocean off the Fukushima Dai-Ichi nuclear power plants, a major research effort also funded by the Gordon and Betty Moore Foundation. During the research expedition, a group of 17 researchers and technicians spent two weeks aboard the University of Hawaii research vessel R/V Kaimikai-O-Kanaloa examining many of the physical, chemical, and biological characteristics of the ocean that determine the fate of radioactivity in the water and its potential impact on marine biota. The results of their initial assessments will be presented in Salt Lake City in February 2012 at the Ocean Sciences Meeting, an international gathering of more than 4,000 researchers sponsored by The Oceanography Society, the American Society of Limnology and Oceanography, and the American Geophysical Union.

While international collaborations for comprehensive field measurements to determine the full range of isotopes released are underway, it will take some time before results are available to fully evaluate the impacts of this accident on the ocean.

Image 1: A new study uses publicly available data to analyze levels of radioactivity released to the ocean as a result of the Fukushima accident. Map of sampling locations at the Fukushima Dai-ichi Nuclear Power Plant. Red dots north and south of the Dai-ichi NPP mark the discharge channels where samples were collected. TEPCO also collected samples at the Dai-ni NPP, with sampling from shore near the Dai-ni NPP and Iwasawa Beach (blue triangles). Also shown are MEXT sampling locations 30 km offshore (green squares). (Ken Buesseler, Woods Hole Oceanographic Institution)

Image 2: Marine chemist Ken Buesseler (left) and University of Hawaii technician Paul Balch make a final inspection of a water sampling rosette prior to deploying it in the Pacific in June 2011. (Photo by Ken Kostel, Woods Hole Oceanographic Institution)


Hubble’s 10,000th Scientific Paper Published

The NASA/ESA Hubble Space Telescope has passed another milestone in its almost 21 years of observations: the publication of the 10,000th refereed scientific paper based on Hubble data.

Alvaro Gimenez, Director of Science and Robotic Exploration for the European Space Agency, said: “Reaching the milestone of the 10,000th scientific paper reminds us that Hubble is one of the most successful scientific endeavors in history. European scientists have played a big part in this, and have been intimately involved with Hubble since before the telescope’s launch. Thanks to ESA’s partnership with NASA, Europe’s astronomers have made major contributions to our understanding of the Universe.”

European scientists are guaranteed at least 15% of the observing time on Hubble under the terms of ESA’s participation in this international project. Over the years, scientists in over 35 countries have engaged in Hubble research. The United States is responsible for the most papers published based on Hubble observations, followed by five ESA member countries: the United Kingdom, Germany, Italy, France and Spain.

The lead author of the 10,000th paper is Zach Cano of Liverpool John Moores University in the United Kingdom. He reports the identification of the faintest supernova yet associated with a long-duration gamma-ray burst, a type of outburst of high-energy radiation that follows the death of a star.

Since its launch in 1990, the Hubble Space Telescope has been serviced five times by astronauts, most recently in 2009. With recently overhauled equipment and a suite of new instruments, Hubble is now at the height of its powers, and is expected to continue operations into the second half of this decade at least.

Hubble’s successor, the NASA/ESA/CSA James Webb Space Telescope, is currently under construction and will launch later this decade.
