Distracted Driving Has Led To 16,000 Documented Deaths

U.S. researchers reported on Thursday that texting or talking on cell phones while driving killed 16,000 people from 2001 to 2007.

The research is one of the first scientific attempts to quantify how many people have died in accidents caused by cell phone distractions.

“Our results suggested that recent and rapid increases in texting volumes have resulted in thousands of additional road fatalities in the United States,” Fernando Wilson and Jim Stimpson of the University of North Texas Health Science Center wrote in the American Journal of Public Health.

The team used data from the National Highway Traffic Safety Administration on deaths attributable to distracted driving. 

“Since roughly 2001-2002, texting volumes have increased by several hundred percent,” Wilson told Reuters in a telephone interview. In 2002, 1 million texts were sent every month; this rose to 110 million in 2008.

“Since 2001 our model predicts that about 16,000 people have died since then that we attribute to the increase in texting volume in the United States.”

Wilson said texting and using smartphones that provide email access and other applications takes the problem to a whole new level.

The Transportation Department said in 2009 that U.S. traffic deaths hit their lowest level since the mid-1950s at 33,963.

However, for every additional 1 million cell phone subscribers, Wilson and Stimpson estimate a 19-percent rise in deaths caused by distracted driving.

“Distracted deaths as a share of all road fatalities increased from 10.9 percent to 15.8 percent from 1999 to 2008, and much of the increase occurred after 2005,” the researchers wrote.

“In 2008, approximately 1 in 6 fatal vehicle collisions resulted from a driver being distracted while driving,” the report said. It found 5,870 people died in accidents attributed to distracted driving.

Wilson and Stimpson found that cellphone ownership and the number of text messages sent rose over the same period.

Wilson told Reuters that 30 states have already banned texting while driving, and some cities and states require hands-free devices for drivers using mobile phones.

Secretary of Labor Hilda Solis said this week that the Occupational Safety and Health Administration and the U.S. Department of Transportation would work together to fight distractions.

However, Wilson told Reuters that he does not see a way to enforce such bans effectively.

“I guess a perfect solution would be installing cell phone jammers in every car but that is not going to happen,” Wilson told Reuters.

“Unlike drunk driving, where you have effective enforcement mechanisms, you don’t have that with texting,” he said. “The cop just has to get lucky and see you texting while driving.”


Causes of Death in Anemia Patients

(Ivanhoe Newswire) — Among patients receiving immunosuppressive therapy for severe aplastic anemia, a condition in which the bone marrow is unable to produce blood cells, shorter telomere length (telomeres are chromosome markers of biological aging) was clearly associated with a higher rate of relapse and lower overall survival.

Severe aplastic anemia is characterized by life-threateningly low blood cell counts, but the condition can be treated by bone marrow transplant or immunosuppressive drugs. In older patients, or when an appropriate donor is not available, immunosuppression can be effective, although relapse can occur, as can chromosome abnormalities in bone marrow cells that accompany cancers in the blood, a condition referred to as clonal evolution.

Researchers have identified certain cell abnormalities as risk factors in bone marrow failure. “Mutations in telomerase complex genes resulting in extremely short telomeres have been described in some patients with apparently acquired severe aplastic anemia,” the authors were quoted as saying.

The telomere, a structure at the end of a chromosome that shortens with each cell division, functions as a protective cap to prevent erosion of genomic DNA during cell division.

Phillip Scheinberg, M.D., of the National Institutes of Health, Bethesda, Md., and colleagues conducted a study to determine the effect of telomere attrition in acquired severe aplastic anemia by measuring telomere length prior to immunosuppressive therapy. The study included 183 patients with severe aplastic anemia who were treated from 2000 to 2008.

Of the patients studied, 104 responded to immunosuppressive therapy. The researchers found that there was no correlation between telomere length and the probability of response to the therapy. The response rate for patients in the first quartile (shortest telomere lengths) was 56.5 percent; in the second quartile, 54.3 percent; in the third quartile, 60 percent; and in the fourth quartile, 56.5 percent.

Additional analysis, however, demonstrated that telomere length was associated with relapse, clonal evolution, and mortality. The shorter the telomere, the greater the probability of disease relapse. The probability of clonal evolution was higher in patients in the first quartile (24.5 percent) than in quartiles 2 through 4 (8.4 percent).

“Survival between these two groups differed, with 66 percent surviving 6 years in the first quartile compared with 83.8 percent in quartiles 2 through 4,” the researchers were quoted as saying.

“In conclusion, our data show that in a cohort of patients with severe aplastic anemia receiving immunosuppressive therapy, telomere length was not associated with response but was associated with risk of relapse, clonal evolution, and overall survival.”

SOURCE:  Journal of the American Medical Association (JAMA), September 22/29, 2010.

Vitamin D Protects Against Endometrial Cancer

(Ivanhoe Newswire) — So far in 2010, 7,950 people have died from endometrial cancer and 43,470 new cases have been diagnosed. A new animal study suggests that obese women, who are at a higher risk for endometrial cancer, may be able to reduce their risk by taking vitamin D.

The study showed that 25 percent of obese mice given vitamin D supplements in their diet developed endometrial cancer, while 67 percent of obese mice not given vitamin D developed the disease. Vitamin D supplementation made no difference for mice of normal weight. All of the mice were already at an increased risk for cancer because each had lost one of its two copies of the PTEN tumor suppressor gene; loss of PTEN function is strongly associated with endometrial cancer.

“Vitamin D has been shown to be helpful in a number of cancers, but for endometrial cancer, our study suggests it protects only against cancer that develops due to obesity,” the study’s lead investigator, Leena Hilakivi-Clarke, PhD, a professor of oncology, was quoted as saying. “Still, if these results are confirmed in women, use of vitamin D may be a wonderfully simple way to reduce endometrial cancer risk.”

“Until further studies are done, I think the best advice for women concerned about their risk is to take vitamin D supplements or spend a few more minutes each week in the sun. This vitamin has shown many health benefits in addition to the promise suggested by our mouse study,” explains Dr. Leena Hilakivi-Clarke.

It is best for women to remain at a healthy weight, not only to prevent endometrial cancer but also for its many other health benefits. “However, since over 50% of women in the US are overweight or obese, and losing weight is difficult, other means are needed to prevent endometrial cancer in these women. One way is to use progesterone, but it increases breast cancer risk. Vitamin D supplements are likely to be safer than, for example, progesterone,” says Dr. Leena Hilakivi-Clarke.

Previous studies have shown that obesity increases a woman’s risk of endometrial cancer two to six times. Recent research published by the National Institutes of Health on the protective effects of vitamin D against endometrial cancer showed no overall benefit, but that study did not investigate whether vitamin D was effective in obese women, Dr. Hilakivi-Clarke says. For that reason, the researchers focused on weight and endometrial cancer.

They used the best animal model available to look at endometrial cancer: the PTEN knock-out mouse. “Loss of PTEN is a common event in endometrial cancer in women,” Hilakivi-Clarke says. The mice were divided into four groups: one group was fed a normal diet, another was fed a normal diet with vitamin D supplements, another was fed a high-fat diet, and the last group was fed a high-fat diet with vitamin D supplements.

The study showed that of the mice fed a high-fat diet, 67 percent developed endometrial cancer, while only 25 percent fed high-fat diets along with vitamin D developed it. “In the obese mice, vitamin D offered a very strong, very significant protective effect,” Hilakivi-Clarke says.

The researchers aren’t sure why vitamin D reduces the chance of endometrial cancer in obese mice, but they suspect the vitamin causes obese mice to produce less osteopontin, a protein that makes cancer more aggressive, and more E-cadherin, which stops cancer from metastasizing.

“But we really don’t know why dietary vitamin D works so well in our obese mice,” Hilakivi-Clarke says. “We are currently investigating the mechanisms, and we are hopeful that we can find an answer.”

SOURCE: Cancer Prevention Research, published online September 20, 2010

CDC Study Finds ‘Disturbingly High’ U.S. Obesity Rates

Despite government-led efforts to reduce the American obesity rate to 15% by 2010, a study published earlier this month in the International Journal of Obesity has found the actual percentage of obese adults in the U.S. is on the rise.

In fact, the study, which was written by officials with the Centers for Disease Control and Prevention (CDC), found that obesity rates among adults have risen from 13% in the early 1960s, to more than 30% in 1999, to 32% in men and 35% in women by 2007-2008. Those most recent figures are an increase of 5% in males and 2% in females from a previous study in 1999-2000.

Likewise, abdominal obesity, which according to Reuters Health is defined as a waistline of more than 35 inches for women and 40 inches for men, is on the rise. The male abdominal obesity rate increased from 39% in 1999-2000 to 44% in 2007-2008, while the female rate jumped from 56% to 62% over the same span, the news organization reported on Tuesday.

“The prevalence of obesity and abdominal obesity remains disturbingly high among adults in the United States, and our trend analysis shows that both may still be increasing among men,” Dr. Earl S. Ford of the CDC’s National Center for Chronic Disease Prevention and Health Promotion, and his colleagues wrote in the study, according to Reuters Health.

The CDC study analyzed the medical data of more than 23,000 adults of at least 20 years of age.

“The rising tide of obesity ‘has all but ruled out’ the chances that the U.S. will meet its 2010 goals,” Reuters Health said in its report. “In order to whittle U.S. obesity rates down to 15 percent, Ford and his team say, the average American would either need to consume 500 fewer calories a day, walk for nearly two additional hours a day, or burn off the equivalent amount of calories doing some other type of physical activity.”

According to the CDC website, obesity ranges are determined using the body mass index (BMI), which is calculated from height and weight. The CDC considers a person with a BMI of 25 to 29.9 to be overweight, while adults with a BMI of 30 or greater are considered clinically obese.
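The BMI calculation and the CDC cutoffs described above are simple to express in code. A minimal sketch in Python (the function names are my own, not part of any CDC tool):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def cdc_category(bmi_value: float) -> str:
    """Adult weight-status category per the CDC BMI cutoffs."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"

# A 95 kg adult who is 1.75 m tall has a BMI of about 31.0:
print(round(bmi(95.0, 1.75), 1), cdc_category(bmi(95.0, 1.75)))
```

For measurements in pounds and inches, the same index is conventionally computed as weight multiplied by 703 and divided by height squared.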


CERN Announces New LHC Discovery

Scientists working at the world’s largest atom smasher said Tuesday that they believe they have discovered a new phenomenon while trying to unravel the universe’s deepest secrets.

One of the detectors in the Large Hadron Collider (LHC) experiment indicated that “some of the particles are intimately linked in a way not seen before in proton collisions,” the European Organization for Nuclear Research (CERN) said on its website.

Physicist Guido Tonelli told his fellow scientists at a seminar that the new feature first appeared in analyses in mid-July.

“We have today submitted a paper to expose our findings to the wider (scientific) community,” he added. The seminar was held to present findings from the collider’s Compact Muon Solenoid (CMS) detector.

Tonelli, a physicist from Italy’s University of Pisa and spokesperson for the CMS detector, noted that during weeks of cross-checks and critical debate among the team, “we didn’t succeed to kill it.”

The phenomenon showed up as a “ridge-like structure” on computer mapping graphs based on data from billions of collisions in the $5.2 billion LHC.

The CMS, one of six experiments running around the particle accelerator, is designed to look for the elusive and so far theoretical Higgs Boson, nicknamed the “God Particle.” It is also used to try to shed light on dark matter, the mysterious invisible substance thought to make up about one-quarter of the universe.

“What we really hope to get is not just ideas, but how to test it,” MIT physicist Gunther Roland, one of the authors of the paper submitted for review, said during the seminar at CERN’s headquarters on the Geneva border.

Despite a positive response from their peers at CERN, the CMS team’s interpretation of the findings on Tuesday was strongly challenged during the meeting as scientists threw suggestions at one another.

“We are stating facts, facts that there is something that we have not seen before,” responded Tonelli, as they began the process of finding endorsement and an explanation for the observation.

Image Caption: Image of a 7 TeV proton-proton collision in CMS producing more than 100 charged particles.


Community Health Workers Can Effectively Manage Children With Malaria And Pneumonia

Press release from PLoS Medicine

Community health workers can safely and effectively provide integrated management of pneumonia and malaria to communities by dispensing amoxicillin to children with non-severe pneumonia and artemether-lumefantrine to children with malaria (after using rapid diagnostic tests). Furthermore, these activities result in a significant increase in the proportion of appropriately-timed antibiotic treatment for non-severe pneumonia and in a significant decrease in inappropriate use of antimalarials. These are the results of a study by Kojo Yeboah-Antwi of the Boston University School of Public Health, USA, and colleagues, published in this week’s PLoS Medicine.

The authors conducted their study in Zambia where 3125 children with fever and/or fast breathing were managed by community health workers over a 12-month period. Community health workers were matched and randomly allocated to the intervention arm (in which community health workers performed rapid diagnostic tests, treated rapid diagnostic test-positive children with the anti-malarial drug, artemether-lumefantrine, and treated children with non-severe pneumonia with amoxicillin) and the control arm (in which community health workers did not perform rapid diagnostic tests, treated all febrile children with artemether-lumefantrine and referred those with signs of pneumonia to the health facility, as per the Zambian Ministry of Health policy).

A significant proportion of children managed in the intervention arm [68.2% (247/362)] received appropriately-timed antibiotic treatment for non-severe pneumonia compared to 13.3% (22/203) in the control arm. There was also a significant decrease in inappropriate use of antimalarials when treatment was based on the results of rapid diagnostic tests. In the intervention group 27.5% (265/963) of children with fever received malaria treatment compared to 99.1% (2066/2084) of children in the control group.
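As a quick sanity check, the headline proportions quoted above can be recomputed from the raw counts given in the release (illustrative arithmetic only):

```python
def pct(events: int, total: int) -> float:
    """Percentage of children, rounded to one decimal place."""
    return round(100 * events / total, 1)

# Antimalarial use among febrile children (intervention vs. control):
print(pct(265, 963))    # 27.5 (treatment guided by rapid diagnostic tests)
print(pct(2066, 2084))  # 99.1 (presumptive treatment of all fevers)

# Appropriately timed antibiotics for non-severe pneumonia, intervention arm:
print(pct(247, 362))    # 68.2
```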

The authors conclude: “The capacity of [community health workers] to use [rapid diagnostic tests], artemether-lumefantrine and amoxicillin to manage both malaria and pneumonia at the community level is promising and has the potential to reduce over usage of artemether-lumefantrine as well as to provide early and appropriate treatment to children with non-severe pneumonia.”


Cyber-Bullying Harder On Victims Than Physical Violence

U.S. researchers reported on Tuesday that cyber-bullying may be even harder on the victims than physical beatings or name-calling.

The team at the National Institutes of Health found that cyber-bullies seem to be less depressed than their prey, unlike traditional bullies.

Jing Wang, Tonja Nansel and Ronald Iannotti of the NIH’s National Institute of Child Health and Human Development analyzed data from an international survey from 2005/2006 that included 4,500 U.S. preteens and teens.

The teens were asked about feelings of depression, irritability, grouchiness and ability to concentrate, and also if they had been hit, name-called, shunned or sent negative messages through a computer or cell phone.

“Unlike traditional bullying which usually involves a face-to-face confrontation, cyber victims may not see or identify their harasser,” Iannotti’s team wrote in the Journal of Adolescent Health.

“As such, cyber victims may be more likely to feel isolated, dehumanized or helpless at the time of the attack.”

Physical and verbal bullies are also often depressed. However, the NIH team found that while there was little difference in depression between physical bullies and their targets, cyber-bullying victims reported significantly higher levels of depression than the cyber-bullies who targeted them.

Bullying is also a policy issue because it can harm learning and lower a school’s test scores, and U.S. schools are increasingly under pressure to raise scores and show regular improvements.

The team found last year that over 20 percent of all U.S. adolescents in school had been bullied physically at least once in the last two months, 53 percent were bullied verbally, 51 percent bullied socially by being excluded or ostracized and 13.6 percent were bullied electronically.


Researchers Model The Spread Of H1N1 Flu

As the United States prepares for the upcoming flu season, a group of researchers supported by the National Institutes of Health continues to model how H1N1 may spread.

The work is part of an effort, called the Models of Infectious Disease Agent Study (MIDAS), to develop computational models for conducting virtual experiments of how emerging pathogens could spread with and without interventions. The study involves more than 50 scientists with expertise in epidemiology, infectious diseases, computational biology, statistics, social sciences, physics, computer sciences and informatics.

As soon as the first cases of H1N1 infections were reported in April 2009, MIDAS researchers began gathering data on viral spread and affected populations. This information enabled them to model the potential outcomes of different interventions, including vaccination, treatment with antiviral medications and school closures. The work built upon earlier models the MIDAS scientists developed in response to concerns about a different potentially pandemic influenza strain, H5N1, or avian flu.

“Computational modeling can be a powerful tool for understanding how a disease outbreak is unfolding and predicting the implications of specific public health measures,” said Jeremy M. Berg, Ph.D., director of the National Institute of General Medical Sciences, which supports MIDAS. “During the H1N1 pandemic, MIDAS scientists applied their models to see what they could do to help in a real situation.”

Because the H1N1 flu strain is still circulating, a MIDAS group based at the University of Washington in Seattle is now studying the impact the virus could have this fall and winter. Its model, which represents the world population, includes information about immunity (how many people are protected by vaccination or prior infection) and the other circulating flu strains. Using the model, the scientists may be able to predict how H1N1 evolves and the possible role of the H3N2 strain, which historically has been the dominant seasonal flu virus. The results also may help forecast the potential effectiveness of the new flu vaccine that includes both the H1N1 and H3N2 viral strains.
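The MIDAS models themselves are far richer, with agent-based populations and detailed demographics, but the basic bookkeeping behind any epidemic forecast can be sketched with a classic SIR (susceptible-infectious-recovered) compartment model. Everything below (transmission and recovery rates, seed size) is illustrative and not taken from the study:

```python
def sir_step(s, i, r, beta, gamma):
    """Advance one day of a discrete-time SIR model (s, i, r are population fractions)."""
    new_infections = beta * s * i   # transmission proportional to contacts between S and I
    new_recoveries = gamma * i      # infectious individuals recover at a constant rate
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(beta=0.3, gamma=0.1, i0=0.001, days=200):
    """Run the model from a small seed of infections; returns daily (s, i, r) states."""
    s, i, r = 1.0 - i0, i0, 0.0
    history = [(s, i, r)]
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma)
        history.append((s, i, r))
    return history

history = simulate()
peak_day = max(range(len(history)), key=lambda d: history[d][1])
print(peak_day, round(history[-1][2], 2))  # day of peak prevalence, final recovered fraction
```

With beta/gamma greater than 1, infections grow from the seed, peak, and then decline as the susceptible pool is depleted, which is the qualitative behavior the interventions discussed below (vaccination, antivirals, school closures) aim to blunt or delay.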

Here are key findings from MIDAS’ earlier work on the H1N1 pandemic. For more results and links to the scientific papers, visit http://www.nigms.nih.gov/Initiatives/MIDAS/Publications.htm.

Estimating Severity

To predict the likely severity of H1N1 in the fall and winter months following the initial outbreaks, the MIDAS group led by Marc Lipsitch, D.Phil., of the Harvard School of Public Health in Boston analyzed patient care data from Milwaukee and New York City. The researchers estimated that about 1 in 70 symptomatic people were admitted to the hospital, 1 in 400 required intensive care and 1 in 2,000 died. They predicted H1N1 to be no more and possibly even less severe than the typical seasonal flu strain. The work, which factored in local differences in flu detection and reporting, also showed that it’s possible to make predictions about severity using data from the early stages of an outbreak.

Vaccinating Children

Ira Longini, Ph.D., at the University of Washington and his MIDAS colleagues developed a simulation model to evaluate the effectiveness of different strategies to vaccinate school-aged children, who are known to play a key role in transmitting the flu virus. They modeled a range of scenarios that varied the type of vaccine, the percentage of children vaccinated and the infectiousness of the virus. For each situation, the modeling results indicated that vaccinating this age group substantially reduced overall disease spread and prevented up to 100 million additional cases in the general population. These effects, however, were less strong when the virus was more contagious or when fewer children were vaccinated. Based on these results, Longini’s group concluded that vaccine distribution strategies should depend on a number of factors, including vaccine availability and viral transmission rates.

Cost-Benefit of Employee Vaccination Programs

In one of the first analyses of the economic value of work-sponsored seasonal and pandemic flu vaccine programs, the MIDAS group led by Donald Burke, M.D., at the University of Pittsburgh developed a model that estimated the employer cost to be less than $35 per vaccinated employee with a potential savings of $15 to $1,494 per employee, depending on the infectiousness of the virus.

Interventions and Local Demographics

To determine if a vaccination strategy would likely have the same effect in different locations, a team led by MIDAS investigator Stephen Eubank, Ph.D., of the Virginia Bioinformatics Institute at Virginia Tech in Blacksburg developed models representing the demographics of Miami, Seattle and each county in Washington. The models indicated that while vaccinating school-aged children was the best strategy in each place, the optimal timing and overall effectiveness of the approach varied due to specific characteristics of the local population, such as age, income, household size and social network patterns. These differences, Eubank concluded, suggest that vaccination and probably other intervention strategies should take local demographics into account.

Antiviral Medications

Lipsitch’s collaborators Joseph Wu, Ph.D., and Steven Riley, D.Phil., at the University of Hong Kong used mathematical modeling to predict the likelihood that the H1N1 strain would develop resistance to the widespread use of antiviral medications taken to lessen flu symptoms. Their work showed that giving a secondary antiviral flu drug either prior to or in combination with a primary antiviral could mitigate the emergence of resistant strains in addition to slowing the spread of infection. The results, the researchers concluded, point to the value of stockpiling more than one type of antiviral drug.

School Closures

A public health measure under consideration was closing schools, which previous MIDAS pandemic flu models identified as a potentially effective intervention. According to Burke’s model of Allegheny County, Pa., closing individual schools once cases are identified may work as well as closing entire school systems. When strictly maintained for at least 8 weeks, both types of school closure could delay the epidemic peak by up to 1 week, allowing additional time to develop and implement other interventions. However, the model also indicated that school closures lasting less than 2 weeks could actually facilitate flu spread by returning susceptible students to school in the middle of an outbreak.

“Models like the ones MIDAS has developed help us understand not only trends in disease spread, but also how different factors can influence those trends,” said Irene A. Eckstrand, Ph.D., who directs the MIDAS program. “MIDAS research is leading to new tools and approaches that can aid in making public health decisions at a range of levels, from local to national.”


Germany Seeks Privacy Code For Online Mapping

The government of Germany has called on Google Inc. and other providers of online navigation services to create a set of voluntary data protection guidelines for services such as Google’s “Street View” by the end of the year.

Failure to do so would result in the imposition of new market regulations to protect consumers, said Germany’s Interior Minister Thomas de Maiziere on Monday.

De Maiziere’s comments came after a five-hour meeting with Internet executives, Germany’s federal justice and consumer protection ministers, and various data protection authorities.

“We need a charter guarding private geographical data and we need it drafted… by December 7,” the AFP quoted de Maiziere as saying.

“A charter could, and I mean could, make regulation superfluous,” he told reporters during a press conference.

Berlin had called the meeting following public outrage over Google’s plan to display images from 20 German cities as part of its Street View online mapping service.

Launched in 2007, the service includes panoramic images from scores of cities throughout the world taken at street level by vehicles with specialized cameras.

Due to Germany’s history of privacy abuses under both the Nazi and communist governments, the nation is particularly sensitive to potential privacy violations.

In response to Germany’s strong public protest, Google has made the country the only one in which citizens can prevent images of their homes or businesses from being displayed on Street View.

Hundreds of thousands of people have already opted out ahead of an October 15 deadline, the AFP news agency reported, citing media reports that Google would neither confirm nor deny.

“At this stage it is not possible to give an accurate number of opt-outs,” a Google spokesperson told BBC News.

“As expected, due to the wide media coverage and our own information campaign the number of letters we have received has increased in recent weeks.”

However, Germany’s government cautioned that such steps were inadequate, and threatened new legislation to soothe security and privacy concerns.

De Maiziere said any new, voluntary guidelines should be worked out in collaboration with data protection authorities, and that the online mapping service providers should allow users to clearly see how their privacy rights are affected by such services.

However, De Maiziere stopped short of endorsing calls from some consumer advocates for an “opt in” policy.

“We need geo-services for environmental policy, preventing natural disasters, searching for a home, planning our holidays — all of that must still be possible in the future,” De Maiziere said.

Instead, he would support legislation defining “red lines that must not be crossed.” 

Among other things, this would guarantee that users’ whereabouts are not exposed online, he said.

However, such a law would not affect the firms attending Monday’s meeting based on the services they provide today.

Google appeared to embrace the opportunity to help develop the new rules.

The Internet search giant said it would “welcome the proposal for self-regulation,” and was “happy to contribute to it in a constructive way.”

“Online mapping and geographical tools are becoming ever more important for citizens, authorities and companies – a trend which is only set to increase through the tremendous growth of the mobile Internet,” a Google spokesperson told BBC News.

“Any future legislation must make sure that in addition to the requirements of data protection, the development of innovative business opportunities and modern technology are allowed to flourish.”


Google’s Street View Faces Strong Opposition In Germany

As officials from the German government met on Monday to discuss privacy issues centering around Google’s Street View service, weekly news magazine Der Spiegel reported that “several hundred thousand people” have told the California-based technology corporation that they do not want their homes and businesses featured on the mapping service.

“Der Spiegel cited sources close to Google, which in August said it expected tens of thousands of tenants and owners to respond to its offer, unique to Germany, to pixel out buildings before images are published,” a Sunday report from AFP noted, adding that the company planned to launch its Street View service in Germany later on this year by featuring images from 20 cities.

According to Reuters, officials from the German government have been “critical” of Google’s Street View service, and have promised to “scrutinize Google’s promise to respect privacy requests by letting people stay out of the project” if they opt-out of the program by October 15.

A Google spokesman told BBC News on Monday that it was “not possible to give an accurate number of opt-outs” received to date, but added that they were not surprised by the figures presented in the Der Spiegel report.

During Monday’s meeting, Berlin officials and representatives from Google hoped to hammer out a way to protect the privacy of concerned German citizens without outright banning the Street View service. BBC News correspondent Stephen Evans reported that the government could announce its decision before the end of the day.

“Online mapping and geographical tools are becoming ever more important for citizens, authorities and companies–a trend which is only set to increase through the tremendous growth of the mobile Internet,” a Google spokesperson told the British news agency. “Any future legislation must make sure that in addition to the requirements of data protection, the development of innovative business opportunities and modern technology are allowed to flourish.”

German authorities, who in May discovered that Google had been mistakenly collecting private information from unencrypted Wi-Fi networks, aren’t the only ones taking issue with Street View. Investigations into the unauthorized collection of data are also under way in France, Australia, and Spain, and the company faces both a class action lawsuit and a probe led by Connecticut Attorney General Richard Blumenthal in the United States.


Genomes Of Sexually Precocious Fruit Flies Decoded

Breakthrough study could transform drug, aging and fertility research

UC Irvine researchers have deciphered how lowly fruit flies bred to rapidly develop and reproduce actually evolve over time. The findings, reported in the Sept. 15 online issue of Nature, contradict the long-held belief that sexual beings evolve the same way simpler organisms do and could fundamentally alter the direction of genetic research for new pharmaceuticals and other products.

“This is actually decoding the key DNA in the evolution of aging, development and fertility,” said ecology & evolutionary biology professor Michael Rose, whose laboratory began breeding the “super flies” used in the current study in 1991, or 600 generations ago. He joked that they “live fast and die young.”

Lead author and doctoral student Molly Burke compared the super flies to a control group on a genome-wide basis, the first time such a study of a sexually reproducing species has been done. The work married DNA “soup” gathered from the adapted flies with cheap, efficient technology that uses cutting-edge informatics tools to analyze the DNA of entire organisms. Burke found evidence of evolution in more than 500 genes that could be linked to a variety of traits, including size, sexual maturation and life span, indicating a gradual, widespread network of selective adaptation.

“It’s really exciting,” she said. “This is a new way of identifying genes that are important for traits we’re interested in, as opposed to the old hunting and pecking, looking at one gene at a time.”

For decades, most researchers have assumed that sexual species evolve the same way single-cell bacteria do: A genetic mutation sweeps through a population and quickly becomes “fixed” at a particular portion of DNA. But the UCI work shows that when sex is involved, it’s far more complicated.

“This research really upends the dominant paradigm about how species evolve,” said ecology & evolutionary biology professor Anthony Long, the primary investigator.

Based on that flawed paradigm, Rose noted, drugs have been developed to treat diabetes, heart disease and other maladies, some with serious side effects. He said those side effects probably occur because researchers were targeting single genes, rather than the hundreds of possible gene groups like those Burke found in the flies.

Most people don’t think of flies as close relatives, but the UCI team said previous research had established that humans and other mammals share 70 percent of the same genes as the tiny, banana-eating insect known as Drosophila melanogaster.

Scientists who did not participate in the work agreed that it could change the direction of much research. “Anyone who expects to find a single solution for problems like aging will be disappointed, because this work suggests there’s no one genetic target that could be fixed,” said Richard Lenski, an evolutionary biologist at Michigan State University. “On the other hand, it means there are many genetic factors that can be further investigated.”

Kevin Thornton and Parvin Shahrestani of UCI and Joseph Dunham of the University of Southern California are co-authors of the study, which was funded by UCI and National Science Foundation grants.

Image 1: UC Irvine doctoral student Molly Burke used fruit flies to find more than 500 new genes linked to aging and sexual development. Steve Zylius / University Communications

Image 2: Ecology & evolutionary biology professor Michael Rose says Burke’s work with fruit flies in his UCI lab will revolutionize the way drugs are developed. Steve Zylius / University Communications

Image 3: Control group fruit flies including a female, left, and male, right, are seen through a microscope. UCI researchers concluded that sexual species do not evolve the way simple bacteria do. Steve Zylius / University Communications


Introspective Thinking

(Ivanhoe Newswire) – A specific region of the brain appears to be bigger in people who are good at tuning into their own thoughts and reflecting upon their decisions. Introspection, or “thinking about your thinking”, is a key aspect of human consciousness, and scientists have discovered differences in people’s ability to introspect.

The researchers, led by Prof. Geraint Rees of University College London, suggest that the volume of gray matter in the anterior prefrontal cortex of the brain, which lies right behind our eyes, is a strong indicator of a person’s introspective ability. They also believe that the white matter connected to this area plays a major role in introspective thinking. The scientists are unsure exactly how these two types of brain matter relate to introspection, but they are sure that people with more gray matter in that area tend to be more introspective.

In the future this knowledge may help doctors to understand what brain injuries will do to a person’s ability to introspectively think. Eventually, this could lead to tailored treatments for stroke victims or people who have had major head trauma or brain injury.

“Take the example of two patients with mental illness: one who is aware of their illness and one who is not,” Stephen Fleming, one of the authors of the study, from University College London, was quoted as saying. “The first person is likely to take their medication, but the second is less likely. If we understand self-awareness at the neurological level, then perhaps we can also adapt treatments and develop training strategies for these patients.”

The study recruited 32 healthy human participants and showed them two screens, each containing six patterned patches. One screen had a single patch brighter than all the rest; participants had to say which screen contained the brighter patch and then rate how confident they were in their answer, all while their brains were scanned by MRI. The research team designed the task to be difficult, so that participants would have to use introspection to answer correctly and judge their confidence. The researchers reasoned that those who were good at introspection would be confident when their answer was right and unconfident when it was wrong. The patches were also altered on every trial.

“It’s like that show, ‘Who Wants to Be a Millionaire?'” Rimona Weil, from the Wellcome Trust Centre for Neuroimaging at University College London, was quoted as saying. “An introspective contestant will go with his or her final answer when they are quite sure of it, and perhaps phone a friend when they are unsure. But a contestant who is less introspective would not be as effective at judging how likely their answer is to be correct.”

“We want to know why we are aware of some mental processes while others proceed in the absence of consciousness,” said Fleming. “There may be different levels of consciousness, ranging from simply having an experience, to reflecting upon that experience. Introspection is on the higher end of this spectrum; by measuring this process and relating it to the brain we hope to gain insight into the biology of conscious thought.”

SOURCE: Science, published online September 16, 2010

Global Project Underway To Preserve Yam Biodiversity

World yam collection in Nigeria provides ultimate rescue for African yam diversity in an initiative to conserve critical crop collections backed by the Global Crop Diversity Trust

Farmers and crop scientists worldwide are engaged in an ambitious new effort to add 3,000 yam samples to international genebanks with the aim of saving the diversity of a crop that is consumed by 60 million people on a daily basis in Africa alone, according to an announcement today from the Global Crop Diversity Trust.

In almost all the countries of the African yam belt, a large number of potentially important yam varieties are preserved only in fields, where they are vulnerable to pests and diseases as well as to disasters like fire or flooding. For example, a large fire recently destroyed a yam collection in Togo. Civil conflicts have also resulted in collections being destroyed.

Yam varieties gathered from West and Central African countries through the project are being sent to the International Institute for Tropical Agriculture (IITA) in Ibadan, Nigeria, where tissue samples of the crop will eventually be frozen at ultra-low temperatures in liquid nitrogen (a technique known as cryoconservation), which offers the most secure form of long-term storage currently available. The majority of the world’s crops can be conserved over long periods simply by drying the seeds and storing them under cold, dry conditions. However, a significant number of crops, including yams, cannot be stored so easily and must be conserved as vegetative material in tissue culture.

Farmers in West Africa’s “yam belt,” which includes the countries of Nigeria, Côte d’Ivoire, Ghana, Benin and Togo, produce more than 90 percent of the world’s yams. The project, however, will also include yam varieties collected in the Philippines, Vietnam, Costa Rica, the Caribbean and several Pacific nations. It is the first worldwide effort to conserve yam species and cultivars. The project is funded with support from the UN Foundation and the Bill and Melinda Gates Foundation.

“This opportunity to protect an incredibly wide variety of yams allows us to feel more reassured that the unique diversity of yam will be safely secured and available to future generations,” said Alexandre Dansi, a yam expert at the University of Abomey-Calavi in Benin.

For Benin, which sits squarely in the buckle of the yam belt, yam is an integral part of the culture and community life. The large tubers, weighing up to 70 kilos, are a common sight in roadside markets. Dansi has worked with producers to catalogue about 250 discrete types of yams and more than 1,000 named yam varieties, and he is collaborating with farmers to document additional varieties. According to farmers’ reports, many traditional varieties are disappearing in their production zones because of high susceptibility to pests and diseases, poor soils, low soil moisture, weeds and drought, which make them less productive or more costly to grow compared to other crops such as cassava.

Through Dansi’s work, Benin already has sent 847 yam samples to the IITA. At IITA, the tubers will be grown out in fields, and cuttings taken for conservation in the lab as part of an international collection that already contains about 3,200 yam samples from West Africa.

Thousands of years of cultivation have resulted in a wide diversity of yam varieties existing in farmers’ fields, particularly in West Africa. In some parts of Africa (mainly Benin and Nigeria), yams are still being domesticated from wild tubers found in the forest. The popularity of the crop remains high with consumers, and sellers get a high price in urban markets. However, yams remain relatively under-researched despite their potential to bring farmers out of poverty in one of the world’s poorest regions. Using the collection now being assembled to find valuable traits that provide disease resistance and higher yields is key to improving farmers’ fortunes.

“It’s really akin to putting money in the bank,” said Cary Fowler, executive director of the Trust. “All crops routinely face threats from plant pests, disease, or shifting weather patterns, and a country’s ability to breed new varieties to overcome these challenges is directly tied to what they have in the bank, not just in terms of financial resources but in terms of the diversity in their crop collections.”

The yam project is part of a broader effort involving major crop species worldwide in which the Trust is helping partners in 68 countries (including 38 in Africa alone) rescue and regenerate more than 80,000 endangered accessions in crop collections and send duplicates to international genebanks and the Svalbard Global Seed Vault in the Arctic Circle.

For yams, reproduced through vegetative propagation, IITA offers the only long-term form of conservation. Conserving the crop requires extracting tissue in the laboratory and freezing it in liquid nitrogen. However, the technique demands careful research and a staff of dedicated skilled technicians. Most African countries cannot afford to give their yam diversity this kind of attention.

At IITA, the DNA of the samples coming from locations around the world will also be analyzed to get a better sense of the genetic diversity contained in various collections. This is not, however, an academic exercise. It helps the genebank managers avoid keeping too many copies of the same material. It also helps the search for valuable genes that can provide the traits needed to deal with diseases or climate change.

“This project is fascinating because it involves the most traditional and the most advanced techniques of crop conservation. We would like to deploy the best tools science has to offer to secure centuries of yam cultivation,” said Dominique Dumet, head of the Genetic Resources Center (GRC) at IITA.

IITA also will be offering a stable and safe haven for yam collections that sometimes must endure unusual stress. For example, Côte d’Ivoire will be sending 5050 yam samples to IITA for conservation from a collection that, after the civil war in 2002, had to be moved from Bouaké in the north to Abidjan.

“We are building up our collection again, but some varieties were lost,” said Amani Kouakou, a scientist at Côte d’Ivoire’s Centre National de Recherche Agronomique. “We welcome the chance to share the material with IITA and discover new materials that we have never cultivated in this country.”

Meanwhile, in Benin, Dansi is using the project as an opportunity to work with farmers to test and characterise materials, exchange varieties and techniques between different yam-growing regions in Benin, and build up better community storage barns for keeping the tubers in good health until the next planting season.

“The security we now have is reassuring and allows us to focus on other things, like working with farmers to improve yields,” Dansi said. “And on top of that we can now ask IITA for interesting yams from other parts of the world that we may never have seen before in Benin.”


Brain Lesions, Not Old Age, Could Cause Memory Loss

Old age and memory loss have long been anecdotally linked, but advancing in years may not actually be behind the so-called ‘senior moments’ that often plague the elderly, according to a new study published online in the journal Neurology on Wednesday.

The study, which was completed by Rush University Medical Center neuropsychologist Robert S. Wilson, PhD, found that the same types of brain lesions that are typically associated with dementia and Alzheimer’s disease could also be the cause of mild memory loss among senior citizens.

As part of the study, Wilson and his colleagues recruited more than 350 nuns, priests, and brothers to participate in the Chicago-based university’s Religious Orders Study.

Each participant completed as many as 13 years of annual cognitive testing, and upon their deaths, their brains were examined for neurofibrillary tangles, cerebral infarction (stroke), and Lewy bodies–lesions typically associated with dementia.

The researchers studied the rate of change in each individual’s cognitive function over time, and discovered a rapid decline over the last four to five years of life.

According to the press release, pathologic lesions were related to that rapid decline, but Wilson and his colleagues were “somewhat surprised to find the pathology was very strongly predictive of the mild changes in cognitive function.”

“Higher tangle density adversely affected all forms of cognition at all trajectory points. Both Lewy bodies and stroke approximately doubled the rate of gradual memory decline, and almost no gradual decline was seen in the absence of lesions,” the media statement noted.

“Our study finds that Alzheimer’s disease and related dementias are the root cause of virtually all loss of cognition and memory in old age,” Wilson said. “They aren’t the only contributing factors; other factors affect how vulnerable we are to the pathology and to its effects.  But the pathology does appear to be the main force that is driving cognitive decline in old age.”

“It appears these brain lesions have a much greater impact on memory function in old age than we previously thought,” he added. “Our results challenge the concept of normal memory aging and hint at the possibility that these lesions play a role in virtually all late-life memory loss.”

“Understanding how and when these brain lesions affect memory as we age will likely be critical to efforts to develop treatments that delay memory loss in old age,” Wilson concluded.


Official: NATO Should Build A ‘Cyber Shield’

The North Atlantic Treaty Organization (NATO) must construct a “cyber shield” to protect the alliance from Internet threats to its military and economic infrastructures, a US defense official told the AFP news agency on Wednesday.

Cyber security is a crucial issue for the 28-nation alliance to address at its summit of leaders in Lisbon on November 19 and 20, US Deputy Defense Secretary William Lynn said in Brussels.

Lynn said the alliance needs to play a significant role in “extending a blanket of security over our networks.”

“NATO has a nuclear shield, it is building a stronger and stronger defense shield, it needs a cyber shield as well,” he said.

The US government estimates that more than 100 foreign intelligence agencies and governments try to hack into US systems daily, Lynn said, stressing the magnitude of the challenge.

“I think they see the asymmetric advantage that can be gained through cyber technology,” Lynn said.

The threat of cyber attacks was highlighted in Estonia in 2007 when it suffered an assault that disrupted key business and government web services for days.

The Pentagon was forced to review its own cyber security in 2008 after the most serious cyber attack on the US military networks, which came from a tainted flash drive that was inserted into a military laptop in the Middle East.

The Pentagon strategy rests on “five pillars” of cyber security, said Lynn:

1) Recognizing cyberspace as the next domain of warfare.

2) The need for active defenses.

3) Protection of critical infrastructure.

4) Enhancing collective defense.

5) The need to “marshal our technological prowess.”

Lynn also stressed that any cyber security strategy needs to take into account threats to critical infrastructure for economies such as power grids, transportation and financial sectors.

“I think at Lisbon we will see the kind of high-level leadership commitment to cyber defense. It’s the foundation for any alliance effort,” he said.

Lynn said he discussed cyber security at a meeting with NATO’s North Atlantic Council in Brussels Wednesday. “I was very impressed with the unity of purpose and the similar vision that most nations in the alliance seem to have towards the cyber threat,” he said.


New Evidence On How Cranberry Juice Fights Bacteria That Cause Urinary Tract Infections

Scientists reported new evidence on the effectiveness of that old folk remedy, cranberry juice, for urinary tract infections at the ACS’ 240th National Meeting. “A number of controlled clinical trials (these are carefully designed and conducted scientific studies done in humans) have concluded that cranberry juice really is effective for preventing urinary tract infections,” said Terri Anne Camesano, Ph.D., who led the study. “That has important implications, considering the size of the problem and the health care costs involved.”

Estimates suggest that urinary tract infections (UTIs) account for about 8 million medical visits each year, at a total cost of more than $1.6 billion. Camesano, who is with the Worcester Polytechnic Institute, said the study set out to shed light on how cranberry juice fights E. coli, the most common cause of UTIs. The study involved growing strains of E. coli in urine collected from healthy volunteers before and after consumption of cranberry juice cocktail. The scientists then tested the E. coli for their ability to stick together and form biofilms. Biofilms are thin, slimy layers that provide an environment for bacteria to thrive.

The scientists concluded that cranberry juice cocktail prevents E. coli from sticking to other bacteria and the surface of a plastic petri dish. E. coli that doesn’t stick has a better chance of being flushed out of the urinary tract. The results suggest that the beneficial substances in cranberry juice could reach the urinary tract and prevent bacterial adhesion within 8 hours after consumption of cranberry juice.


Researchers Find Selfishness Can Sometimes Help The Common Good

Scientists have overturned the conventional wisdom that cooperation is essential for the well-being of the whole population, finding evidence that slackers can sometimes help the common good. Researchers, from Imperial College London, the Universities of Bath and Oxford, University College London and the Max Planck Institute for Evolutionary Biology studied populations of yeast and found that a mixture of ‘co-operators’ and ‘cheats’ grew faster than a more utopian one of only “co-operators.” The study, publishing next week in the online, open access journal PLoS Biology, used both laboratory experiments and a mathematical model to understand why and how a little “selfishness” can benefit the whole population.

In the study, the “co-operator” yeast produce a protein called invertase that breaks down sugar (sucrose) to give food (glucose) that is available to the rest of the population. The “cheats” eat the broken down sugar but don’t make invertase themselves, and so save their energy.

Professor Laurence Hurst, Royal Society-Wolfson Research Merit Award Holder at the University of Bath, explained: “We found that yeast used sugar more efficiently when it was scarce, and so having ‘cheats’ in the population stopped the yeast from wasting their food. Secondly we found that because yeast cannot tell how much sucrose is available to be broken down, they waste energy making invertase even after there is no sugar left. This puts a brake on population growth. But if most of the population are ‘co-operators’ and the remainder are ‘cheats,’ not all of the population is wasting their energy and limiting growth. For these effects to matter, we found that ‘co-operators’ needed to be next to other ‘co-operators’ so they get more of the glucose they produce. If any of these three conditions were changed, the ‘cheats’ no longer benefitted the population.”

Dr. Ivana Gudelj, NERC Advanced Fellow and Lecturer in Applied Mathematics at Imperial College London added: “Our work illustrates that the commonly used language of ‘co-operators’ and ‘cheats’ could in fact obscure the reality. When the addition of more invertase producers reduces the fitness of all, it is hard to see invertase production as co-operation, even if it behaves in a more classical co-operative manner, benefitting all, when rare.”

The researchers suggest similar situations may exist in other species where ‘cheats’ help rather than hinder the population.


McDonald’s Upset Over New Commercial

A new television ad produced by a Washington-based health group is taking aim at McDonald’s high-fat menu, enraging the fast food giant.

The Physicians Committee for Responsible Medicine produced the new commercial, which is set to air in Washington, DC, on Thursday during The Daily Show with Jon Stewart.

The ad centers on an overweight, middle-aged man lying dead in a morgue, holding a half-eaten burger as a woman weeps over his body. McDonald’s omnipresent golden arches then appear at the dead man’s feet with the text “I was lovin’ it,” a harsh jibe at McDonald’s long-running slogan “I’m lovin’ it.”

A voiceover then says: “High cholesterol, high blood pressure, heart attacks. Tonight, make it vegetarian.”

PCRM said it is also considering showing the ad in Chicago, Detroit, Houston and Los Angeles.

The ad “takes aim at McDonald’s high-fat menu, with the goal of drawing Washingtonians’ attention to the city’s high rates of heart disease deaths and its high density of fast-food restaurants,” PCRM said in a statement.

People who consume fast food are at a higher risk for obesity, one of the major contributing factors to heart disease, according to several studies, it said.

But the new ad has infuriated McDonald’s.

“This commercial is outrageous, misleading and unfair to all consumers. McDonald’s trusts our customers to put such outlandish propaganda in perspective, and to make food and lifestyle choices that are right for them,” spokeswoman Bridget Coffing said.

Washington D.C. has more McDonald’s, Burger King, and KFC outlets per square mile than eight other cities with similar size populations, PCRM said.

McDonald’s, the world’s largest fast food chain, has seen its earnings grow recently despite the global economic downturn. The company attributes some of its ongoing successes to a range of alternatives to its famous burgers.


Link Between Mother’s Smoking And Adult High Cholesterol

A recent study points to a link between mothers who smoked during pregnancy and babies having an increased risk of developing high cholesterol as adults.

Increasing evidence suggests there is a link between being born small-for-gestational-age (SGA) — smaller than normal for the baby’s sex and at what week in pregnancy the child was born — and having high cholesterol in adulthood, Xiaozhong Wen of Harvard Medical School, in Boston, told Reuters Health.

But Wen and his colleagues were curious if it was just certain groups of people born SGA, or in the bottom 10th percentile for gestational age, that carried the highest risk.

The team wondered whether birth size was more to blame than coexisting environmental factors that trigger the serious condition, which can lead to heart disease and stroke.

To shed light on the matter, the team studied the birth records and cholesterol levels of more than 1,350 adults, who were 39 years old, on average. About 25 percent — or 345 people — reported having high cholesterol. Thirty-four percent of subjects with high cholesterol were born SGA, while 24 percent were born at an age-appropriate size.

When digging deeper, the researchers found that only the adults born SGA whose mothers smoked during pregnancy were at an increased risk of developing high cholesterol. After taking into account other confounding factors, those exposed to a pregnant mother’s heavy smoking (more than a pack a day) had 2.5 times the risk.

Those born SGA to non-smoking moms were not at increased risk compared to peers born at normal sizes, the researchers found. And normal-sized babies born to smoking mothers didn’t appear any more likely to develop high cholesterol than those not exposed.

“It seems to be the co-existence of maternal smoking during pregnancy and SGA that put offspring at high risk,” suggested Wen.

Wen acknowledged the study had some limitations, including its observational nature that precludes proving cause and effect, and a small number of participants that were actually born SGA.

Wen said he and his team are planning to expand their research to other diseases, including high blood pressure, heart disease and diabetes. “Besides maternal smoking, other possible co-factors, such as genetics, nutrition and stress, should also be considered,” he said.

For now, the findings give pregnant women one more good reason to stop smoking, said Wen.

Results of the research are published in the journal Epidemiology.


Soaring Obesity Rates Cost US $215 Billion A Year

A new report released Tuesday by the Brookings Institution finds that obesity costs the U.S. $215 billion annually in direct medical expenses and indirect productivity losses.

The study found that medical costs for obese adults are $147 billion more per year than for those of normal weight. For children, the added costs due to obesity are $14.3 billion per year.

“The overall economic impact of obesity in the US appears to be substantial,” wrote researchers Ross Hammond and Ruth Levine of the Brookings economic studies program.

“Medical costs appear to have increased dramatically over the last decade and may continue to grow with future increases in rates of overweight and obesity in US adults and children, perhaps substantially,” they wrote in the journal Diabetes, Metabolic Syndrome and Obesity: Targets and Therapy.

In addition to the direct costs of obesity, the indirect costs include absenteeism, lost productivity, disability and premature death.

The researchers also found that transportation costs may be higher due to the added weight of obese travelers.

“Total productivity costs are likely substantial, perhaps as high as $66 billion annually for the US,” the researchers wrote.

Although previous research has examined the costs of obesity, it did not focus specifically on the potential external effects, such as transportation and environmental effects.

“Increases in body weight among Americans mean that more fuel and, potentially, larger vehicles are needed to transport the same number of commuters and travelers each year,” the researchers wrote.

“This produces a direct cost (in the form of greater spending on fuel), as well as potential indirect costs in the form of greater greenhouse gas emissions.”

Obesity may also have “human capital” costs, such as a diminished ability to attain higher levels of education.  However, these effects are hard to calculate.

“Women who had been obese in the baseline survey had significantly fewer years of school completed (0.3 year on average),” the researchers said.

“Likewise, they were less likely to be married, had lower household incomes, and higher rates of poverty. For men, the only statistically significant correlation was for marital status.”

The report found that obesity has grown into a significant global epidemic, with some half a billion people worldwide classified as overweight in 2002.

In the United States, more than two-thirds of adults are now overweight, including the one-third of the adult population now considered obese.


Smokeless Tobacco Not Safe, Won’t Help Smokers Quit

Smokeless tobacco products should not be used as an alternative to cigarettes or for smoking cessation due to the risk of addiction and return to smoking, according to an American Heart Association policy statement.

Smokeless tobacco products such as dry and moist snuff as well as chewing tobacco may also increase the risk of fatal heart attack, fatal stroke and certain cancers, according to the statement published online in Circulation: Journal of the American Heart Association.

“No tobacco product is safe to consume,” said Mariann Piano, Ph.D., lead writer of the statement and a professor in the Department of Biobehavioral Health Science at the University of Illinois at Chicago.

The statement also addresses a controversy over whether smokeless tobacco product use is a “safer” alternative to smoking. The idea that smokeless tobacco products are preferable to cigarettes is based in part on the Swedish experience, in which a significant decrease in smoking among Swedish men between 1976 and 2002 corresponded to an increase in the use of smokeless tobacco.

However, a recent United States study found no such pattern: smoking rates did not fall among people using smokeless tobacco products. For people trying to quit smoking, nicotine replacement therapy (nicotine gum or a nicotine-releasing patch placed on the skin) is a safer alternative to smokeless tobacco products. Clinical studies have found no increased risk of heart attack or stroke with either type of nicotine replacement therapy.

As smoke-free air laws become common in the U.S., smokeless tobacco products have been marketed as a situational substitute (“pleasure for whenever”) for cigarette smoking when smoking is prohibited.

“Smokeless tobacco products are harmful and addictive; that does not translate to a better alternative,” Piano said.

Smokeless tobacco also is being used more by teenage boys, according to the statement. The Food and Drug Administration issued a final regulation related to the Tobacco Control Act that became effective June 22 that prohibits the sale of tobacco products to anyone younger than 18 years.

“Scientists and policy makers need to assess the effect of ‘reduced risk’ messages related to smokeless tobacco use on public perception, especially among smokers who might be trying to quit,” said Piano.

Co-authors are Neal L. Benowitz, M.D.; Garret A. FitzGerald, M.D.; Susan Corbridge, Ph.D., A.P.N., ACNP; Janie Heath, Ph.D.; Ellen Hahn, Ph.D.; Terry F. Pechacek, Ph.D.; George Howard, D.P.H. Author disclosures are on the manuscript.

Aqua Satellite Provides Snapshot Of Sea Ice

The Arctic Ocean is covered by a dynamic layer of sea ice that grows each winter and shrinks each summer, reaching its yearly minimum size each fall. While the 2010 minimum remains to be seen, NASA’s Aqua satellite captured this snapshot on Sept. 3.

How does the Aqua satellite “see” sea ice? Microwaves. Everything on Earth’s surface — including people — emits microwave radiation, the properties of which vary with the emitter, thereby allowing the AMSR-E microwave sensor on Aqua to map the planet.

Ice emits more microwave radiation than water, making regions of the ocean with floating ice appear much brighter than the open ocean to the AMSR-E sensor. This difference allows the satellite to capture a sea ice record year-round, through cloud cover and the months of polar night. Continuous records are important because sea ice is dynamic. Besides melting and freezing, the ice moves with wind and currents which can cause it to split or pile up.
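The brightness contrast that AMSR-E exploits can be sketched as a toy classifier. The threshold and measurements below are invented for illustration; real sea ice algorithms combine several microwave frequencies and polarizations rather than a single cutoff.

```python
# Illustrative sketch: labeling ocean pixels as ice or open water from
# microwave brightness temperatures. Ice emits more microwave radiation
# than water, so ice pixels look "brighter" (warmer) to the radiometer.
# The cutoff here is an assumption, not an actual AMSR-E calibration value.

ICE_THRESHOLD_K = 230.0  # assumed brightness-temperature cutoff (kelvin)

def classify_pixel(brightness_temp_k: float) -> str:
    """Return 'ice' for bright (warm) pixels, 'open water' otherwise."""
    return "ice" if brightness_temp_k >= ICE_THRESHOLD_K else "open water"

# A small swath of hypothetical measurements:
swath = [190.5, 245.2, 232.8, 201.1]
labels = [classify_pixel(t) for t in swath]
```

In practice, operational products report a per-pixel ice concentration percentage rather than a binary label, but the underlying contrast is the one described above.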

“The data from AMSR-E and other NASA satellites are critical for understanding the coupling between sea ice and the ocean and atmosphere,” said Tom Wagner, Cryosphere program manager at NASA Headquarters in Washington. “It’s important for us to understand these connections to improve our predictive models of how the planet will change.”

The Arctic sea ice is a major factor in the global climate system. The ice cools the planet by reflecting sunlight back into space. It also helps drive ocean circulation by converting the warm Pacific water that flows into the Arctic into the cold, saltier water that empties into the Atlantic. The sea ice also fundamentally shapes the Arctic, defining the organisms that make up its ecosystem and keeping heat from the ocean from melting the frozen tundra.

In fall 2009, Arctic sea ice reached its minimum extent on about Sept. 12, and was the third lowest since satellite microwave measurements were first made in 1979. Year-to-year changes can be highly variable, so scientists need many years, even decades, of data to examine long-term trends. Notably, all of the major minimums have occurred in the last decade, consistent with other NASA research showing that January 2000 to December 2009 was the warmest decade on record.

As the sea ice nears the 2010 minimum later this month, look for images and analysis from NASA and the National Snow and Ice Data Center, in Boulder, Colo.

Kathryn Hansen / NASA’s Earth Science News Team

Image Caption: This image was compiled using data gathered by NASA’s Aqua satellite on Sept. 3, 2010. Credit: NASA Goddard’s Scientific Visualization Studio

Superbug Found In 3 U.S. States, Global Response Needed

A new “superbug” from India that is resistant to every known antibiotic has sickened people in three states as it continues to spread throughout the globe, health officials said Monday.

The bacteria now pose a worldwide threat, warned experts attending the 50th annual meeting of the Interscience Conference on Antimicrobial Agents and Chemotherapy (ICAAC), the world’s largest gathering of infectious disease specialists.

“There is an urgent need, first, to put in place an international surveillance system over the coming months and, second, to test all the patients admitted to any given health system” wherever possible, said Patrice Nordmann of Bicetre Hospital in France.

“For the moment, we don’t know how fast this phenomenon is spreading… it could take months or years, but what is certain is that it will spread,” Nordmann told the AFP from the ICAAC conference in Boston, where 12,000 specialists were in attendance from September 12-14.

Measures have already been approved in France, and are being negotiated in Japan, Singapore and China, said Nordmann, a microbiology professor at South-Paris Medical School and head of Bicetre’s department of bacteriology and virology.

“It’s a bit like a time bomb,” he warned, urging health authorities to track the new superbug, which contains the gene — known as NDM-1 (New Delhi metallo-beta-lactamase 1) — believed responsible for the antibiotic resistance.

The bacteria and its variants appear to have originated in India, but were first detected in Britain in 2007.

The first known death related to an NDM-1 infection was that of a Belgian citizen who had been hospitalized in Pakistan following an auto accident, the AFP reported.

The U.S. cases, along with two others in Canada, all involve people who had recently received medical care in India.

Last month, an article published in a British medical journal described scores of cases involving Britons who received medical procedures in India.

However, the total number of deaths from the infection is unknown since there is no central tracking of such cases.

So far, NDM-1 has primarily been found in bacteria that cause digestive or urinary infections.

Scientists have long feared an infectious-disease nightmare such as this — a highly adaptable gene that incorporates itself into many types of common germs and confers widespread drug resistance.

“It’s a great concern,” because antibiotic resistance has been increasing, and few new antibiotics are in development, said Dr. M. Lindsay Grayson, director of infectious diseases at the University of Melbourne.

“It’s just a matter of time” until the gene is transmitted more broadly person-to-person, he told The Associated Press (AP).

This year, the U.S. cases occurred in people from California, Massachusetts and Illinois, said Brandi Limbago, a lab chief with the U.S. Centers for Disease Control and Prevention (CDC).

Three types of bacteria were involved in the cases, and three different mechanisms allowed the gene to become part of them.

“We want physicians to look for it,” particularly in patients who have recently traveled to Pakistan or India, she added.

Limbago advises people not to add to the problem of drug resistance by pressuring doctors for antibiotics when they say they are not needed. Instead, take prescribed antibiotics properly, and try to avoid infections through thorough hand washing.

The gene is carried by bacteria that can spread hand-to-mouth, making good hygiene a critical component of preventing infections.

It’s also why health officials are so concerned about where the threat is coming from, said Nordmann. 

With 1.3 billion people, India is an overpopulated nation that overuses antibiotics and has widespread diarrheal disease.  Many of its citizens are without clean water.

“The ingredients are there” for widespread transmission, Nordmann told AP.

“It’s going to spread by plane all over the world.”

The U.S. patients were not related. The case in California involved a woman who sought hospital care following a car accident in India. The Illinois case involved a man with pre-existing medical conditions and a urinary catheter.  He is believed to have contracted an infection with the gene while traveling in India. The Massachusetts case involved a woman from India who had surgery and chemotherapy there before traveling to the United States.

According to lab tests, in all three cases the germs were not killed by the types of antibiotics typically used to treat drug-resistant infections, including “the last-resort class of antibiotics that physicians go to,” Limbago said.

Although she did not know how the three patients were treated, all survived.

Physicians have tried treating some cases with combinations of antibiotics, hoping the approach would be more effective than individual antibiotics alone.

Some doctors have even turned to using polymyxins — antibiotics used half a century ago that were unpopular because they can result in kidney damage.

The two Canadian cases, one in Alberta and one in British Columbia, were treated with a combination of antibiotics, said Dr. Johann Pitout of the University of Calgary in Alberta, Canada.

Both patients had medical emergencies while traveling in India, and developed urinary infections involving bacteria once they returned home to Canada, Pitout told AP.

The CDC advises any hospitals that find NDM-1 cases to medically isolate the patient, check the patient’s close contacts for possible infection and look for possible additional infections in the hospital.

Any case “should raise an alarm,” said Limbago.

Image Caption: Klebsiella pneumoniae, the bacterium in which NDM-1 was first identified. Credit: CDC

BPS: Brain Positioning System

Keeping better track of yourself and your keys

Imagine if getting lost became a thing of the past. Even the common search for lost keys would no longer seem like a lost cause. Well, cognitive psychologist Amy Shelton of Johns Hopkins University is doing research that might help us keep track of ourselves, as well as our things. “What we’re trying to study is when you get around in the world and in your day-to-day environments, how is it that you learn them,” she explains. With support from the National Science Foundation (NSF), Shelton is exploring some of the ins and outs of our brains’ navigation system.

On the day Science Nation visited, we found Shelton at the Kennedy Krieger Institute. This is where she comes to examine the brains of people willing to spend time learning their way through a virtual world that is projected on a small screen. They also have to lie still within the confines of an MRI while their brain blood flow is analyzed. Today, one of Shelton’s research associates, Scott Clark, is the one willing to have his head examined, so Shelton can demonstrate one of her experiments.

“He’s watching a movie of an observer moving around and, as the observer moves around, you see these items pop up,” she says. “There’s a shopping cart, and a palm tree … and the job is to learn where in the environment those items are located.”

Later, Clark takes an easy retrieval test that reveals some hard evidence about how he learns the world around him. Does he make a map in his head, which would make him a “place learner”? Or does he follow the same route over and over again like a creature of habit, a “response learner”?

“What we look for is: are they taking the shortcuts or are they sticking to their familiar path,” continues Shelton. “And this tends to be very diagnostic.” The test requires the subject to recall the location of items that are in the virtual world. Shelton points out that if the individuals take short cuts to get to items, they tend to be place learners. But if they take familiar paths or routes, they would be response learners.

The MRI images Shelton takes can easily distinguish between place learners and response learners. “The hippocampus is more active or sort of pops up for people who are showing place learning on that test: who are taking detours or who are using space more flexibly. Whereas the caudate is more active for people who are creatures of habit: using the familiar routes over and over again,” she says.

The hippocampus and caudate are parts of the brain.

Shelton believes that her research will help develop better memory techniques. “By understanding the different kinds of learning mechanisms and, in particular, what kinds of advantages and disadvantages each type of learning conveys, we can start to tailor people’s GPS systems to play to their advantages,” she adds.

If Shelton has her way, getting lost might get a whole lot harder.

By Miles O’Brien, Science Nation Correspondent

Image Caption: The human brain is a three-pound paradox: We use it every moment of our lives, yet so much about our brains remains a mystery to us. Four leading neuroscientists and psychologists discuss some major issues in current brain research in these videos. Credit: Morguefile

Touch-Sensitive Artificial Skin For Robots

U.S. researchers said Sunday that new artificial “skin” fashioned out of flexible semiconductor materials can sense touch, making it possible for robots to be able to grip eggs but be strong enough to hold a frying pan as well.

Scientists have been struggling to find a way to make robots adjust the amount of force needed to hold different objects. The pressure-sensitive materials are designed to help overcome that challenge.

Ali Javey, an electrical engineer at the University of California Berkeley, told The Associated Press (AP), “humans generally know how to hold a fragile egg without breaking it.”

“If we ever wanted a robot that could unload the dishes, for instance, we’d want to make sure it doesn’t break the wine glasses in the process. But we’d also want the robot to be able to grip a stock pot without dropping it,” Javey, who led one of two teams reporting on artificial skin discoveries in the journal Nature Materials, said in a statement.

The team found a way to make ultra-tiny “nanowires” out of an alloy of silicon and germanium. The wires were formed on the outside of a cylindrical drum and then deposited in a uniform pattern.

A second team led by Zhenan Bao, a chemical engineer at Stanford University in California, made a material so sensitive it detects the weight of a butterfly resting on it.

Bao’s sensors were made by sandwiching a highly elastic rubber layer between two electrodes in a regular grid of tiny pyramids.

“We molded it into some kind of microstructure to incorporate some air pockets,” Bao said in a telephone interview with AP. “If we introduce air pockets, then these rubber pieces can bounce back.”

When the material is compressed or stretched, the artificial skin registers the change electrically.

“The change in the thickness of the material is converted into an electrical signal,” she told AP.
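The thickness-to-signal conversion Bao describes can be illustrated with a simple parallel-plate capacitor model, where capacitance rises as the compressible dielectric thins. All dimensions below are invented for illustration and are not the Stanford device’s actual geometry.

```python
# Sketch of the transduction principle: compressing an elastic dielectric
# reduces its thickness d, which raises the capacitance C = eps * A / d.
# The relative change in capacitance is read out as the electrical signal.
# Dimensions and permittivity are assumed values, not the real sensor's.

EPSILON_0 = 8.854e-12   # vacuum permittivity, F/m
EPSILON_R = 3.0         # assumed relative permittivity of the rubber layer

def capacitance(area_m2: float, thickness_m: float) -> float:
    """Parallel-plate capacitance in farads."""
    return EPSILON_0 * EPSILON_R * area_m2 / thickness_m

area = 1e-6                             # 1 mm^2 sensor pixel (assumed)
c_rest = capacitance(area, 50e-6)       # 50-micron dielectric at rest
c_pressed = capacitance(area, 45e-6)    # 10% compression under load

# Relative capacitance change, the quantity the readout circuit measures:
signal = (c_pressed - c_rest) / c_rest
```

Here a 10 percent reduction in thickness produces roughly an 11 percent rise in capacitance, which is the kind of change the sensor grid converts into a per-pixel pressure reading.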

The team hopes that artificial skin will eventually be used to restore the sense of touch in people who have prosthetic limbs.  However, scientists will first need to have a better understanding of how the system’s sensors work with the human nervous system.

The first team’s artificial skin is the latest application of new ways of processing brittle, inorganic semiconductor materials like silicon into flexible electronics.

A team at the California Institute of Technology in Pasadena devised a way earlier this year to help make flexible solar cells with silicon wires that are thin enough to be used in clothes.

Image 1: An optical image of a fully fabricated e-skin device with nanowire active matrix circuitry. Each dark square represents a single pixel. (Images: Ali Javey and Kuniharu Takei, UC Berkeley)

Image 2: An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications. (Images: Ali Javey and Kuniharu Takei, UC Berkeley)

Malfunctioning Protein Triggers Type 2 Diabetes

Shedding light on how a malfunctioning protein helps trigger type 2 diabetes could eventually offer the chance to stop the damage, scientists told BBC News.

The presence of amyloid protein may help create a chain reaction that destroys vital insulin-producing cells.

Researchers say that future drugs could help target this process.

Amyloid is also implicated in many other diseases such as Alzheimer’s.

Type 2 diabetes is the most common form of the disease and it normally develops later on in adulthood.

It develops when the body both loses its ability to produce enough insulin to control blood sugar levels and becomes resistant to the insulin it does produce.

The scientists first noticed “deposits” of the amyloid protein in pancreatic tissue of some people with type 2 diabetes a few years ago.

Originally, it was thought that the protein could poison the cells directly.  However, new research offers an additional explanation.

The study found that a type of immune cell known as a macrophage reacted abnormally when it ingested amyloid.

The cell triggered activity in other cells, which in turn released proteins that cause inflammation.

The inflammation then destroys the vital beta cells, and the ability to produce insulin shrinks.

The researchers wrote in the journal Nature Immunology that the finding would “spur new research” to target the mechanisms of the disease.

Dr. Eric Hewitt, a researcher into amyloid-related disease at Leeds University, told BBC News that the paper was “interesting” and may explain why the presence of amyloid deposits could be so damaging.

He told BBC: “It suggests we are looking at a very complex disease – we know that amyloid is present in some type 2 diabetics, but not others.

“What we have is a second indirect mechanism which can lead to the destruction of beta cells, and this could be helpful when looking at other diseases which may involve amyloid, such as Alzheimer’s.”

“It does offer a possible opportunity to interrupt this mechanism at some point in the future and perhaps stop the disease from progressing.”

Accumulator (computing)

An accumulator is a register in a computer’s central processing unit (CPU) that holds the intermediate results of arithmetic and logic operations, such as additions and subtractions. For example, when adding a list of numbers, each number in turn is added to the running total held in the accumulator. Without an accumulator, the result of every calculation would have to be written back to main memory and read in again for the next operation, which is much slower than keeping it in a register.

An accumulator machine, or a 1-operand machine, is a kind of CPU that has one main register, or “the” accumulator of the computer, that stores most information. A computer may have multiple registers for information, but in an accumulator machine most calculations are stored in a specific register. Most early computers were accumulator machines, and many microcontrollers of the current century are essentially just complex accumulator machines.

In order to distinguish a particular register as the accumulator of the computer, it must be used as an implicit operand for mathematical calculations. For example, if a CPU has an instruction to add the value at a particular memory address (i.e. rdaddress), the accumulator reads the value from the memory location at rdaddress and adds it to the value it already holds. The accumulator is not named in the instruction; it is implicit, and the instruction specifies only the memory operand.

Example instructions for accumulators include:
-Clear accumulator and add number from memory location x.
-Add number copied from memory location x to the contents of the accumulator.
-Subtract number copied from memory location x from the contents of the accumulator.
-Clear accumulator and shift contents of register into accumulator.
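The instruction set above can be modeled in a few lines of code. This toy machine is a sketch for illustration, not a model of any particular CPU.

```python
# Toy accumulator machine implementing the example instructions above.
# 'memory' maps addresses to values; the accumulator is the single
# implicit operand of every arithmetic instruction.

class AccumulatorMachine:
    def __init__(self, memory):
        self.memory = memory   # e.g. {0: 5, 1: 7}
        self.acc = 0           # the accumulator register

    def clear_and_add(self, x):
        """Clear accumulator and add number from memory location x."""
        self.acc = self.memory[x]

    def add(self, x):
        """Add number copied from memory location x to the accumulator."""
        self.acc += self.memory[x]

    def subtract(self, x):
        """Subtract number copied from memory location x from the accumulator."""
        self.acc -= self.memory[x]

# Summing values as in the opening example: note that no instruction
# ever names the accumulator explicitly -- it is always implied.
m = AccumulatorMachine({0: 10, 1: 4, 2: 3})
m.clear_and_add(0)
m.add(1)
m.subtract(2)
# m.acc now holds 10 + 4 - 3 = 11
```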

Link Found Between Beetle Attacks, Wildfire

If your summer travels have taken you across the Rocky Mountains, you’ve probably seen large swaths of reddish trees dotting otherwise green forests. While it may look like autumn has come early to the mountains, evergreen trees don’t change color with the seasons. The red trees are dying, the result of attacks by mountain pine beetles.

Mountain pine beetles are native to western forests, and they have evolved with the trees they infest, such as lodgepole pine and whitebark pine trees. However, in the last decade, warmer temperatures have caused pine beetle numbers to skyrocket. Huge areas of red, dying forest now span from British Columbia through Colorado, and there’s no sign the outbreak is slowing in many areas.

The affected regions are so large that NASA satellites, such as Landsat, can even detect areas of beetle-killed forest from space. Today, NASA has released a new video about how scientists can use Landsat satellite imagery to map these pine beetle outbreaks, and what impact the beetle damage might have on forest fire.

As the dog days of summer hit full force, some say the pine beetles have transformed healthy forest into a dry tinderbox primed for wildfire.

For Yellowstone National Park Vegetation Management Specialist Roy Renkin, those worries are nothing new. “I’ve heard [the tinderbox analogy] ever since I started my professional career in the forestry and fire management business 32 years ago,” he said. “But having the opportunity to observe such interaction over the years in regards to the Yellowstone natural fire program, I must admit that observations never quite met with the expectation.”

The idea that beetle-damaged trees increase fire risks seems a logical assumption — dead trees appear dry and flammable, whereas green foliage looks more moist and less likely to catch fire. But do pine beetles really increase the risk of fire in lodgepole pine forest? University of Wisconsin forest ecologists Monica Turner and Phil Townsend, in collaboration with Renkin, are studying the connection in the forests near Yellowstone National Park. Their work — and their surprising preliminary results — are the subject of the NASA video.

First, the researchers used Landsat data to create maps of areas hardest hit by the recent beetle outbreak. The Landsat satellites capture imagery not just in the visible spectrum, but also in wavelengths invisible to the human eye. One such wavelength band combination includes the near infrared, a part of the spectrum in which healthy plants reflect a great deal of energy. By scanning the Landsat near infrared imagery, the team located areas of probable beetle damage.
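One widely used way to exploit the near-infrared band is a vegetation index such as NDVI, which drops as foliage dies. The reflectance values and threshold below are invented, and the researchers’ actual band combination may differ, so treat this as a generic sketch rather than the team’s method.

```python
# Generic sketch: healthy vegetation reflects strongly in the near
# infrared (NIR) and absorbs red light, so the Normalized Difference
# Vegetation Index, NDVI = (NIR - Red) / (NIR + Red), falls as trees die.
# Flagging low-NDVI pixels is one common way to locate possible damage;
# the cutoff and reflectances here are assumptions for illustration.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

DAMAGE_THRESHOLD = 0.4  # assumed cutoff for "possible beetle kill"

# Hypothetical per-pixel reflectances as (nir, red) pairs:
pixels = [(0.45, 0.05), (0.30, 0.20), (0.25, 0.22)]
flagged = [ndvi(n, r) < DAMAGE_THRESHOLD for n, r in pixels]
```

Pixels flagged this way would still need the kind of ground checking the researchers describe below, since other stresses besides beetles can lower a vegetation index.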

Next, they hiked into the areas to confirm that the majority of the affected trees were indeed killed by beetles rather than by other causes. Mountain pine beetles leave telltale signs of their presence, including “pitch tubes” — areas of hardened resin where trees attempt to defend themselves from the boring insects by flowing sticky pitch from the wounds. By scanning the trees for pitch tubes and looking for beetle “galleries” under the bark where the adult insect lays its eggs, the team was able to confirm that they were reading the satellite imagery correctly.

Finally, the University of Wisconsin team compared maps of beetle-killed forest with maps of recent fires.

“Of course, we can’t go out and actually set a fire in beetle damaged areas where we’ve got red, green or no needles,” Townsend said. “We just can’t do that, so we collect data on the ground, we collect data from satellites, and then we build models of how much fuel is there and how burnable it is.”

Their preliminary analysis indicates that large fires do not appear to occur more often or with greater severity in forest tracts with beetle damage. In fact, in some cases, beetle-killed forest swaths may actually be less likely to burn. What they’re discovering is in line with previous research on the subject.

The results may seem at first counterintuitive, but make sense when considered more carefully. First, while green needles on trees appear to be more lush and harder to burn, they contain high levels of very flammable volatile oils. When the needles die, those flammable oils begin to break down. As a result, depending on the weather conditions, dead needles may not be more likely to catch and sustain a fire than live needles.

Second, when beetles kill a lodgepole pine tree, the needles begin to fall off and decompose on the forest floor relatively quickly. In a sense, the beetles are thinning the forest, and the naked trees left behind are essentially akin to large fire logs. However, just as you can’t start a fire in a fireplace with just large logs and no kindling, wildfires are less likely to ignite and carry in a forest of dead tree trunks and low needle litter.

Forest ecologists noted this same phenomenon after the massive Yellowstone wildfires in 1988. As large crown fires swept quickly through the forest, many trees were killed and their needles burned off, but the standing dead tree trunks remained. In the ensuing years, new wildfires have tended to slow and sometimes even burn out when they reach standing dead forest. There simply aren’t enough small fuels to propel the fire.

For Townsend, the results are a further reminder that, in complex ecosystems like that in and around Yellowstone, things aren’t always as they appear at first blush.

“I think it’s important for people not to assume that there are relationships for certain types of features on the landscape,” he says. “It’s easy to think, ‘It’s more damaged so more likely to burn.’ That’s why it’s important to ask questions and not take everything as gospel truth, but go out and see if what we think is happening in our mind is really happening on the ground.”

While pine beetle attacks may not, in fact, increase fire risk in western forests, Townsend believes fire and beetles do share a connection — climate change.

Cold winter nights have traditionally kept beetle numbers in check by killing off larvae as they overwinter in trees. In the last decade, winter nighttime temperatures have not dipped as low — an observation predicted by climate change models. More beetles are surviving to damage larger areas of forest.

Fires, of course, are also affected by warmer temperatures. As temperatures warm and some areas become drier, many climate scientists predict fires to increase in number and size.

Both hold the potential to significantly change Rocky Mountain forests, but, as Townsend noted, both are also key to forest health.

“Both fire and beetle damage are natural parts of the system and have been since forests developed,” Townsend said. “What we have right now is a widespread attack that we haven’t seen before, but it is a natural part of the system.”

Renkin agrees with the assessment. “Disturbances like insect outbreaks and fire are recognized to be integral to the health of the forests,” he said, “and it has taken ecologists most of this century to realize as much. Yet when these disturbances occur, our emotional psyche leads us to say the forests are ‘unhealthy.’ Bugs and fires are neither good nor bad, they just are.”

The Rocky Mountain West has experienced relatively few large fires this year, but the fire season isn’t over yet. The end of the current pine beetle outbreak is likely even further away.

As a result, future summer travelers are likely to see more of these two Rocky Mountain natives — mountain pine beetles and fire.

Landsat is a joint program of NASA and the U.S. Geological Survey.

The study is published in the September issue of the journal BioScience.

By Jennifer Shoemaker, NASA’s Goddard Space Flight Center

Image 1: Mountain pine beetles have killed large areas of forests in the West, such as these dead and dying trees flanking Avalanche Peak on the border between Yellowstone National Park and the Shoshone National Forest in Wyoming. Credit: Roy Renkin, National Park Service

Image 2: Mountain pine beetle image courtesy USDA Forest Service, Rocky Mountain Region Archive. (Image No. 1441137)

Report Describes Psychological Effects Of Cybercrime

More than one-in-four Internet users have given a fake name while online, and more than 20 percent have done something they regret while surfing the Web, according to a groundbreaking study released Wednesday by Internet security firm Symantec.

The study, entitled “Norton Cybercrime Report: The Human Impact”, reveals the widespread problem of global cybercrime, including identity theft, viruses, hacking, online harassment, cyberscams, phishing and sexual predation. 

It also highlights some of the conflicting beliefs about our own unethical or illegal behavior, said Symantec.

Some 7,000 adults in 14 countries participated in the study, nearly two-thirds of whom said they had been a victim of cybercrime.
 
The most victimized are in China (83%), followed by Brazil and India (tied at 76%) and the United States (73%), Symantec said.

The report also includes data about online activities that may seem questionable, such as lying, spying on others, and the illegal downloading of music and videos.

For instance, seventeen percent of the survey’s respondents said they had lied about their age or where they live while they were online.  Nine percent reported lying about their financial or relationship status, and seven percent reported having lied about their appearance.

Although the psychological effects of cybercrime on its victims can vary, the study found that 58 percent of respondents reported feeling angry, 29 percent experienced fear, 26 percent felt helpless and 78 percent reported feeling guilty.

Psychology professor Joseph LaBrie of Loyola Marymount University said cybercrime victims often experience “learned helplessness”.

“It’s like getting ripped off at a garage — if you don’t know enough about cars, you don’t argue with the mechanic. People just accept a situation, even if it feels bad,” Symantec quoted him as saying.

The report reveals a disparity among countries in terms of the costs and complexity to victims in responding to cybercrime. 

In Britain, for example, 59% of respondents said they had been a victim of cybercrime, requiring an average of 25 days and $153 to resolve the matter.

Cybercrime victims in Brazil and India reported significantly different results, with Brazilian victims requiring an average of 43 days and $1,408 to resolve their problem, while victims in India required 44 days and just $114 to settle the matter.

Sweden had the shortest average resolution time, at just nine days, with an average cost of $178.

In addition to examining the effects of cybercrime, the report also analyzed our own online activities, finding they can sometimes cross into hypocritical or even unethical territory, Symantec said.

Roughly one in six respondents said it was “legal” to download music or videos without paying for the content, while 17 percent said they view plagiarism as an acceptable practice.

Nearly one-third had e-mailed or posted pictures of someone else without permission, while 25 percent said they had secretly snooped into another person’s browsing history.

Orla Cox, a security operations manager with Symantec, said she was not surprised by the study’s results with respect to the honesty of the respondents.

“A lot of people, while they want to get information about other people on the web, they themselves would like to remain somewhat anonymous, to hide some of their own information so as to be not too easily identifiable on the web,” she said during an interview with BBC News.

“I don’t think it’s always a bad thing but certainly people are trying to create a whole different identity for themselves for nefarious purposes.”

The full report, entitled “Norton Cybercrime Report: The Human Impact”, can be viewed at http://www.symantec.com/norton/theme.jsp?themeid=cybercrime_report

Iowa State Study Finds Corn Bred To Contain Beta-Carotene Is A Good Source Of Vitamin A

A new Iowa State University study has found that corn bred to contain increased levels of beta-carotene is a good source of vitamin A. The discovery gives added support to the promise of biofortified corn being developed through conventional plant breeding as an effective tool to combat vitamin A deficiency in developing countries.

Beta-carotene is converted in the body to vitamin A. The researchers found that the beta-carotene in the corn was converted to vitamin A at a higher rate than what’s predicted for corn, and higher than the rate for beta-carotene in vegetables such as spinach and carrots.
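The conversion rates can be made concrete with a little arithmetic. Dietary beta-carotene is conventionally assumed to convert at roughly 12 micrograms per microgram of retinol; the study found the corn converted at a more favorable rate. The corn ratio and serving size below are invented for illustration and are not figures from the study.

```python
# Illustrative arithmetic only: converting beta-carotene intake into
# vitamin A as retinol activity equivalents (RAE). A lower ratio means
# more vitamin A per microgram of beta-carotene consumed.

STANDARD_RATIO = 12.0     # ~12 ug beta-carotene per ug retinol, the usual
                          # assumption for beta-carotene in food
ASSUMED_CORN_RATIO = 6.0  # hypothetical, more favorable ratio for the corn

def retinol_equivalents(beta_carotene_ug: float, ratio: float) -> float:
    """Micrograms of retinol activity equivalents from an intake."""
    return beta_carotene_ug / ratio

serving_ug = 1200.0  # hypothetical beta-carotene in one serving of corn
standard = retinol_equivalents(serving_ug, STANDARD_RATIO)
improved = retinol_equivalents(serving_ug, ASSUMED_CORN_RATIO)
```

Under these assumed numbers, halving the conversion ratio doubles the vitamin A delivered by the same serving, which is why the bioavailability finding matters for the biofortification program.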

Wendy White, an ISU associate professor of food science and human nutrition, led the six-week study conducted at Iowa State’s Nutrition and Wellness Research Center. The results validate the promise of ‘orange’ maize that will soon be released to combat vitamin A deficiency in sub-Saharan Africa.

According to a 2009 World Health Organization estimate, vitamin A deficiency is a public health problem in more than half of the world’s countries, with Africa and Southeast Asia hardest hit. Medical researchers have reported vitamin A deficiency to be one of the most serious forms of malnutrition in developing countries; it can cause blindness, poor immune function and even premature death, particularly in young children.

Working with HarvestPlus on biofortified corn

The effort to biofortify corn with beta-carotene is being led by HarvestPlus (http://www.harvestplus.org/) – a global research initiative directed, in part, by the Washington, D.C.-based International Food Policy Research Institute.

“Biofortification is a revolutionary approach to combating micronutrient malnutrition in developing countries and it has the potential to be self-sustaining,” White said. “The seeds are bred by plant breeders to be naturally high in key micronutrients, such as vitamin A, zinc and/or iron. And then the seeds will ultimately be distributed to poor farmers in developing countries and they’ll be able to reproduce the seeds so they can share them with their communities.

“This study answered a major feasibility concern for the biofortification program because plant breeders were quickly successful in ramping up the beta-carotene content in the corn, but then the question was, ‘Would it be available to be absorbed and utilized by people?’” she continued. “So what we’ve shown is the beta-carotene is bioavailable to be converted to vitamin A in the body, and much more so than previously expected.”

The study was posted online this month by the American Journal of Clinical Nutrition, which is published by the American Society for Nutrition. Iowa State graduate students Shanshan Li and Angela Nugroho, and Purdue University researcher Torbert Rocheford — who was at the University of Illinois at Urbana-Champaign at the time the research was conducted — collaborated with White on the study. An abstract is available at: http://www.ajcn.org/cgi/content/abstract/ajcn.2010.29802v1.

The researchers had their six healthy female subjects, between the ages of 18 and 30, consume 250-gram portions of maize porridge three times at two-week intervals. Each subject consumed the beta-carotene biofortified maize porridge, as well as two white maize control porridges that were naturally devoid of beta-carotene, but contained known amounts of added beta-carotene or vitamin A. Blood samples were drawn after they ate each porridge to determine the amount of vitamin A that was absorbed in the blood.

An important step in fighting malnutrition

White says the study’s findings provide an important step in the process of making the biofortified corn available to the people who desperately need vitamin A in their diets.

“These [their subjects] were mostly graduate students based in the U.S. who were screened for excellent health. So this study was conducted under ideal conditions,” White said. “And so the next step — knowing that under ideal conditions the beta-carotene can be well absorbed — is to take it into a more field setting.”

White reports that a pilot program is already under way in Zambia to feed the beta-carotene-biofortified maize to young children to increase their vitamin A intake. HarvestPlus is conducting that project and supported the development of the maize for the Iowa State study.

The HarvestPlus Challenge Program was launched when it became the first recipient of funding for biofortification research granted by the Bill and Melinda Gates Foundation.

Floppy Disk

A floppy disk is a thin, flexible magnetic storage medium housed in a square or rectangular plastic cover. It is read and written by a floppy disk drive in the computer. A drive has five major parts. The read/write heads sit on both sides of the disk and move together. The drive motor is a small spindle motor that spins the disk from its center, while a stepper motor positions the heads over the track to be read or written. The mechanical frame is a system of levers that opens the disk’s protective window, along with a button to eject the disk. The circuit board contains all of the drive’s electronics; every other part relies on it.

In operation, the small motor in the drive rotates the disk at a fixed speed. At the same time, a second motor-driven mechanism moves the magnetic read/write head along the surface of the disk. To write data onto the disk, a current is sent through a coil in the head; the coil’s magnetic field magnetizes spots on the rotating disk, and the pattern of changes in magnetization encodes the digital data. To read data, the tiny voltages induced in the head coil by the disk’s magnetization are detected and sent to the floppy disk controller. The controller recovers the data from the stream of pulses coming from the drive, decodes it, checks it for errors, and then sends it on to the host computer system.

A blank floppy disk carries a uniform, featureless coating of magnetic oxide. Before use, a pattern of magnetized tracks, each broken up into sectors, is written onto the disk so that data can later be located. The tracks are concentric rings around the disk, with gaps left open for data to be written. Some of the gaps are filled with padding bytes that absorb speed variation; the controller discards them when new data is written. Each sector of data has a header that identifies the sector’s location on the disk. A cyclic redundancy check is written in the headers so that errors can be detected when reading data. Most computer systems include built-in programs for formatting blank disks.
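The cyclic check in standard IBM floppy formats is a CRC-16/CCITT (polynomial 0x1021) computed over the header bytes. The sketch below is only an illustration of the idea; the header field values are made up, not taken from any real disk.

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16/CCITT (polynomial 0x1021, initial value 0xFFFF),
    the check used over sector headers in standard IBM floppy formats."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# An illustrative sector header: ID mark, cylinder 20, head 1, sector 7,
# size code 2 (512 bytes). The drive stores the CRC right after the header.
header = bytes([0xFE, 20, 1, 7, 2])
checksum = crc16_ccitt(header)

# On read-back, recomputing the CRC over header + stored checksum yields 0
# if nothing was corrupted; any other result flags a read error.
assert crc16_ccitt(header + checksum.to_bytes(2, "big")) == 0
```

The zero-remainder trick is what makes the check cheap for the controller: it never needs to compare two values, it only needs to run the data plus the stored check through the same circuit and test for zero.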

The floppy disk drastically changed the way computers store information and became standard in most home and personal computers. Before hard disks were common, floppy disks also stored the computer’s operating system as well as other data. As data sizes grew, more and more floppy disks were needed per task, and CD-ROMs and Zip drives were introduced to address the problem. Zip drives, however, were expensive and never settled on a standard: drives and media changed constantly and were therefore often incompatible. USB flash drives later became popular as well. Today, manufacturers and retailers have sharply reduced the availability of floppy disk drives in computers, and of the disks themselves.

The first known floppy disks were created by IBM and became available in 1971. Shugart Associates introduced the first 5.25-inch floppy disk drive and media in 1976. Production grew quickly and then faded. In 1984, however, the 1.2-megabyte dual-sided floppy disk was introduced, which influenced the development of more advanced double-sided disks. Throughout the 1990s, many attempts were made to introduce newer disk formats based on the universal 3.5-inch format. While it was thought that floppy disks would be replaced, they never were.

To use a floppy disk, it is inserted into the drive, medium opening first. A lever then clamps down to engage the motor and heads with the disk. The circular hole in the center of the disk accepts the drive spindle, which spins the disk so the head can read and write data. Inside the jacket, two layers of fabric reduce friction between the medium and its cover. Disks of different sizes are generally incompatible: a 3.5-inch drive, for example, cannot read or write an 8-inch disk. Data is written to a floppy disk in sectors (angular blocks of the disk) within tracks (concentric rings at a constant radius). A standard 3.5-inch high-density floppy disk uses 512 bytes per sector, 18 sectors per track, 80 tracks per side and two sides, for a total of 1,474,560 bytes per disk. There are also many other floppy disk formats and types, which work in different ways and at different speeds.
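The quoted geometry is easy to sanity-check, and the same numbers show how a (cylinder, head, sector) address maps to a linear sector number, which is the conventional way software addresses sectors in order. The helper name `chs_to_lba` is ours, chosen for illustration:

```python
# Geometry of a standard 3.5-inch high-density floppy disk, as quoted above.
BYTES_PER_SECTOR = 512
SECTORS_PER_TRACK = 18
TRACKS_PER_SIDE = 80   # tracks are also called cylinders
SIDES = 2

capacity = BYTES_PER_SECTOR * SECTORS_PER_TRACK * TRACKS_PER_SIDE * SIDES
assert capacity == 1_474_560  # the 1,474,560-byte figure quoted above

def chs_to_lba(cylinder: int, head: int, sector: int) -> int:
    """Map a (cylinder, head, sector) address to a linear sector number.
    Sectors are conventionally numbered from 1 within a track."""
    return (cylinder * SIDES + head) * SECTORS_PER_TRACK + (sector - 1)

assert chs_to_lba(0, 0, 1) == 0  # the very first sector on the disk
assert chs_to_lba(79, 1, 18) == capacity // BYTES_PER_SECTOR - 1  # the last
```

The 1.44 MB usually printed on the label comes from the same number, using the mixed convention of 1,474,560 / (1,000 × 1,024).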

One advantage of floppy disks is that their magnetism is low enough that disks stored close together will not corrupt one another. They are also fairly affordable. One of the biggest problems with the floppy disk is its vulnerability: even inside its closed plastic case, the disk medium is very sensitive to dust, condensation and extreme temperatures. As with all magnetic storage, it is also vulnerable to external magnetic fields.

Observations Could Explain Galaxy Evolution

Researchers have observed the signs of distant dwarf galaxies being swallowed up by spiral galaxies, and their findings “could shed further light on the evolution of galaxies,” according to BBC News.

A team of researchers, led by David Martinez-Delgado of Germany’s Max Planck Institute for Astronomy (MPIA), noted that the dwarf galaxies that are being absorbed tend to form vine-like structures and long strands of stars known as stellar streams that are produced due to tidal forces.

While the phenomenon has been seen in nearby “Local Group” galaxies for years, this marks the first time that it has been spotted in spiral galaxies farther away–some as far as 50 million light years from Earth, according to BBC reports on Tuesday.

“As part of a pilot survey for such interaction signatures, we have carried out ultra deep, wide field imaging of 8 isolated spiral galaxies in the Local Volume, with data taken at small (D = 0.1–0.5 m) robotic telescopes that provide exquisite surface brightness sensitivity (μlim(V) ∼ 28.5 mag/arcsec²),” Martinez-Delgado and his colleagues wrote in The Astronomical Journal.

“This initial observational effort has led to the discovery of six previously undetected extensive (to ∼ 30kpc) stellar structures in the halos surrounding these galaxies, likely debris from tidally disrupted satellites,” they added. “In addition, we confirm and clarify several enormous stellar over-densities previously reported in the literature, but never before interpreted as tidal streams.”

The research was conducted using lower-powered, remotely controlled telescopes belonging to amateur astronomers in the U.S. and Australia, according to the BBC. The British news agency notes that the telescopes had apertures ranging from 10 cm to 50 cm and were equipped with commercially available charge-coupled device (CCD) cameras.

“Even this pilot sample of galaxies exhibits strikingly diverse morphological characteristics of these extended stellar features: great circle-like features that resemble the Sagittarius stream surrounding the Milky Way, remote shells and giant clouds of presumed tidal debris far beyond the main stellar body, as well as jet-like features emerging from galactic disks,” the researchers wrote. “Together with presumed remains of already disrupted companions, our observations also capture surviving satellites caught in the act of tidal disruption.”

“The common existence of these tidal features around ‘normal’ disk galaxies and the morphological match to the simulations constitutes new evidence that these theoretical models also apply to a large number of other Milky Way-mass disk galaxies in the Local Volume,” they concluded.

Joining Martinez-Delgado on the research team were R. Jay Gabany of the Black Bird Observatory, Ken Crawford of the Rancho del Sol Observatory, Stefano Zibetti and Hans-Walter Rix of the MPIA, Steven R. Majewski and David A. McDavid of the University of Virginia, Taylor S. Chonis from the University of Texas, as well as researchers from Spain’s Instituto de Astrofísica de Canarias, the Massachusetts Institute of Technology (MIT), Germany’s Argelander Institut für Astronomie, the Observatories of the Carnegie Institution of Washington, and the University of Cambridge.

Image Caption: Stellar streams around the galaxy M 63: remnants of a satellite galaxy that M 63 has swallowed. The central part is an ordinary positive image; in the outer regions, the negative of the image is shown. In this way, the faint structures that are the target of this survey are more readily discerned. This galaxy’s distance from Earth is around 30 million light-years. The new survey has, for the first time, shown the presence of such tell-tale traces of spiral galaxies swallowing smaller satellites for galaxies more distant than our own “Local Group” of galaxies. Image Credit: R. Jay GaBany (http://www.cosmotography.com/images/small_ngc5055.html) in collaboration with David Martinez-Delgado.

Solar Impulse Plane To Make More Flights

The Solar Impulse team, which successfully conducted a round-the-clock flight powered only by the sun, said Tuesday that it is planning three test flights across Switzerland as it prepares the plane for longer journeys.

“Solar Impulse HB-SIA, the solar-powered aircraft… will undertake three flights, Payerne-Geneva, Geneva-Zurich and Zurich-Payerne, with no fuel,” the team said in a statement.

The team hopes the tests will “contribute to opening a new era of aviation.”

The plane will first fly from the Payerne military base in western Switzerland and land at Geneva International Airport. It will then fly from Geneva to Zurich Airport in northern Switzerland, and finally from Zurich back to Payerne.

The date for the test flights has not been confirmed, but a spokeswoman told the AFP news agency it would most likely take place in September.

The flights will train the team to work outside its base and to collaborate with international airports. “The objective is to prepare the long-distance flights scheduled for next year,” the team said.

The Solar Impulse project came into existence seven years ago with its sights set on ocean crossings, transcontinental flights and global flights by 2014.

Chronic Lyme Disease: Does It Even Exist?

(Ivanhoe Newswire) – Lyme disease is no laughing matter. The disease affects the joints, heart, and even the nervous system, and it is assumed that when it is deemed “chronic” these symptoms worsen. But some scientists believe that chronic Lyme disease may not even exist, and that doctors treating it may be causing more harm than good.

Lyme disease is a multi-system infection caused by the bacterium Borrelia burgdorferi, and treatment usually consists of 10-28 days of antibiotics. Patients treated for chronic Lyme disease, however, receive prolonged and intensive courses of oral and intravenous antibiotics, which can cause blood clots and life-threatening infections. “Lyme literate” groups, such as the International Lyme and Associated Diseases Society, define chronic Lyme disease as a debilitating illness caused by a persistent infection of Borrelia burgdorferi, with symptoms including fatigue, difficulty concentrating, headaches, and irritability.

Dr. Michael Johnson of the Hospital of Saint Raphael in New Haven, CT, and Dr. Henry Feder of the University of Connecticut Health Center researched how frequently doctors in Connecticut diagnose and treat chronic Lyme disease. They found that of the 258 primary care physicians who responded to the survey, about half thought that chronic Lyme disease isn’t a legitimate illness, 48 percent were undecided, and about 2.1 percent said they did diagnose and treat it. Those doctors treated the disease for only about 20 weeks, so they do not fit the “Lyme literate” profile, under which patients are treated for months to years.

The research suggests that for the most part chronic Lyme disease isn’t treated for months to years, and that maybe there are legitimacy issues with the “Lyme literate” organizations. However, chronic Lyme disease is drawing attention, and now some insurance companies are willing to cover the costs for treatment of chronic Lyme disease.

SOURCE: Journal of Pediatrics, published online September 1, 2010

Researcher Breeds First Blue Hibiscus

Rare hibiscus color is achieved after four years

Dr. Dariusz Malinowski is seeing blue, and he is very excited.

For four years, Malinowski, an AgriLife Research plant physiologist and forage agronomist in Vernon, has been working with collaborators Steve Brown of the Texas Foundation Seed and Dr. William Pinchak and Shane Martin with AgriLife Research on a winter-hardy hibiscus breeding project.

The project was first a private hobby of the inventors and became a part of the strategic plan of the Texas AgriLife Research and Extension Center at Vernon in 2009. The flower commercialization is a part of the research on non-traditional or under-utilized crops that have value because of drought tolerance.

Malinowski’s breeding goal has been to create a blue-flowering winter-hardy hibiscus.

“A blue pigment does not exist in this species, thus hybridizers have not been successful so far in creating a plant with blue flowers,” he said. “There are a couple of recently introduced cultivars with plum and lavender flower color.”

But now Malinowski has managed to breed a flower with the elusive color.

He and his collaborators have created a number of lines with unique flower and foliage shape and color. The new hibiscus hybrids range in color from white through different shades of pink, lavender, bluish, red and magenta tones, and some of them have combinations of two or even three colors.

One line has dark maroon foliage with moderately big, white flowers that blend into a pink center with darker veins, Malinowski said. Flower size of these hybrids varies from miniature blooms 2 inches in diameter to the size of dinner plates, about 12 inches in diameter.

Malinowski has been using these cultivars in his breeding project for several generations. This year, they finally had one plant bloom with almost blue flowers, a significant breakthrough in efforts to create a blue hibiscus cultivar.

“It took four years of work and more than 1,000 crosses among three winter-hardy hibiscus species to achieve this goal of creating an almost-blue flowering hibiscus hybrid,” he said. The new hybrid is not perfect yet, Malinowski said.

“The flowers get a fantastic blue hue in shade, but in full sunlight they are still plum-lavender-bluish,” he said.

Brown said it is important to note that in the world of ornamentals, “blue” is interpreted to have a wide range of hues. Most ornamental blues have a more purple or lavender cast.

“There are very few true blue flowers in any ornamental cultivar,” he said. “Although I would call this flower ‘almost blue’ as Dariusz has, there is no question that this development is unique in known hardy hibiscus color ranges.

“My expectation is that we will see more vibrant colors in next year’s F1s (cultivars) using this line as a parent,” Brown said.

Malinowski said he will use this plant as a parent in his breeding project this summer, with the goal to stabilize the blue color in full sunlight and increase flower size from the current 7 inches to the “magic” 12-inch diameter.

Breeding of ornamental plants is not the major research area of Malinowski, but he said he enjoys new challenges and the benefits of combining his private hobby with business.

“I never thought I would be an expert in breeding winter-hardy hibiscus,” he said. “The knowledge I have gained during the past few years of intensive work on hardy hibiscus helps me reach most of the breeding objectives in a relatively short time.”

What is next? Malinowski and his collaborators have a new challenge – to create an orange flowering hardy hibiscus.

This goal seems to be even more difficult, but not impossible, Malinowski said. It will require hybridization with a distantly related hibiscus species, which has shades of orange flowers. The researchers hope that with the help of molecular genetic tools they will be able to meet this objective.

Image Caption: Dr. Dariusz Malinowski, Texas AgriLife Research plant physiologist in Vernon, has bred a unique blue shade of hibiscus. (Credit: Texas AgriLife Research photo by Dr. Dariusz Malinowski)

SMOS Water Mission Reveals Insight Into Amazon Plume

ESA’s SMOS water mission has taken another step forward by demonstrating that it will lead to a better understanding of ocean circulation. Using preliminary data, scientists can clearly see how surface currents affect the ‘Amazon plume’ in the open sea.

The Soil Moisture and Ocean Salinity (SMOS) mission has been delivering observations of ‘brightness temperature’ to the science community since mid-July. As a measure of radiation emitted from Earth’s surface, this information can be used to derive global maps of soil moisture every three days and maps of ocean salinity at least every 30 days.

By consistently mapping soil moisture and ocean salinity, SMOS will advance our understanding of the exchange processes between Earth’s surface and atmosphere – the water cycle – and help improve weather and climate models. In addition, these data will be of practical use for agriculture and water resource management.

Soil moisture and ocean salinity data products will be released later this month, but scientists are very encouraged by what they are already seeing.

Talking about observations that relate to ocean salinity, Nicolas Reul from Ifremer said, “One of the dramatic steps forward achieved with SMOS is that we now have the ability to track the movement of low-salinity surface waters, particularly those resulting from large ‘plumes’ such as the Amazon.

“Observations between mid-July and mid-August clearly show how the North Brazilian Current transports fresh water from the Amazon River as the current flows across the mouth of the river. These observations confirm the excellence of the data we are already getting from SMOS.”

Discharge from the Amazon River, the Amazon plume, amounts to around 15% of the global input of fresh water into the ocean. The migration of the plume, however, varies seasonally. During the first half of the year the river water generally disperses over a broad area to the northwest, towards the Caribbean Sea, but in the second half of the year the plume flows around the North Brazil Current and is carried back eastwards.

The migration of the Amazon plume results in significant differences in the salinity of the surface ocean water, so it was expected that SMOS would reveal the intricacies of these variations.

“Over the last weeks we have been able to track how the Amazon freshwater plume curves back on itself at this time of year as large North Brazilian Current eddies form northwest of the river mouth,” said Dr Reul.

“At the same time, the Orinoco plume has also been clearly visible as a tongue of fresh water entering the Tropical Atlantic along the windward side of the Caribbean islands.

“These observations are a good example of how well SMOS is performing and they show us that the mission can provide data on temporal sea-surface variability at scales of less than a week.”

Along with temperature, variations in ocean salinity drive global three-dimensional ocean-circulation patterns. This conveyor-like circulation is an important component of Earth’s heat engine and crucial in regulating weather and climate. Ocean salinity data from SMOS are therefore expected to greatly improve our knowledge of the conditions that influence these circulation patterns and thus climate.

Image 1: The Soil Moisture and Ocean Salinity (SMOS) mission makes global observations of soil moisture over Earth’s landmasses and salinity over the oceans. Variations in soil moisture and ocean salinity are a consequence of the continuous exchange of water between the oceans, the atmosphere and the land – Earth’s water cycle. Credits: ESA – AOES Medialab

Image 2: The image, derived from SMOS data in July 2010, clearly shows the plume of fresh Amazon River water as it enters the Atlantic Ocean. Credits: I. Corbella, UPC / Google Earth

Study Challenges Pain Relieving Effects Of Sugar

The practice of giving infants a small amount of sugar when performing invasive procedures does not reduce the amount of pain they feel, and could actually lead to brain damage, according to a new study published Wednesday in the medical journal The Lancet.

Previous studies, including a 2001 series of clinical trials, found that giving one-tenth of a gram of sucrose to babies before a procedure such as installing an antibiotic drip or taking a blood sample had an analgesic effect.

However, according to Guardian Health Correspondent Denis Campbell, the new study–which was funded by the Medical Research Council and completed at University College London (UCL)–discovered that previous research was “flawed” because the researchers involved “relied on the change in the baby’s facial expression upon receiving the sugar, from puckered-up to relaxed, as proof that it works.”

The UCL team, led by Dr. Rebeccah Slater of the school’s Neuroscience, Physiology and Pharmacology department, simulated drawing blood samples from 59 newborns by pricking their heels with a small blade. The subjects were given either sterile water or a solution of concentrated sugar, delivered orally through a small syringe, and the researchers monitored pain activity in the brain and spine using electrodes. They observed no noticeable difference between the two groups, according to Campbell’s September 2 report.

“The absence of evidence for an analgesic action of sucrose in this study, together with uncertainty over the long-term benefits of repeated sucrose administration, suggest that sucrose should not be used routinely for procedural pain in infants without further investigation,” Slater’s team said, according to Wednesday reports from French news agency AFP.

“Our findings indicate that sucrose is not an effective pain relief drug. This is especially important in view of the increasing evidence that pain may cause short and long-term adverse effects on infant neurodevelopment,” Slater also told Campbell. “While we remain unsure of the impact sucrose has, we suggest that it is not used routinely to relieve pain in infants without further investigation.”

Nanotechnology Yields Breakthrough In Cancer Research

Researchers clear hurdle on path toward gene-therapy treatment for disease

One of the most difficult aspects of working at the nanoscale is actually seeing the object being worked on. Biological structures like viruses, which are smaller than the wavelength of light, are invisible to standard optical microscopes and difficult to capture in their native form with other imaging techniques.
 
A multidisciplinary research group at UCLA has now teamed up to not only visualize a virus but to use the results to adapt the virus so that it can deliver medication instead of disease.
 
In a paper published last week in the journal Science, Hongrong Liu, a UCLA postdoctoral researcher in microbiology, immunology and molecular genetics, and colleagues reveal an atomically accurate structure of the adenovirus that shows the interactions among its protein networks. The work provides critical structural information for researchers around the world attempting to modify the adenovirus for use in vaccine and gene-therapy treatments for cancer.
 
To modify a virus for gene therapy, researchers remove its disease-causing DNA, replace it with medications and use the virus shell, which has been optimized by millions of years of evolution, as a delivery vehicle.
 
Lily Wu, a UCLA professor of molecular and medical pharmacology and co-lead author of the study, and her group have been attempting to manipulate the adenovirus for use in gene therapy, but the lack of information about receptors on the virus’s surface had hampered their quest.
 
“We are engineering viruses to deliver gene therapy for prostate and breast cancers, but previous microscopy techniques were unable to visualize the adapted viruses,” Wu said. “This was like trying to piece together the components of a car in the dark, where the only way to see if you did it correctly was to try and turn the car on.”
 
To better visualize the virus, Wu sought assistance from Hong Zhou, a UCLA professor of microbiology, immunology and molecular genetics and the study’s other lead author. Zhou uses cryo-electron microscopy (cryoEM) to produce atomically accurate three-dimensional models of biological samples such as viruses.
 
Wu, who is also a researcher at the California NanoSystems Institute (CNSI) at UCLA, learned of Zhou’s work after he was jointly recruited to UCLA from the University of Texas Medical School at Houston by the UCLA Department of Microbiology, Immunology and Molecular Genetics and UCLA’s CNSI.
 
About a year ago, once the transfer of Zhou’s lab was complete, Sok Boon Koh, one of Wu’s students, sought out Zhou’s group for their expertise and initiated the collaboration.
 
“This project exemplifies my excitement about being part of an institute as innovative as CNSI,” Zhou said. “Not only am I able to work with state-of-the-art equipment, but because CNSI is the hub for nanotechnology research and commercialization at UCLA, I have the opportunity to collaborate with colleagues across many disciplines.”
 
Working in the Electron Imaging Center for Nanomachines at the CNSI, a lab run by Zhou, the researchers used cryoEM to create a 3-D reconstruction of the human adenovirus from 31,815 individual particle images.
 
“Because the reconstruction reveals details up to a resolution of 3.6 angstroms, we are able to build an atomic model of the entire virus, showing precisely how the viral proteins all fit together and interact,” Zhou said. An angstrom is roughly the distance between the two hydrogen atoms in a water molecule, and the entire adenovirus is about 920 angstroms in diameter.
 
Armed with this new understanding, Wu and her group are now moving forward with their engineered versions of adenovirus to use for gene therapy treatment of cancer.
 
“This breakthrough is a great leap forward, but there are still many obstacles to overcome,” Wu said. “If our work is successful, this therapy could be used to treat most forms of cancer, but our initial efforts have focused on prostate and breast cancers because those are the two most common forms of cancer in men and women, respectively.”
 
The group is working with the adenovirus because previous research has established it as a good candidate for gene therapy due to its efficiency in delivering genetic materials inside the body. The virus shell is also a safe delivery vehicle; tests have shown that the shell does not cause cancer, a problem encountered with some other virus shells. The adenovirus is relatively non-pathogenic naturally, causing only temporary respiratory illness in 5 to 10 percent of people.
 
CryoEM enables such a high-resolution reconstruction of biological structures because samples, in water, are imaged directly. In contrast, with X-ray crystallography (the conventional technique for atomic-resolution models of biological structures), researchers grow crystal structures replicating the sample and then use diffraction to solve the crystal structure. This technique is limited because it is difficult to grow crystals for all proteins, samples for X-ray crystallography need to be very pure and uniform, and crystals of large complexes may not diffract to high resolution. These limitations left critical areas of the adenovirus surface unresolved by X-ray crystallography.
 
The study was funded by the National Cancer Institute and the U.S. Department of Defense.

Cough Medicines Should Be Restricted: FDA

US health regulators are working out possible restrictions on popular cough suppressants in hopes of stopping the cases of abuse that send thousands of people to the hospital each year.

On Tuesday, the Food and Drug Administration posted its review of dextromethorphan, an ingredient found in over a hundred non-prescription medications that is often abused for its euphoric effects. “Robotripping,” as it is known among abusers, involves taking more than 25 times the recommended dose of a cold medicine. The problem is most often associated with teenagers.

High doses of the drug can cause increased blood pressure, heart rate and fever. Abusers can also suffer side effects from other ingredients found in cough medicines, such as acetaminophen, which is linked to liver damage.

The drug is typically viewed as a safe, easy-to-use medication. But the psychoactive effect it can produce “is sought after by those seeking to alter their mental state,” the FDA states in its review.

Inappropriate use of the ingredient was linked to roughly 8,000 emergency room visits in 2008, according to the FDA. That number was up by more than 70 percent from 2004.

Analysis by the FDA concluded that dextromethorphan is abused less often than the popular painkiller codeine, but more often than pseudoephedrine, a cold medicine ingredient that can be processed into methamphetamine.

The FDA is now reconsidering how it regulates the drug after the Drug Enforcement Administration raised concerns about increasing abuse among adolescents.

The FDA will ask a panel of outside experts on September 14 whether dextromethorphan should be available only as a prescription. The agency is not required to follow the panel’s advice, though it usually does.

Mandating a prescription for cough suppressants that have dextromethorphan would be a major upset for over-the-counter drug makers, which use the drug in dozens of cold medications.

Brands such as Wyeth’s Dimetapp, Bayer’s Alka-Seltzer Flu Plus and Procter & Gamble’s Vicks cough medicines all contain dextromethorphan. The drug is available in pills, capsules, liquids and other forms.

Most industry experts do not believe the FDA will impose a prescription requirement on those products because of the enormous workload it would create for doctors and pharmacists.

A possible alternative would be to place such medicines behind the counter, although the FDA review did not disclose any specific proposals.

The over-the-counter industry supports prohibiting sales of the medicines to people under the age of 18. Such age restrictions would require legal changes, and the industry’s trade association has lobbied on the issue at the state and federal levels.

The Consumer Healthcare Products Association has been working to reduce abuse of over-the-counter drugs since 2003 by sponsoring educational campaigns aimed at parents, teenagers and school nurses, a spokeswoman for the association told the Associated Press.

New Mothers Have Problems With Sleep Quality, Not Hours

The popular consensus may be that new mothers do not get enough sleep, but that may be wrong.

A new study, published in the American Journal of Obstetrics & Gynecology, suggests new mothers may often get a decent amount of sleep in their babies’ first few months, but it is usually not good-quality sleep.

The study, which followed a group of new moms, found that on average, moms got just over 7 hours of sleep per night during their babies’ first four months. That is within what is generally recommended for adults, and is more than what the average American gets, based on past studies.

However, the study also found that this sleep is frequently disrupted, with most new mothers awake for a total of about two hours during the night.

The results may not sound surprising, especially to parents. But the study does challenge assumptions about new mothers’ typical sleep patterns, according to lead researcher Dr. Hawley E. Montgomery-Downs, an assistant professor of psychology at West Virginia University in Morgantown.

The assumption has been that most new mothers are sleep-deprived, Montgomery-Downs told Reuters Health.

So advice on combating daytime fatigue has focused on countering sleep deprivation, she said. One popular piece of advice is for moms to “nap when your baby naps.”

The results of the study, however, suggest that new moms’ highly fragmented sleep is what’s behind their daytime fatigue.

The sleep pattern in new mothers is very similar to what is seen with sleep disorders, such as sleep apnea, where people are in bed enough hours each night, but get very little restorative, good-quality sleep.

Sleep occurs in cycles that each last about 90 to 120 minutes. Depending on how often a new mother is waking up, she may get few or no full cycles of sleep during the night, Montgomery-Downs noted, adding that a quick daytime nap is not likely to counter that.

“We need to think about what kinds of strategies can help consolidate sleep” for new moms, she said. One strategy, she suggested, could be for breastfeeding moms to find time to pump milk and store it in bottles so that they do not have to be the ones to always get up with the baby.

And although quick naps do not offer much help, she said that “if you’re one of the lucky parents” whose infants may nap for at least two straight hours, taking that time to sleep could be helpful.

The study involved 74 new mothers who were followed between either the second and 13th week of their infants’ lives, or between the ninth and 16th week. The mothers kept track of their sleep patterns using sleep journals, and also wore a wristwatch-like device called an actigraph that recorded their movements during the night.

Researchers found that the women’s average sleep time, at 7.2 hours, was about what it should be. The problem lay with sleep fragmentation.

Only a few mothers tried napping to make up for the lack of a full night’s rest. By the third week of their infants’ lives, fewer than half of the women in the study said they took daytime naps, and among those who did, the average was only twice per week.

Daytime fatigue in new mothers is a real concern for several reasons, according to Montgomery-Downs. One reason is that, in some women, sleep problems and exhaustion may contribute to postpartum depression. Fatigue can also hinder the ability to drive safely and could hurt work performance.

Fragmented sleep and daytime fatigue in new mothers call for a reconsideration of maternity leave in the US, Montgomery-Downs argues. Currently, national policy requires workplaces with 50 or more employees to offer up to 12 weeks of unpaid leave.

Many women, Montgomery-Downs noted, may have to go back to work at a time when “they should really be taking care of themselves.”

23rd ECNP Congress: Pioneering CNS Research — Translating Neuroscience Into Clinical Progress

Aug. 28-Sept. 1, 2010, Amsterdam, the Netherlands

Mental disorders, such as depression, anxiety disorders, addiction and schizophrenia, are the core challenge of most health care systems around the world. In the EU alone, 27% of the total adult population (83 million citizens) suffer from mental disorders each year. Depression alone affects almost 20 million, ranking as the most disabling of all diseases in the EU. Unless appropriately treated, mental disorders are typically associated with a wide range of complications and sequelae for the people affected, their partners and families, and society as a whole, and they can be lethal. Suicide, a frequent complication of depression and other mental disorders, is a major cause of premature death in Europe, with over 160,000 completed suicides every year; rates of attempted suicide are at least 10 times higher. Nevertheless, despite the tremendous suffering and burden of mental disorders and the fact that they are treatable, the majority of people with mental disorders in the EU remain untreated.

The EU over the past three years has recognized with increasing emphasis the urgent need to change this, calling for concerted mental health action on all levels: science and research, improved public health and outreach activities, and improved policies in its member states. The 2008 ‘European Pact for Mental Health and Well-being’ reflects the EU’s strong commitment to this mission, highlighting that mental health and well-being in the population is a key resource for the success of the EU as a knowledge-based society and economy. Confronted with the high and increasing prevalence of mental disorders and their currently deficient care in many areas, health care systems and schemes are encouraged to act, striving for improved early recognition and diagnosis and ensuring the provision of adequate, state-of-the-art treatment and comprehensive rehabilitation programmes for all.

Mental disorders are “complex disorders of the brain”, bound to the way we perceive, think, feel and behave. Understanding such brain dysfunctions is of core relevance for the prevention and treatment of mental disorders. The interdisciplinary field of neuropsychopharmacology links the core disciplines of neuroscience, psychology and pharmacology and is devoted to this aim. It spans basic and clinical neuroscience, from molecular and systems-level approaches, through the establishment of improved diagnostic and treatment standards, to fostering their implementation in the health care system for patients with neurological and mental disorders. The European College of Neuropsychopharmacology (ECNP) is Europe’s largest and most comprehensive interdisciplinary forum in this field, dedicated to translating new knowledge on fundamental disease mechanisms into clinical practice and paving the way for improved pharmacological and non-drug treatments for the prevention and treatment of mental disorders and disorders of the brain in general.

The 23rd ECNP Congress 2010 in Amsterdam is Europe’s leading and largest scientific meeting on mental health, providing insight into the latest achievements in brain research and treatment and offering a unique, stimulating forum for scientists and clinicians in the mental health field.

Mental health: an ever-changing challenge

Mental disorders cause immense suffering for individuals, families and communities, and represent by far the leading cause of disability-associated burden of all diseases in the EU. Across the EU, pressure is being put on the health, social welfare and educational systems as well as on the labour market, employers and the economy in general. Due to the financial and economic crisis, the situation has worsened since 2008 and poses new and greater challenges to these systems.

In its communication ‘Driving European recovery’, the European Commission highlights the need to support the EU population through the crisis and to reduce its human cost [1]. A key aspect in this context is to minimise the crisis’s harmful impact on mental health, which rests on a complex interaction between neuropsychopharmacological and psychosocial factors. The crisis is believed not only to harm the provision of the already deficient health care system, but also to worsen several socio-economic determinants of mental health and well-being. While protective psychosocial factors, such as a stable professional and social life, are weakened, risks increase: impaired social contacts due to unemployment, social isolation, financial hardship, lack of personal recognition, and fear and uncertainty about the future. The European Union has therefore identified specific challenges and called for action in five priority areas: prevention of depression and suicide, mental health in youth and education, mental health in older people, mental health in workplace settings, and combating stigma and social exclusion.

The human brain, the most complex structure ever investigated by science, is the basis of our behaviour, mental functions and inner life. Over the past decades, new techniques for investigating brain structure and function have become available, allowing further penetration of the mysteries of human feelings, thoughts and emotions, and consequently even human values, relationships and beliefs. Today scientists are truly beginning to learn about the structure and function of the human brain, which is physically shaped by contributions from our genes as well as from our experience. This understanding strengthens the view that mental disorders are both caused and can be treated by biological and experiential processes working together. As the breathtaking progress in modern neuropsychopharmacology begins to integrate knowledge from the biological and behavioural sciences, a fundamental realisation is taking place: treatment of mental disorders works, whether in the form of a somatic intervention such as medication or a psychosocial intervention such as psychotherapy, by actually changing the brain.

In recent decades, a wealth of information has become available about brain function and dysfunction in neuropsychiatric disorders. As a consequence, many people have benefited from treatments that have arisen from our understanding of how the brain works and how it may be disordered in neuropsychiatric illnesses.

Neuropsychopharmacology: a comprehensive, interdisciplinary approach

Neuropsychopharmacology is the trans-disciplinary field of science that is of core relevance for examining and understanding how the brain works and functions. As such it includes many disciplines: the neurosciences (e.g. molecular biology, genetics, chronobiology, neuroimmunology, brain imaging), the psychological sciences (e.g. cognition, emotion, behaviour and environmental interactions), psychopharmacology (neurochemistry, pharmacodynamics and drug action), and the respective applied clinical fields (psychiatry, psychotherapy, clinical psychology and neurology).

The goal of neuropsychopharmacology is twofold: (1) to understand how the brain works and functions, and how and why it may become dysfunctional, giving rise to disorders of the brain; and (2) to derive effective pharmacological interventions to treat and prevent mental disorders. This means developing specific therapeutic agents that regulate the neurobiological mechanisms of mental disorders, understanding the causes of those disorders, and investigating the effects of drugs on the central nervous system (CNS) and their use in treating disorders such as anxiety, mania, depression, schizophrenia, dementia, and addictive and neurological disorders in the most rational and empirical manner.

Clinical advances: from molecules to effective treatments

The pharmacological progress achieved in recent decades is based on fundamental biochemical and physiological discoveries, translated into clinical practice through randomised, double-blind and placebo-controlled drug studies.

Progress in neuropsychopharmacology has contributed in a crucial way to the fundamental transformation of our mental health systems, in particular allowing patients with severe mental disorders, such as psychotic disorders and schizophrenia as well as recurrent and chronic depression, to avoid long-term hospitalization and disability and to live and function independently in the community. So-called atypical antipsychotics were the outcome of a search for efficacious drugs for psychotic disorders with a better tolerability profile than conventional neuroleptics, especially with regard to limiting motor side effects such as tardive dyskinesia, significantly improving the prognosis and quality of life of affected patients. Similarly, the introduction of new antidepressants, including the selective serotonin reuptake inhibitors (SSRIs), which have fewer adverse effects and pose less danger in overdose than older agents, has dramatically improved the situation and prognosis of patients with anxiety and depressive disorders, allowing the majority of patients to live without significant impairment or disability.

However, research in neuropsychopharmacology has also informed exciting non-pharmacological developments. One example is chronotherapeutics, which comprise direct manipulations of sleep (e.g. wake therapies) as well as controlled exposure to environmental cues (e.g. light therapy) in order to achieve therapeutic effects in patients with mental disorders. Normalisation of circadian rhythms by chronotherapeutics represents a promising new direction in the search for novel non-pharmacological and pharmacological treatments that might avoid the limitations of current drug treatments in this field. Benefits from chronotherapeutic applications have been achieved for a broad range of patients with depression, bipolar disorder, seasonal affective disorder (SAD), premenstrual dysphoric disorder, bulimia nervosa, attention-deficit/hyperactivity disorder (ADHD), dementia, Parkinson’s disease, and shift-work and jet lag disturbances [7].

Another example is the identification of areas in the central nervous system that correlate with pathological mood states, suggesting targets for novel therapeutic interventions. Through modern imaging techniques, including functional magnetic resonance imaging (fMRI), positron emission tomography (PET) and single-photon emission computed tomography (SPECT), neuronal activity in psychiatric conditions can be monitored and measured. Modern imaging studies of the brain circuitry underlying normal and pathological behaviours may contribute to a better understanding of the neural basis of mental disorders and identify novel targets for pharmacological treatment. Brain imaging has revealed a breakdown in normal patterns of emotional processing that impairs the ability to suppress negative emotional states. Mood disturbances thus may reflect the exaggeration of emotional responses or abnormalities in emotional processing. Recent findings have identified an extended neural network active during self-referential processing in the brain, contributing to the exploration of the concrete neural bases of the depressive self [3]. Imaging researchers are also studying depression-related circuits to see how they may arise from genetic variations known to put people at risk for depression. Imaging genetics is a novel research strategy that attempts to identify gene effects on the brain and has provided significant contributions to the understanding of the complex impact of hereditary factors on psychiatric illness.

Clinical advances based on neuropsychopharmacological research are enabling people stricken with mental disorders to make their way from isolation back to social reintegration.

Neuropsychopharmacology: mental disorders and beyond

In recent years, neuropsychopharmacology has expanded its interdisciplinary focus, and dialogues with other medical fields such as internal medicine have been initiated. The interaction between mental disorders and somatic disease is explored thoroughly, and the role of mental disorders in increasing vulnerability to physical morbidity and poorer outcomes is well documented [8] [9]. Mental disorders are frequently associated with metabolic disorders and cardiovascular disease, with evidence for reciprocal pathways and interactions. For example, chronic depression increases the risk for diabetes; conversely, in adult diabetics depression is much more frequent compared to metabolically healthy subjects [6]. Furthermore, research has shown links between depression and anxiety, and cardiovascular and cerebrovascular diseases. The influence of specific psychiatric disorders in contributing to adverse cardiac disease trajectories and death has been established, and depression has been identified as a risk factor for the development and progression of coronary artery disease [4].

In addition, the risk for metabolic syndrome in patients with schizophrenia and mood disorders is increased compared to the general population. Even the single components of the metabolic syndrome (overweight, hypertension, hyperlipidemia) are significantly more frequent in schizophrenia and mood disorders. Whether these metabolic and cardiovascular conditions are primarily due to the illness or secondarily induced by psychopharmacological treatment is subject to current research. Since metabolic and cardiovascular risk in long-term psychopharmacological treatment has been evaluated extensively, cardio-metabolic risk factors in patients with severe mental illness, especially when treated with antipsychotic agents, are now much better recognised, and efforts to ensure improved physical health screening and prevention are becoming established [2].

Psychopharmacologic approaches have increasingly expanded the boundaries of practice to treat numerous disorders not traditionally part of it, including obesity, hypoactive sexual desire disorder, fibromyalgia, perimenopausal vasomotor symptoms, the dementias, pain management, and even gambling [5]. Increasing evidence for the intrinsic analgesic effect of antidepressants and for the antidepressant efficacy of neurological treatments in patients with Parkinson’s disease has opened the way to new treatments. In the future, neuropsychopharmacologic approaches will also increasingly have to deal with non-pharmacologic devices, including not only classical electroconvulsive therapy but also vagus nerve stimulation, transcranial magnetic stimulation, and deep brain stimulation.

Neuropsychopharmacology is a dynamic field that is continuously expanding the boundaries of research and practice. New developments in neuropsychopharmacology are improving interdisciplinary health care and patient management.

European College of Neuropsychopharmacology (ECNP)

The European College of Neuropsychopharmacology (ECNP) is Europe’s largest and leading forum for the exchange and dissemination of interdisciplinary research on the brain and brain dysfunction. ECNP is an independent scientific association founded in 1987 by European scientists and clinicians working in neuropsychopharmacology and related disciplines to encourage innovative research across the neurosciences and to translate new knowledge on fundamental disease mechanisms into clinical applications.

ECNP serves as a uniquely broad interdisciplinary platform, which strongly emphasises the complementary nature of research at the bench and at the bedside or clinic. This so-called ‘translational research’ is target-oriented, promoting the development of diagnostic procedures and laboratory outcomes in the service of discoveries that will result in improved patient care [10].

To achieve its objectives, ECNP has established a number of activities and programmes stimulating interdisciplinary forces designed to promote the communication and cross-fertilisation of research results and ideas in the field of neuropsychopharmacology. These include the ECNP Congresses, the largest high-level scientific meeting on neuropsychopharmacology and mental disorders in Europe, Regional Meetings, Seminars and Consultation Meetings, as well as activities for young scientists. The scientific journal of ECNP, European Neuropsychopharmacology (ENP), publishes original findings from basic and clinical research. Recently the ECNP School of Neuropsychopharmacology was established with the aim of teaching junior clinicians high-standard practice in neuropsychopharmacology and involving them in the development of local good practice in teaching and training. The second ECNP School of Neuropsychopharmacology was held in Oxford, UK, from 11 to 16 July 2010.

Through all its activities, ECNP aims to increase the understanding of brain disorders, helping to pave the way to improved treatments, and to promote the development of common standards in Europe.

Highlights of the 23rd ECNP Congress 2010

From 28 August to 1 September 2010, renowned experts and 7,000 anticipated participants will meet in Amsterdam to present, discuss and evaluate the latest achievements and future perspectives in the fields of schizophrenia, depression, bipolar disorder, drugs and addiction, Alzheimer’s disease, chronopsychiatry, eating disorders, autism spectrum disorders, as well as basic and clinical neuroscience and psychopharmacology. Great emphasis will be put on clear take-home messages that can easily be translated into clinical practice by medical professionals.

The scientific programme includes more than 35 sessions to be presented by more than 150 speakers from 20 countries, and will comprise, among others, the following topics:

    * The neural basis of the depressive self
    * Gene-environment interactions in psychosis
    * Circadian rhythms: their role and dysfunction in affective disorder
    * Predictors of relapse in alcohol dependence
    * Avenues to novel antipsychotics: moving to exploratory strategies
    * Fear memory and extinction: options for new treatments
    * Stress and affective disorders
    * Neuroprotective therapies: common potential targets in multiple sclerosis, stroke and neurodegenerative disease

The educational update sessions at the ECNP Congress will deal with the neurobiology and neuropharmacology of compulsivity in addictive behaviour, pain and neuropsychopharmacology, the placebo in psychiatry, neurostimulation techniques in mood disorders, metabolic and cardiovascular risks in the long-term psychopharmacological treatment of patients with psychiatric disorders, as well as psychiatric symptoms and their treatment in neurological disorders. Three poster sessions, with more than 750 poster presentations in total from scientists all over the world, will offer exciting insight into the research activities of (young) scientists.

Furthermore, ECNP is proud to present the results of the ECNP Consultation Meeting 2010 on ‘The future of the placebo in clinical trials in brain diseases’. Through annual Consultation Meetings on specific topics, ECNP aims to facilitate the dialogue and exchange of advice between the participating parties, i.e. scientists, regulatory authorities and the pharmaceutical industry.

Invitation: meet the scientists!

Experts will be available to answer journalists’ questions at press conferences during the 23rd ECNP Congress. Please refer to the detailed schedule of press conferences in the enclosed ‘press information and procedures’.

The 23rd ECNP Congress will once again present a high-calibre, balanced scientific programme discussing the latest achievements and future perspectives in neuropsychopharmacology and related disciplines across virtually all disorders of the brain, including the various aspects of pharmacotherapy, in order to improve the lives of patients with psychiatric and neurological disorders.

Big Tobacco Using YouTube To Shill Products?

Prohibited from using television, radio, and newspapers for advertising their products, tobacco companies may have taken their marketing to YouTube and other popular online media outlets, according to a study published in the online journal Tobacco Control on Wednesday.

Authors Lucy Elkin, George Thomson, and Nick Wilson, all of the Department of Public Health at the University of Otago in Wellington, New Zealand, conducted their research on YouTube, running searches for five of the top international cigarette brands: Marlboro, L&M, Benson and Hedges, Winston, and Mild Seven.

They then looked at the thematic content of up to 40 of the most popular videos per brand, 163 in all. Of those, “a majority… (71.2%, 95% CI 63.9 to 77.7) had pro-tobacco content, versus a small minority (3.7%) having anti-tobacco content (95% CI 1.4 to 7.8),” the researchers note.

“Most of these videos contained tobacco brand content (70.6%), the brand name in the title (71.2%) or smoking imagery content (50.9%),” they added. “One pro-smoking music video had been viewed over 2 million times. The four most prominent themes of the videos were celebrity/movies, sports, music and ‘archive’, the first three of which represent themes of interest to a youth audience.”
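For readers who want to sanity-check the reported interval, it is consistent with a Wilson score confidence interval for a binomial proportion. A minimal sketch; note that the count of 116 pro-tobacco videos out of 163 is our assumption inferred from the 71.2% figure, not a number stated in the study:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Assumed: 116 of 163 videos had pro-tobacco content (116/163 = 71.2%)
lo, hi = wilson_ci(116, 163)
print(f"{lo:.1%} to {hi:.1%}")  # close to the reported 63.9% to 77.7%
```

The interval this produces, roughly 63.8% to 77.6%, matches the published 63.9 to 77.7 to within rounding, suggesting the authors used a score-type interval rather than the simpler normal approximation.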

According to an August 25 press release from Tobacco Control parent British Medical Journal, tobacco companies have long “vehemently denied advertising on the Internet,” and several top manufacturers voluntarily agreed to restrict direct advertising online by the end of 2002. In the press release, the trio of researchers state that 20 of the 163 videos appeared to be “very professionally made,” and that each of the videos averaged 100,000 views, with one topping the two-million-view mark.

Furthermore, according to AFP, “Many of the videos included old TV advertising and posters, which are outlawed in many countries… There were also scenes from films with popular actors and a cigarette whose brand was visible, extracts of tobacco-sponsored sporting events, and TV footage from the 1950s and 1960s, including The Flintstones, The Beverly Hillbillies and even the Beatles.”

“Pro-tobacco videos have a significant presence on YouTube, consistent with indirect marketing activity by tobacco companies or their proxies,” Elkin, Thomson, and Wilson conclude in their study. “Since content may be removed from YouTube if it is found to breach copyright or if it contains offensive material, there is scope for the public and health organizations to request the removal of pro-tobacco content containing copyright or offensive material.”

Whale Sharks May Produce Many Litters From 1 Mating

How do female whale sharks meet their perfect mates and go on to produce offspring? While little is known about the reproductive behavior of these ocean-roaming giants, a newly published analysis led by University of Illinois at Chicago biologist Jennifer Schmidt reveals new details about the mating habits of this elusive, difficult-to-study fish.

Schmidt, a UIC associate professor of biological sciences, determined paternity of 29 frozen embryos saved from a female whale shark caught off the coast of Taiwan in 1995. The embryos, studied in collaboration with Professor Shoou-Jeng Joung at the National Taiwan Ocean University, are extremely rare.

The pregnant shark carried a surprisingly large number of embryos (304), still in the uterus and representing a spectrum of developmental stages, from still egg-encased embryos to fully developed, near-term animals.

Schmidt and her colleagues spent several years developing DNA genetic markers to study whale sharks, initially for population genetics, but in this study the tool was used to determine paternity.

Shark reproduction is still an emerging science, but what is known suggests that most broods are sired by more than one male. That is not what Schmidt found with this particular female whale shark.

“These differently aged embryos — itself unusual across animal species — had the same father,” Schmidt said. “We have to be very cautious in drawing conclusions from a single litter, but the data suggest female whale sharks store sperm after a single mating event, and subsequently fertilize their own eggs as they are produced.”

If the finding can be supported from analysis of other whale shark litters, Schmidt said, “it would suggest that there is no whale shark breeding ground where large numbers of animals meet to mate, but rather that mating occurs as an isolated event.”

Follow-up studies may depend on serendipity. International protocols protect whale sharks from capture; few are housed in aquariums, and those that are tend to be less than 25 years old and not yet sexually mature. Scientists typically study whale sharks at seasonal feeding grounds, but those animals are usually juveniles not mature enough to breed. Adult females are rarely observed in the wild.

“Protections for whale sharks have increased in many parts of the world, yet shark numbers seem to be declining, and the average size is getting smaller,” said Mark Meekan, principal research scientist with the Australian Institute of Marine Sciences.

“This is a classic sign of overfishing, where larger, more valuable animals are selectively removed,” he said. “Targeted fishing of breeding-age animals in a late-maturing species can be devastating for its survival.”

The findings are reported in the journal Endangered Species Research, published online Aug. 4. Other authors include Meekan; Joung and Chien-Chi Chen of the National Taiwan Ocean University; Saad I. Sheikh, formerly of UIC; and Bradley Norman of ECOCEAN Inc.

The work was funded by a grant from Project Aware.

Image 1: Whale shark. UIC researcher Jennifer Schmidt, associate professor of biological sciences, studies the large fish. Credit: Jennifer Schmidt

Image 2: The 24 near-term embryos from a mother whale shark. Researcher Jennifer Schmidt, associate professor of biological sciences, is studying whale sharks. Credit: Jennifer Schmidt

On the Net:

Big Bear Solar Observatory Provides Amazing New Sun Images

NJIT Distinguished Professor Philip R. Goode and the Big Bear Solar Observatory (BBSO) team have achieved “first light” with a deformable mirror, part of a technique called adaptive optics, at the observatory. Using this equipment, an image of a sunspot was published August 23 on the website of Ciel et l’Espace as the photo of the day: http://www.cieletespace.fr/node/5752

“This photo of a sunspot is now the most detailed ever obtained in visible light,” according to Ciel et l’Espace. In September, the publication, a popular astronomy magazine, will publish several more photos of the Sun taken with BBSO’s new adaptive optics system.

Goode said that the images were achieved with the 1.6-meter clear-aperture, off-axis New Solar Telescope (NST) at BBSO. The telescope can resolve features about 50 miles across on the Sun’s surface.

The telescope is the crown jewel of BBSO, the first facility-class solar observatory built in more than a generation in the U.S. The instrument is undergoing commissioning at BBSO.

Since 1997, under Goode’s direction, NJIT has owned and operated BBSO, which sits in the middle of a clear mountain lake. The site is characterized by sustained atmospheric stability, which is essential for BBSO’s primary mission: measuring and understanding complex solar phenomena using dedicated telescopes and instruments.

The images were taken by the NST with atmospheric distortion corrected by its 97 actuator deformable mirror. By the summer of 2011, in collaboration with the National Solar Observatory, BBSO will have upgraded the current adaptive optics system to one utilizing a 349 actuator deformable mirror.

With support from the National Science Foundation (NSF), Air Force Office of Scientific Research, NASA and NJIT, the NST began operation in the summer of 2009. Additional support from NSF was received a few months ago to fund further upgrades to this new optical system.

The NST will be the pathfinder for an even larger ground-based telescope, the Advanced Technology Solar Telescope (ATST), to be built over the next decade. NJIT is an ATST co-principal investigator on this NSF project. The new grant will allow Goode and partners from the National Solar Observatory (NSO) to develop a new and more sophisticated kind of adaptive optics, known as multi-conjugate adaptive optics (MCAO).

The new optical system will allow the researchers to widen the distortion-free field of view, enabling better study of larger, puzzling areas of the Sun. MCAO on the NST will be a pathfinder for the optical system of NSO’s 4-meter-aperture ATST coming later in the decade.

Scientists believe magnetic structures like sunspots hold an important key to understanding space weather. Space weather, which originates with the Sun, can have dire consequences for Earth’s climate and environment. A bad storm can disrupt power grids and communication, destroy satellites and even expose airline pilots, crew and passengers to radiation.

The new telescope now feeds a high-order adaptive optics system, which in turn feeds the next generation of technologies for measuring magnetic fields and dynamic events using visible and infrared light. The system is rounded out by a parallel computer for real-time image enhancement.

Goode and BBSO scientists have studied solar magnetic fields for many years. They are expert at combining BBSO ground-based data with satellite data to determine dynamic properties of the solar magnetic fields.

Image Caption: The most detailed sunspot ever obtained in visible light was seen by new telescope at NJIT’s Big Bear Solar Observatory. Credit: Big Bear Solar Observatory

On the Net:

Cinnamon Could Help Fight Diabetes, Heart Disease

A study led by U.S. Department of Agriculture (USDA) chemist Richard Anderson suggests that a water soluble extract of cinnamon, which contains antioxidative compounds, could help reduce risk factors associated with diabetes and heart disease.

The work is part of cooperative agreements between the Beltsville Human Nutrition Research Center (BHNRC) operated by USDA’s Agricultural Research Service (ARS) at Beltsville, Md.; Integrity Nutraceuticals International of Spring Hill, Tenn.; and the Joseph Fourier University in Grenoble, France. Anderson works in the Diet, Genomics and Immunology Laboratory of BHNRC. ARS is USDA’s principal intramural scientific research agency.

For the study, conducted in Ohio, coauthor Tim N. Ziegenfuss, now with the Center for Applied Health Sciences based in Fairlawn, Ohio, enrolled volunteers and collected samples.

Twenty-two obese participants with impaired blood glucose values, a condition classified as “prediabetes,” volunteered for the 12-week experimental research study. Prediabetes occurs when cells are resistant to the higher-than-normal levels of insulin produced by the pancreas (in an attempt to help remove elevated glucose levels from the blood).

The volunteers were divided randomly into two groups and given either a placebo or 250 milligrams (mg) of a dried water-soluble cinnamon extract twice daily along with their usual diets. Blood was collected after an overnight fast at the beginning of the study, after six weeks, and after 12 weeks to measure the changes in blood glucose and antioxidants.

The study demonstrated that the water-soluble cinnamon extract improved a number of antioxidant variables by as much as 13 to 23 percent, and improvement in antioxidant status was correlated with decreases in fasting glucose, according to Anderson.

Only more research will tell whether these findings support the idea that people who are overweight or obese could reduce oxidative stress and blood glucose by consuming cinnamon extracts that have been proven safe and effective. In the meantime, weight loss remains the primary factor in improving these numbers, according to ARS scientists.

More details on the 2009 study can be found in the Journal of the American College of Nutrition.

On the Net:

Cognitive Behavior Therapy Improves Symptom Control In Adult ADHD

Skills-based treatment added to medication helps patients handle persistent symptoms

Adding cognitive behavioral therapy, an approach that teaches skills for handling life challenges and revising negative thought patterns, to pharmaceutical treatment for attention-deficit hyperactivity disorder (ADHD) significantly improved symptom control in a study of adult patients. The report from Massachusetts General Hospital (MGH) researchers appears in the August 25 Journal of the American Medical Association.

“Medications are very effective in ‘turning down the volume’ on ADHD symptoms, but they do not teach people skills,” explains Steven Safren, PhD, ABPP, director of Behavioral Medicine in the MGH Department of Psychiatry, who led the study. “This study shows that a skills-based approach can help patients learn how to cope with their attention problems and better manage this significant and impairing disorder.”

More than 4 percent of adults in the U.S. have ADHD, and while stimulants and other psychiatric medications are the primary first-line treatment, the study authors note that a significant number of patients who take and respond to these medications are still troubled by continuing symptoms. A few studies have investigated psychosocial treatment for ADHD, and although some have suggested benefits from cognitive behavioral therapy, they were small and short-term. The current study is believed to be the first full-scale randomized, controlled trial of the effectiveness of an individually delivered, non-medication treatment of ADHD in adults.

The study enrolled adults diagnosed with ADHD who reported reduced but still significant symptoms while taking an ADHD medication. Randomly assigned to one of two therapeutic approaches, participants attended 12 weekly one-on-one counseling sessions with a psychologist or psychology fellow. The control group received training in muscle relaxation and other relaxation techniques, education on how to apply relaxation to ADHD symptoms, and supportive psychotherapy. The cognitive behavioral therapy sessions included skills training in areas such as organization and planning, setting priorities and problem solving, coping with distractions, and developing adaptive thought responses to stressful situations.

“Sessions were designed specifically to meet the needs of ADHD patients and included things like starting and maintaining calendar and task list systems, breaking large tasks into manageable steps, and shaping tasks to be as long as your attention span will permit,” Safren says. “The treatment is half like taking a course and half like being in traditional psychotherapy.”

Symptom assessments conducted at the end of the 12-week treatment period revealed that participants receiving cognitive behavioral therapy had significantly better symptom control than did those receiving relaxation training, benefits that were maintained three and nine months later. A standard rating scale for ADHD symptoms showed a 30 percent reduction in symptoms in more than two thirds of the cognitive behavioral therapy group but in only one third of the relaxation group.

“We know that ADHD medications are effective for patients who can take them, and without medications it would be harder to learn the skills taught in this study,” Safren adds. “But we have shown that learning self-management skills can help reduce symptoms even further. Now we need to determine the best ways to train clinicians in this approach and the best time to introduce this treatment, along with exploring other ways to help patients who did not benefit.” Safren is an associate professor of Psychology in the Harvard Medical School Department of Psychiatry.

On the Net:

Bioengineering Design Makes Health Diagnosis Simpler, Quicker

ASU bioengineering research produces design for new device to help detect diseases quickly and at lower costs

Arizona State University researchers have demonstrated a way to dramatically simplify testing patients for infectious diseases and unhealthy protein levels.

New testing instrumentation developed by Antonia Garcia and John Schneider promises to make the procedure less costly and produce results in less time.

Current testing is slow and expensive because of the complications of working with blood, saliva, urine and other biological fluids, said Garcia, a professor in the School of Biological and Health Systems Engineering, one of ASU’s Ira A. Fulton Schools of Engineering.

Such samples “are complex mixtures that require sophisticated instruments capable of mixing a sample with antibodies or other biological reactants to produce an accurate positive or negative reaction,” Garcia said.

He and Schneider, a bioengineering graduate student researcher, have come up with a testing method that enables the patient sample itself to act in concert with a rudimentary, low-cost testing device.

The method uses common light-emitting diodes (LEDs) and simple microelectronic amplifiers rather than more technologically intensive and costly lasers and robotics.

Fluids and light working together

Garcia and Schneider have demonstrated that superhydrophobic surfaces can shape blood, saliva, urine and other fluids into round drops. The drops can focus light and quickly mix and move microparticles and nanoparticles that can be examined to reveal a specific infectious agent or protein.

Superhydrophobicity is a property of materials that repel water, such as duck feathers or the leaves of the lotus plant. Such materials are used commercially in textiles, building materials and surface coatings.

The new device operates by placing a drop of nanoparticles or microparticles on top of a drop of a patient fluid sample on a superhydrophobic surface. The surface has a small depression that holds the liquid sample in place so that it forms a spherical drop.

The drop acts as a lens due to surface tension. An LED is shined on the drop and the drop shape focuses the light into an intense beam measured by a second LED.
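The article does not give the optics of the drop, but the focusing it describes is consistent with treating a spherical drop as a simple ball lens. A hypothetical sketch of that standard approximation, using the refractive index of water (the formula and numbers are illustrative assumptions, not from the researchers):

```python
def ball_lens_efl(radius_mm: float, n: float = 1.33) -> float:
    """Effective focal length, measured from the sphere's center, of a
    transparent sphere acting as a ball lens: EFL = n*R / (2*(n - 1)).
    n = 1.33 approximates water at visible wavelengths."""
    return n * radius_mm / (2.0 * (n - 1.0))

# A 1 mm water drop focuses collimated light to a point roughly 2 mm
# from its center, close enough behind the drop for a nearby LED
# photodetector to sit at the focus.
print(f"{ball_lens_efl(1.0):.2f} mm")
```

The short focal length explains why no external lenses or lasers are needed: the sample itself concentrates the LED light into the intense beam the detector measures.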

Because the drop is slowly evaporating, Garcia explains, nanoparticles or microparticles quickly begin to stick together when the patient fluid sample contains the infectious agent or protein being targeted. The infectious agent or protein migrates to the center of the drop, leaving the particles that have not yet stuck together to move to the surface.

This leads to the self-mixing action that speeds up the diagnostic process so that detection can occur in less than two minutes, he said.

Measuring overall health

Because the fluid sample becomes integrated with the simple LEDs and microelectronics, the researchers call the new device design the Integrascope.

Garcia and Schneider have built several laboratory prototype devices based on the design and have demonstrated how the device can be used to measure C-reactive protein in human serum, which at high levels is an indicator of a variety of inflammatory conditions.

High levels of protein can indicate cell and tissue damage, inflammation, disruption in kidney function, or an immune system that is pumping out antibodies due to an infection or autoimmune disease. Low protein levels can indicate malnutrition or the presence of diseases that prevent the body from producing sufficient blood protein.

The device also can be used to provide an indication of overall health by measuring total protein in human serum, saliva and urine.

Potential global impact

Development of the device was sparked during Schneider’s studies for his doctoral degree, as he experimented with shining an LED on a drop of liquid resting on a superhydrophobic surface. He was trying to see if he could detect changes in light transmission that would tell whether a protein was present in the liquid.

“To our surprise,” Garcia said, “we quickly realized that his laboratory set-up generated a very strong beam of light that could be easily measured using a fiber-optic light detector we had in the lab.”

The research results have been posted on the web site Nature Precedings. The report describes how the new device works and gives details of the information the diagnostic test provides within the first few minutes of its use.

Low-cost solutions

The most common low-cost devices on the market now are lateral-flow immunoassays similar in look and function to the early pregnancy test.

The biggest stumbling block in making low-cost diagnostic devices for many conditions and diseases is that sensitivity is compromised for specificity in these lateral-flow immunoassays.

A different strategy, miniaturizing complex instruments, suffers from the difficulty of reducing the cost to what most people would be able to afford, about $1 to $2 per test, as well as the need for spare parts and special handling.

“To have a global impact, we need to have accurate and sensitive tools that can help health care providers treat patients at a low cost during their first visit,” Schneider said.

“Our goal is to translate this technology and design into a rugged and easy-to-use device that we would give away for free to clinics. The only costs involved with using the Integrascope would be in the drop of particles and a small piece of a superhydrophobic surface, about $1 to $2,” Garcia said.

International collaboration

With the repeated and more frequent spread of infectious diseases around the globe, it is becoming more critical to have good diagnostic systems in poor countries so proper treatment can be provided rapidly, and so that there is a global early-warning system to alert the public if new and significant outbreaks of disease emerge, Garcia said.

To help accomplish that, Garcia and Schneider are teaming with nanotechnology experts Vladimiro Mujica and Manuel Marquez.

They hope to establish collaborations with Latin American universities, government leaders and entrepreneurs to develop the new diagnostic device.

“We believe a joint U.S.-Latin America technology development effort will spark economic activity that will benefit both regions and prevent disease outbreaks and social unrest in our part of the world,” said Mujica, a professor in the Department of Chemistry and Biochemistry in ASU’s College of Liberal Arts & Sciences.

Marquez, an entrepreneur and adjunct faculty member in the School of Biological and Health Systems Engineering, is president and research leader of the company YNANO. The company specializes in droplet-nanoengineering for biomedical applications, including Integrascope for disease diagnosis.

“I’m excited about the potential for this device, and that students can be directly engaged in the research and development process,” Marquez said. “I’ve devoted more than a decade of my career to enabling engineers and scientists to rapidly apply their basic discoveries to solving real-life problems.”

Image Caption: Pictured is a drop of blood on a prototype of a diagnostic device developed by ASU researchers. It works by shining a near-infrared light-emitting diode (LED) on a drop of whole blood sitting on a water-repellent surface. The shape of the drop focuses the light into an intense beam measured by a second LED. Nanoparticles or microparticles in the drop begin to stick together when the fluid sample from a patient contains an infectious agent or a protein. This leads to the self-mixing action that enables detection of indications of infectious diseases and unhealthy protein levels. Credit: ASU

On the Net:

MRI Detects Child Abuse

Whole-body magnetic resonance imaging (MRI), which is highly accurate at detecting soft-tissue abnormalities, may play a role in detecting child abuse in infants.

Currently, the diagnosis of abuse relies heavily on the presence of skeletal injuries, and high-quality skeletal surveys (a series of X-rays of all the bones in the body) are recommended to visualize the often subtle high-specificity fractures seen in infant abuse. Bruises are the most common sign of physical abuse, but subcutaneous tissue and muscle injuries are not currently evaluated with a global imaging technique in living children.

A study performed at Children’s Hospital Boston and Harvard Medical School in Boston, MA, included 21 infants who underwent whole-body MRI for the evaluation of suspected child abuse. Together, skeletal survey and whole-body MRI identified 167 fractures or areas of skeletal signal abnormality.

“Although our study results revealed that whole-body MRI is insensitive in the detection of classic metaphyseal lesions and rib fractures, we found it did identify soft-tissue injuries such as muscle edema and joint effusions that, in some cases, led to identifying additional fractures,” lead author Jeannette M. Perez-Rossello, MD, was quoted as saying.

“Although our study indicates that whole-body MRI is currently unsuitable as a primary global skeletal imaging tool for suspected infant abuse,” said Perez-Rossello, “it may be useful as a supplement to the skeletal survey in selected cases, particularly with regard to soft-tissue injuries.”

SOURCE:  American Journal of Roentgenology, September, 2010.

Isotope Shortage Could Jeopardize Care

Scientists fear that a global shortage of radioactive isotopes, required for the 20 million medical scans and treatments done each year, could jeopardize patient care and drive up health care costs.

Medical isotopes are used to diagnose and treat a variety of diseases. Isotopes injected into the body can enable doctors to determine whether the heart has adequate blood flow or whether cancer has spread to a patient’s bones. Isotopes help diagnose gallbladder, kidney, and brain disorders. When delivered into a malignant tumor, isotopes can kill the cancer cells while minimizing damage to nearby healthy tissue.

The shortage of radioactive isotopes also threatens basic and environmental research, oil exploration, and nuclear nonproliferation efforts.

“Although the public may not be fully aware, we are in the midst of a global shortage of medical and other isotopes,” Robert Atcher, Ph.D., MBA, director of the National Isotope Development Center (NIDC), the U.S. Department of Energy unit responsible for production of isotopes nationwide, was quoted as saying. “If we don’t have access to the best isotopes for medical imaging, doctors may be forced to resort to tests that are less accurate, involve higher radiation doses, are more invasive, and more expensive.”

The shortage already is forcing some doctors to reduce the number of imaging procedures they order for patients, Atcher added.

Each day more than 50,000 patients in the U.S. receive diagnostic and therapeutic procedures using medical isotopes, particularly individuals with heart problems and cancer. Eight out of every 10 procedures require the isotope technetium-99m, which has a half-life of only six hours. Thus, technetium-99m cannot be stockpiled. It must constantly be made fresh and distributed quickly to medical facilities.
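The no-stockpiling constraint follows directly from exponential decay with a six-hour half-life. A minimal sketch of the arithmetic (the decay law is standard physics; the timing examples are illustrative):

```python
HALF_LIFE_H = 6.0  # technetium-99m half-life in hours, as cited in the article

def remaining_fraction(hours: float) -> float:
    """Fraction of the original Tc-99m activity left after `hours` of decay:
    A(t)/A0 = (1/2)^(t / half-life)."""
    return 0.5 ** (hours / HALF_LIFE_H)

# A batch loses most of its usefulness within a single day in transit:
for t in (6, 12, 24, 48):
    print(f"after {t:2d} h: {remaining_fraction(t):7.3%} of activity remains")
```

After 24 hours only about 6 percent of the activity remains, which is why the isotope must be produced continuously and shipped to hospitals on a daily schedule rather than warehoused.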

Wolfgang Runde, Ph.D., who works with Atcher at the Los Alamos National Laboratory in New Mexico, was quoted as saying that an unexpected shutdown of a major isotope production facility in Chalk River, Ontario, Canada, in 2009 precipitated the shortage. The Chalk River facility produces 50 percent of the U.S. supply of the isotope used to make technetium-99m. Simultaneous production problems at other isotope facilities compounded the problem. Remaining suppliers have not been able to make up the resulting shortage, leaving the U.S. in an isotope supply crunch. The Chalk River facility, which was scheduled to restart this summer, remained closed as of early August.

“Shortage of this key medical isotope makes it more difficult to carry out important medical procedures, such as finding out whether cancer has spread to the bones,” said Atcher. “Doctors have been trying everything they can think of to meet the needs of patients, including the use of other less-than-ideal isotopes, but it has been a real struggle.”

Atcher noted that the U.S. is highly dependent on foreign suppliers of medical isotopes. Only about 10 to 15 percent of medical isotopes are produced domestically. The nuclear medicine community has been pressuring the U.S. government to develop improved domestic capability for producing these materials to reduce this dependence.

“The challenge we have is to produce enough materials to meet commercial needs as well as needs of the research community, from nuclear physics to environmental research to medical research, amid increasing demands and fewer isotope sources,” Atcher said. “The long-term solution to this crisis remains to be seen.”

SOURCE:  Presented at the National Meeting of the American Chemical Society, Boston, August 22, 2010.

Antibiotic May Reduce Stroke Risk And Injury In Diabetics

A daily dose of an old antibiotic may help diabetics avoid a stroke or at least minimize its damage, Medical College of Georgia researchers report.

Minocycline, a drug already under study at MCG for stroke treatment, may help diabetics reduce remodeling of blood vessels in the brain that increases their stroke risk and help stop bleeding that often follows a stroke, said Dr. Adviye Ergul, physiologist in the MCG Schools of Medicine and Graduate Studies.

“We know that diabetes is bad and that diabetics have more strokes and that when they have a stroke they do more poorly,” said Ergul, corresponding author on the study published in the Journal of Cerebral Blood Flow and Metabolism. A major vascular event such as a stroke or heart attack is listed as a cause of death for nearly 70 percent of the estimated 24 million Americans with diabetes, according to the American Diabetes Association.

To figure out why, the researchers focused on the blood vessels of diabetic rats, finding that even moderately elevated blood glucose levels can result in thicker, twisted blood vessels that tend to leak, resulting in the bleeding that can follow a stroke. Clot-based strokes are the most common type, while hemorrhagic strokes tend to be the most lethal. But diabetics are at risk for a combination of the two, in which a clot causes the stroke and leaking from the blood vessels follows, a scenario called hemorrhagic transformation that can dramatically worsen the stroke’s effect, Ergul said.

Much of the bad vascular remodeling that occurs in diabetes results from elevated glucose activating matrix metalloproteinases or MMPs. “They break down things and allow for cells to move so blood vessels change shape,” Ergul said. They also destroy the basement membrane of blood vessels, allowing the destructive bleeding that often follows a diabetic stroke. On the good side, MMPs help clean up damage to enable repair and recovery.

One way minocycline works is by blocking MMPs. Less directly, diabetes drugs like metformin, used to lower blood sugar, also reduce MMP levels.

Another MCG research team, led by Dr. David Hess, stroke specialist and chairman of the Department of Neurology, is showing that minocycline given alone or with tPA, the clot dissolver that is the only FDA-approved stroke treatment, can also work after a stroke to help minimize damage. The two drugs are synergistic: tPA increases bleeding risk, while minocycline decreases it.

That could particularly benefit diabetics, who already are at increased risk for bleeding, particularly when oxygen is restored to that area of the brain. This damage, called a reperfusion injury, is a primary reason that a diabetic stroke may look small on a magnetic resonance image but can have a devastating effect, Ergul also has found.

Some of her next studies will include giving both tPA and minocycline to diabetic rats to study bleeding and the impact of the two drugs on blood vessels, particularly the tiny ones that are tightly connected to brain cells.

On the Net: