Playing An Instrument Helps Memory and Hearing

Researchers found that playing a musical instrument can help keep memories active and hearing working.

A new study found that musical training helps the brain adapt to aging, offsetting declines in memory and in the ability to separate speech from background noise.

The research adds further weight to the benefits of musical training, which is also associated with greater learning ability in the classroom.

“Lifelong musical training appears to confer advantages in at least two important functions known to decline with age: memory and the ability to hear speech in noise,” Northwestern University’s Dr. Nina Kraus, who co-authored the study, said in a statement.

“Difficulty hearing speech in noise is among the most common complaints of older adults, but age-related hearing loss only partially accounts for this impediment that can lead to social isolation and depression.”

“It’s well known that adults with virtually the same hearing profile can differ dramatically in their ability to hear speech in noise.”

Researchers at the Auditory Neuroscience Laboratory had 18 musicians and 19 non-musicians between the ages of 45 and 65 carry out a series of tests of speech in noise, memory and processing ability.

The musicians beat the non-musician group in all tests except one.

Kraus said the experience of extracting meaningful sounds from a complex soundscape enhances the development of auditory skills.

“The neural enhancements we see in musically-trained individuals are not just an amplifying or ‘volume knob’ effect,” Kraus said in a statement.

“Playing music engages their ability to extract relevant patterns, including the sound of their own instrument, harmonies and rhythms.”

Kraus said music training “fine-tunes” the nervous system.

“Sound is the stock in trade of the musician in much the same way that a painter of portraits is keenly attuned to the visual attributes of the paint that will convey his or her subject,” she said in a statement.

“If the materials that you work with are sound, then it is reasonable to suppose that all of your faculties involved with taking it in, holding it in memory and relating physically to it should be sharpened.”

“Music experience bolsters the elements that combat age-related communication problems.”

The study was published in the journal PLoS ONE.


MRSA Discovered In Bedbugs

Bedbugs are generally not viewed as a major public health threat, but a team of Canadian scientists has found drug-resistant staph bacteria in bedbugs taken from three hospital patients from an impoverished neighborhood in Vancouver.

Although bedbugs are not known to spread disease, they do lead to scratching, which may tear the skin and put people at risk of bacterial infection, said Dr. Marc Romney, an author on the study.

A study published Wednesday in the journal Emerging Infectious Diseases by the US Centers for Disease Control and Prevention (CDC) suggests that the tiny pests could play a role in disease transmission. Bedbugs collected from patients living in crowded conditions in the Vancouver neighborhood were found to carry the drug-resistant bacteria known as MRSA — methicillin-resistant Staphylococcus aureus.

There is no evidence yet that the bedbugs actually spread the MRSA or a less dangerous germ also found in the insects, vancomycin-resistant Enterococcus faecium (VRE). Still, the study is “an intriguing finding” that needs to be looked at further, said Romney, a microbiologist at St. Paul’s Hospital in Vancouver.

Romney said he and his colleagues decided to do the research after seeing a boom in bedbugs and MRSA cases from the Downtown Eastside neighborhood. They collected five bedbugs that they crushed and analyzed. Three were found to carry MRSA, which can become deadly if it gets through the skin and into the bloodstream.

Two of the bugs had VRE, which is less dangerous than MRSA.

Both germs can often be found in hospitals, and experts have been far more worried about nurses and other healthcare workers spreading the bacteria than insects.

It is not clear if the bacteria originated with the bedbugs or if the bugs picked it up from infected people, Romney added.

“While the findings of this study are likely to raise concerns about bedbugs and bacterial transmission in impoverished communities, our primary concern for the public at large remains the psychological impact bedbugs have on those suffering from infestations,” Jeffrey White, a research entomologist for Bedbug Central, a website dedicated to information on bedbug issues, told FoxNews.com.

“We understand the anxiety this study’s findings may cause amongst the general public, however, the study only confirms what has long been suspected and more research needs to be conducted to understand the value of this information,” White said.

While bedbugs were nearly exterminated in North America in the mid-20th century, increased global travel has contributed to their resurgence in recent years, according to the CDC. Bedbugs can hide in clothing and in furniture.

While there is no solid proof that people have caught MRSA from bedbug bites, the insects “may act as a hidden environmental reservoir” for the bacteria, the authors wrote. “Bedbugs carrying MRSA and/or VRE may have the potential to act as vectors for transmission.”

The researchers did not confirm whether the bacteria were on the outside of each bug or living and growing inside it, which would suggest the possibility of biological transmission.

But even if the bacteria were only carried on the bugs’ exteriors, the finding is still significant, said Romney, because bedbugs could spread the germ from person to person, especially in crowded settings such as the homeless shelters of the downtrodden Vancouver neighborhood where the MRSA-carrying bedbugs were found.

Image Credit: CDC/ Harvard University, Dr. Gary Alpert; Dr. Harold Harlan; Richard Pollack. Photo Credit: Piotr Naskrecki


The Urea Cycle: An Anabolic Steroid For Diatoms

The urea cycle is a metabolic pathway that mammals use to incorporate excess nitrogen into urea and remove it from the body. In the group of algae known as diatoms, however, it appears to play a far more wide-ranging role. Scientists from the Max Planck Institute of Molecular Plant Physiology in Potsdam are part of an international team of researchers that has identified the urea cycle in diatoms as a distribution and recycling center for inorganic carbon and nitrogen. The urea cycle plays a key role in the fixation of the two elements and also helps diatoms recover from short-term nutrient withdrawal and respond immediately to a greater supply of food by increasing their metabolic and growth rates. Genes that reached the diatom genome through lateral gene transfer contribute to this capacity.

Diatoms are the main component of phytoplankton and thus form the basis of the marine food chain. Because they carry out photosynthesis with their chloroplasts, they account for a large proportion of the oxygen production in the earth’s atmosphere. The fact that diatoms have a urea cycle, something that was originally believed to exist only in multicellular organisms, could explain their success in colonising the oceans. Moreover, the diatom cell nucleus contains genes that migrated to the diatom genome from chloroplasts and bacteria. An international team of researchers including Alisdair Fernie from the Max Planck Institute of Molecular Plant Physiology in Potsdam set itself the task of identifying the contribution made by the urea cycle to the metabolism of diatoms.

In the laboratory the researchers reconstructed the upwelling phenomenon found in the ocean, which causes nutrient-rich water to rise from deeper areas to the surface and thus to the diatom habitat. Diatoms respond immediately to this excess supply of nutrients following a period of nutrient deprivation by increasing their rates of growth and proliferation. The researchers compared the reaction of normal cells with cells lacking a functioning urea cycle. The growth rate in the cell lines without a functioning urea cycle turned out to be 15 to 30 percent lower than in the normal cells. From this it may be deduced that the urea cycle in diatoms serves in the formation of carbonaceous and nitrogenous compounds. This observation is surprising, as animals mainly use the urea cycle to dispose of excess nitrogen and to regulate their mineral balance.

In evolutionary terms, it would appear that the animal urea cycle developed from an older metabolic pathway. This discovery thus throws a new light on the phylogenetic relationships between diatoms, plants and animals. Before diatoms developed the capacity to carry out photosynthesis, which positions them closer to plants and green algae in phylogenetic terms, they may have been more closely related to the evolutionary ancestors of animals.

The same experiments revealed that a defective urea cycle also has a negative impact on other processes, such as cell wall synthesis and the citric acid cycle in the diatoms. “Our findings indicate that the different metabolic paths in diatoms are extremely well connected,” explains Fernie.

The number of metabolic paths branching off from the urea cycle in diatoms is particularly high. For example, arginine and ornithine, the intermediate products of urea synthesis, are used in the development of components of the cell wall. The diatoms obtained the enzymes necessary for this through lateral gene transfer from bacteria. The reason for the superiority of diatoms over other unicellular organisms found in the oceans lies in the good connectivity between the different metabolic pathways.


Neanderthals May Have Died Out Earlier Than Believed

Researchers have new evidence suggesting that Neanderthals died out much earlier than previously thought, possibly before modern humans arrived.

Carbon-dated Neanderthal remains from a cave in the foothills of the Caucasus Mountains in Russia were found to be 10,000 years older than previous research had suggested. The new evidence contradicts the popular theory that Neanderthals and modern humans interacted for thousands of years before the archaic species became extinct.

Instead, the researchers believe any co-existence between the two species is likely to have been far more restricted, perhaps a few hundred years at most. It is quite possible that in some areas Neanderthals became extinct before modern humans moved out of Africa.

The remains from the cave, known as Mezmaiskaya, were dated with a precise carbon-dating technique, said paleoanthropologist Thomas Higham of the University of Oxford, UK, a co-author of a study published in the Proceedings of the National Academy of Sciences.

Higham’s team says the implication is that Neanderthals and humans may never have met in Europe. However, the Neanderthal genome, decoded by scientists last year, suggests that the ancestors of all humans, except those from Africa, interbred with Neanderthals somewhere.

“DNA results show that there was admixture probably at some stage in our human ancestry, but it more than likely happened quite a long time before humans arrived in Europe,” says Ron Pinhasi, an archaeologist at University College Cork in Ireland, who is lead author of the latest study. “I don’t believe there were regions where Neanderthals were living next to modern humans. I just don’t find it very feasible.”

The researchers found the fossil in question to be 39,700 years old, instead of the previously assigned 30,000 years. The dating implies that Neanderthals did not survive at the cave site beyond that time.

The research suggests that if we are to have accurate chronologies, the dating needs to be improved so that possible associations between Neanderthals and early modern humans can be properly assessed. The previous dating processes seem to have “systematically underestimated” the true age of Late Middle Paleolithic and Early Upper Paleolithic deposits, artifacts and fossils by up to several thousand years, says the research report.

“The latest dating techniques mean we can purify the collagen extracted from tiny fragments of fossil very effectively without contaminating it,” said Higham. “Previously, research teams have provided younger dates which we now know are not robust, possibly because the fossil has become contaminated with more modern particles.”

“This latest dating evidence sheds further light on the extinction dates for Neanderthals in this key region, which is seen by many as a crossroads for the movement of modern humans into the wider Russian plains. The extinction of Neanderthals here is, therefore, an indicator we think, of when that first probably happened,” said Higham.

Carbon dating of stone tools characteristic of both humans and Neanderthals, as well as their remains, had previously suggested that the first humans to reach Europe, between 40,000 and 30,000 years ago, shared the land with Neanderthals believed to have been long established there.

However, carbon dating of remains older than 30,000 years is tricky because nearly all the radioactive carbon in the remains has decayed, said Higham.
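That point can be checked with the exponential decay law: with a carbon-14 half-life of about 5,730 years (the standard figure, not from the article), the fraction of the original C-14 remaining after t years is 0.5^(t / 5730). A minimal sketch:

```python
C14_HALF_LIFE = 5730.0  # years, the conventional carbon-14 half-life

def fraction_remaining(age_years: float) -> float:
    """Fraction of the original carbon-14 still present after age_years."""
    return 0.5 ** (age_years / C14_HALF_LIFE)

for age in (5730, 30000, 40000):
    print(f"{age:>6} years: {fraction_remaining(age):.2%} of the C-14 remains")
# At 30,000 years only about 2.7% of the original C-14 survives, and at
# 40,000 years under 1%, so trace modern-carbon contamination can
# noticeably skew a measured date toward the present.
```

With so little C-14 left in old samples, even a fraction of a percent of modern carbon mixed into the collagen adds a disproportionately large signal, which is why the purification Higham describes matters.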
The overlap in dating could also be the result of contamination of older finds with younger finds. “What we are finding is that the careful and patient excavation work of many archaeological sites has not been supported by accurate and reliable radiocarbon dating,” Higham said.

But, using the most up-to-date carbon-dating techniques, Pinhasi and his team were able to date the remains of two Neanderthal infants from the cave to close to 40,000 years old. The infants’ bones were found above the cave’s other Neanderthal remains, so they must have been the most recent, Higham explained.

Higham’s conclusion fits with another discovery made by David Reich, a geneticist at Harvard Medical School in Boston, Massachusetts, whose team found that all contemporary humans, except those who trace their roots back to Africa, owe about 1 to 4 percent of their DNA to interbreeding between early modern humans and Neanderthals.

Reich’s team did not find any proof that Neanderthals ever mated with the ancestors of modern Europeans specifically, however.

Yet he said that new, more sensitive methods for detecting interbreeding, as well as genome sequences from late Neanderthals, could change that conclusion. “Absence of evidence isn’t evidence of absence,” said Reich.

Chris Stringer, a paleoanthropologist at the Natural History Museum in London, agrees that Neanderthals were rare in Europe after 40,000 years ago, but added that they might not have completely disappeared.

“It does seem that if Neanderthal populations existed after that time, they must have been small and scattered remnants,” he said.

There is evidence of more recent Neanderthal settlements. Clive Finlayson, director of the Gibraltar Museum, and his colleagues have dated a Neanderthal settlement in Gorham’s Cave in Gibraltar to as recently as 24,000 years ago.

“Eurasia is a big place and there doesn’t seem to be any reason why populations of Neanderthals may not have survived somewhere,” said Higham.

The University of Oxford and University College Cork researchers collaborated with the Laboratory of Prehistory at St Petersburg, Russia. The study was funded by Science Foundation Ireland.


Personality Affects How Likely We Are To Take Our Medication

The results of a unique study from the University of Gothenburg, Sweden, show that personality has an impact on how likely people are to take their medication. The first major study of its kind, it was published in the online journal PLoS ONE.

The study was based on 749 people with chronic diseases who responded to a questionnaire on medication adherence behaviour, that is, whether they take their medicine. Their personalities were also assessed using a second questionnaire, the NEO Five Factor Inventory (NEO-FFI), which comprises 60 statements, each with five response options. The questionnaire measures five personality traits: neuroticism, extroversion, openness to experience, agreeableness and conscientiousness.
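As a rough illustration of how a five-factor inventory of this shape yields trait scores, the sketch below totals five-point responses per trait. The item-to-trait assignment and response coding here are hypothetical placeholders, not the actual NEO-FFI scoring key, which is proprietary and also reverse-scores some items:

```python
# Hypothetical five-factor scoring: 60 statements, 12 per trait, each
# answered on a five-point scale coded 0 (strongly disagree) to
# 4 (strongly agree). The item ordering below is illustrative only.

TRAITS = ["neuroticism", "extroversion", "openness",
          "agreeableness", "conscientiousness"]

def score_inventory(responses):
    """Sum 60 responses into five trait scores (0-48 each), assuming
    items simply cycle through the traits in order."""
    if len(responses) != 60:
        raise ValueError("expected exactly 60 responses")
    scores = {trait: 0 for trait in TRAITS}
    for i, answer in enumerate(responses):
        scores[TRAITS[i % 5]] += answer
    return scores

# A respondent answering the midpoint (2) everywhere scores 24 per trait
print(score_inventory([2] * 60))
```

In the study, high or low totals on scales like these are what the researchers correlated with self-reported adherence.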

A person who scores high on conscientiousness can be described as goal-oriented and structured. In the study this tied in well with how such people approached their medication: they were careful to follow the doctor’s prescription. The trait of neuroticism, by contrast, can mean that a person is fairly anxious, which, according to the study, had a negative impact on taking medication.

The researchers’ results show that high scores for neuroticism and low scores for conscientiousness were both linked to lower levels of adherence. Agreeableness, meanwhile, had a positive correlation with taking medication as prescribed.

“If the person with the trait of agreeableness also had a low score for conscientiousness, and is thus less methodical, this seemed to have a negative effect on medication adherence,” says Malin Axelsson.

Her explanation for this is that people with high scores for conscientiousness are perhaps more likely to stick to their medication on account of a more structured temperament. On the other hand, those with low scores for the same personality trait can be described as slightly more unstructured and perhaps less inclined to introduce an element of routine into taking their medication.

“Both types may need different kinds of education and/or support,” says Axelsson. “As such, it may be important to take different dominant personality traits into account when treating patients with chronic diseases. The results of similarly formulated interview questionnaires could help people to become more aware of their medication and access more tailored support and/or education from healthcare professionals.”


Spikemoss Genome Offers New Paths For Biofuels Research–Bridges Plant Development Gap

It’s not quite Christmas, but the DNA sequence of a small plant that resembles the seasonal conifers is providing biofuels researchers with information that could influence the development of candidate biofuel feedstock plants and offering botanists long-awaited insights into plant evolution.

“When you burn coal, you’re burning Selaginella’s ancestors,” said Purdue University botanist Jody Banks, who originally proposed that the U.S. Department of Energy (DOE) Joint Genome Institute (JGI) sequence the plant more commonly known as spikemoss as part of the DOE JGI’s 2005 Community Sequencing Program.

In a paper published online May 5 in Science Express, a team of researchers from over 60 institutions, including DOE JGI’s Dan Rokhsar and Igor Grigoriev, the senior authors of this work, reported the genome sequence of Selaginella moellendorffii and used a comparative genomics approach to identify the core genes likely to be present in a common ancestor of land plants.

Grigoriev noted that the Selaginella genome helps fill in a large gap in plant evolution between the unicellular green alga Chlamydomonas, sequenced at the DOE JGI and published in 2007, and flowering plants with vascular systems. “Selaginella occupies a phylogenetically important position for which we had no reference,” he said. “On one end of the spectrum we had mosses such as Physcomitrella” (the first moss to have its genome sequenced and published by DOE JGI) “and on the other are angiosperms such as grasses including Brachypodium,” whose genome was published by DOE JGI last year.

Spikemoss stands tall like grasses, but because it diverged from flowering plants more than 400 million years ago, it doesn’t have roots and leaves like those of later plants. To help untangle these relationships, the researchers compared the genome of Selaginella against those of Chlamydomonas, Physcomitrella and 14 angiosperms (flowering plants), including Arabidopsis and rice, to identify common genes.

Banks said having the spikemoss genome revealed that the transition from mosses to plants with vascular systems didn’t involve as many genes as going from a vascular plant that doesn’t produce flowers to one that does. “We have a much better idea with Selaginella which genes evolved only in angiosperms. Plants need vascular tissues to be tall, to transport nutrients from roots to leaves,” she said. “That’s fairly complicated, but it turns out that process just didn’t need that many genes compared to inventing flowers.”

To help vascular tissues stay upright, plants rely on lignin, a polymer biofuels researchers are targeting because its rigid structure is challenging to break down, impeding the use of lignin-rich plants as bioenergy feedstocks. Banks’ Purdue colleague Clint Chapple, a coauthor on the paper, has been using the Selaginella genome to study the pathways by which three different types of lignin are synthesized in plants.

“What we learned is that Selaginella not only invented the S type of lignin independently of, and maybe even earlier than, angiosperms, but that it goes about doing it through a related but different chemical route,” Chapple said. He described a recent project [funded by the National Science Foundation] in which enzymes from the lignin-synthesizing pathway in Selaginella were used to modify the canonical lignin-producing pathway in Arabidopsis to produce the polymer. Having the genome sequence offers strategic research opportunities, he said. “We’ve known for some time that if you alter the lignin building blocks you can improve biomass for agricultural and industrial uses.”

Banks also noted that the Selaginella research community has grown up around the availability of the genome, which was made publicly available through the DOE JGI’s plant portal Phytozome in 2009.  One metric she cites is the number of researchers who’ve contributed to the Selaginella Genomics wiki she helps maintain, whose existence spread solely by word of mouth. “There are more than 100 coauthors now just because people are interested in the genome,” she said. “There have been a large number of recent papers, all including Selaginella genes because it really helps the researcher understand the evolution of their favorite gene family. Selaginella represents a whole branch of the plant evolutionary tree that no one has sampled before, and it is really important. The lignin story is just one example.”

The U.S. Department of Energy Joint Genome Institute, supported by the DOE Office of Science, is committed to advancing genomics in support of DOE missions related to clean energy generation and environmental characterization and cleanup. DOE JGI, headquartered in Walnut Creek, Calif., provides integrated high-throughput sequencing and computational analysis that enable systems-based scientific approaches to these challenges. Follow DOE JGI on Twitter.


Vatican Science Panel Calls Attention To The Threat Of Glacial Melt

Pontifical Academy of Sciences working group of leading scientists to present report to Pope Benedict XVI

Scripps Institution of Oceanography / University of California, San Diego

A panel of some of the world’s leading climate and glacier scientists, co-chaired by a Scripps Institution of Oceanography, UC San Diego researcher, today issued a report commissioned by the Vatican’s Pontifical Academy of Sciences citing society’s moral imperative to properly address climate change.

The co-authors of “Fate of Mountain Glaciers in the Anthropocene” list numerous examples of glacial decline around the world and the evidence linking that decline to human-caused changes in climate and air pollution. The threat to the ways of life of people dependent upon glaciers and snow packs for water supplies compels immediate action to mitigate the effects of climate change and to adapt to what changes are happening now and are projected to happen in the future.

“We are committed to ensuring that all inhabitants of this planet receive their daily bread, fresh air to breathe and clean water to drink as we are aware that, if we want justice and peace, we must protect the habitat that sustains us,” the authors write in a declaration prefacing the report. “The believers among us ask God to grant us this wish.”

Scripps Climate and Atmospheric Scientist Veerabhadran Ramanathan co-chaired the working group with Nobel Laureate Paul Crutzen, formerly affiliated with Scripps, and Lennart Bengtsson, former head of the European weather forecasting center. The group also included Nobel Laureate Carlo Rubbia, former director general of the CERN Laboratory. Among the rest of the 24 authors are Lonnie Thompson of Ohio State University, Wilfried Haeberli from Switzerland, Georg Kaser from Austria and Anil Kulkarni from India, considered among the world’s foremost experts on glacial change. Former Scripps Director Charles Kennel and Scripps Professor of Atmospheric Chemistry Lynn Russell are also members of the working group.

“The widespread loss of snow and ice in the mountain glaciers is one of the most visible changes attributable to global climate change. The disintegration of many small glaciers in the Himalayas is most disturbing to me since this region serves as the water tower of Asia and since both the greenhouse gases and air pollutants like soot and ozone contribute to the melting,” said Ramanathan, who has been a member of the Pontifical Academy of Sciences since 2004.

Report authors met at the Vatican from April 2 to April 4, 2011, at the invitation of Chancellor Marcelo Sanchez Sorondo of the pontifical academy. The report was issued by the Vatican today and will be presented to Pope Benedict XVI.

Though scientists usually refrain from proposing action, Ramanathan said the circumstances warranted advancing suggestions from the working group. The authors recommend pursuit of three measures: immediate reduction of worldwide carbon dioxide emissions, reduction of concentrations of warming air pollutants such as soot, ozone, methane and hydrofluorocarbons by up to 50 percent, and preparation to adapt to climate changes that society will not be able to mitigate.

The report title refers to the term coined by Crutzen to describe what is considered a new geologic epoch that began when the impacts of mankind on the planet became a major factor in environmental and climate changes.

“The recent changes observed in glacial behavior are due to a complex mix of causal factors that include greenhouse gas forcing together with large scale emissions of dark soot particles and dust in ‘brown clouds’, and the associated changes in regional atmospheric energy and moisture content, all of which result in significant warming at higher altitudes, not least in the Himalayas,” the authors write.

“Changes of mountain glaciers all around the world are rapid and impacts are expected to be detrimental, particularly in the high mountains of South America and Asia,” said Kaser, of the Institute for Meteorology and Geophysics at the University of Innsbruck. “Yet our understanding of glacier changes in these regions is still limited, and ambitious, joint efforts are required to respond to these problems. With its report, the pontifical academy contributes considerably to raising awareness.”

“Glaciers are one of our most visible evidences of global climate change,” added Thompson. “They integrate many climate variables in the Earth system. Their loss is readily apparent and they have no political agenda. Glaciers remind us of the stunning beauty of nature and in turn the urgency of doing everything in our power to protect it.”

The authors conclude: “We appeal to all nations to develop and implement, without delay, effective and fair policies to reduce the causes and impacts of climate change on communities and ecosystems, including mountain glaciers and their watersheds, aware that we all live in the same home. By acting now, in the spirit of common but differentiated responsibility, we accept our duty to one another and to the stewardship of a planet blessed with the gift of life.”


Youth Trained To Design, Develop And Market Apps

NSF-funded Youth Radio’s Mobile Action Lab trains young people to design, develop and market apps based on community needs

In nearly 20 years, Youth Radio has grown from a small radio skills training program in Berkeley, Calif., to a national organization with bureaus in Los Angeles, Atlanta and Washington, D.C. The program has helped young people develop marketable behind-the-scenes and on-air skills, winning it some of the most coveted awards in journalism.

Now it’s taking on another venture: app development.

Called the Mobile Action Lab, Youth Radio’s new initiative helps young people learn to propose, design and market computer and smartphone-based apps that serve community needs. Meanwhile, the skills participants learn are valuable commodities in today’s tech-driven economy.

Mobile Action Lab provides 14- to 24-year-olds with training and hands-on experience in media and in science, technology, engineering and mathematics (STEM). It pairs young people with professional app developers to create five mobile and web-based apps that serve real needs in the community, such as finding free food distribution sites, improving relations between youth and police, and locating other resources.

Youth Radio was one of the 2010 winners of the MacArthur Foundation’s Digital Media Learning Competition. The award provided a key financial investment to start the Mobile Action Lab. Based in Oakland, Calif., and funded by the National Science Foundation, the Mobile Action Lab officially launched in September of the same year.

“Youth Radio launched its Mobile Action Lab to expand our science and tech offerings for youth and to leverage the potential of mobile platforms to create high-impact digital projects,” said Elisabeth “Lissa” Soep, Senior Producer and Research Director of Mobile Action Lab. “Based on challenges in public education, transformations in media worlds and opportunities to spark STEM learning, Youth Radio decided to capitalize on the talent of its young people and its network of professional colleagues by teaching young people to create new technology platforms. Apps increasingly determine who knows what, how news travels and what makes change possible.”

According to Soep, the Mobile Action Lab strives to “lower barriers that have traditionally blocked teens and young adults from learning to develop innovative tech platforms, which is especially significant for those who haven’t had access to excellent, engaging STEM teaching in schools.”

Soep also explained how the Mobile Action Lab connects young people, especially low-income youth and youth of color, with tech developers, engineers and entrepreneurs, and prepares its graduates with the skill sets to “configure design-development teams and play key roles in future tech-based projects–from conception through research, design and development, testing, launch and analysis.”

The app process

Although the specific development process differs for each app, in all of the projects, young people are deeply involved, from brainstorming the app’s concept to marketing the app and planning its distribution. After the general app concept has been defined, young people recruit a team to work on the app, research the app market and types of potential users, define and diagram the functionality of the app, and design the app’s “look and feel.” In quarterly developer workshops, young people learn how to program and code simple apps, which sometimes serve as prototypes for fully developed products, using the Google App Inventor.

“Through hands-on workshops, we’re not just theorizing about app development–we’re actually learning the coding side,” said Asha Richardson, a project associate at Mobile Action Lab. “That’s something not enough people know–what programming is–not enough schools talk about it. App Inventor uses coding blocks, and you have to know which parts go where in order to make your app do what you want it to do. You move the coding blocks around, then throw it on the phone, see if it works, fix it, and then try again.”

Austin De Rubira explained how the principles he learned at this workshop serve an integral role in Mobile Action Lab. One of these principles is the “iterative development process.”

“It’s basically creating something through trial and error–you try something, and if it doesn’t work, you fix it until it does work,” explained De Rubira, an intern at Mobile Action Lab. “That was a really hands-on definition of the iterative development process. It definitely clicked in my mind and it really made a lot of sense to me. Iterative development is a very effective way to work on apps, and that’s how I’ve been thinking about our entire process with the Mobile Action Lab.”
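
This build-test-fix loop can be sketched in a few lines of Python. The sketch below is purely illustrative: the `build` function and its three-iteration threshold are invented stand-ins for the real rounds of app development.

```python
# Toy sketch of the iterative (trial-and-error) development loop:
# build something, test it, and if it fails, fix it and try again.

def build(version):
    """Hypothetical build step: the app only 'works' after enough iterations."""
    return {"version": version, "works": version >= 3}

version = 1
app = build(version)
while not app["works"]:   # test the app; if it doesn't work...
    version += 1          # ...fix it (here, just advance to the next attempt)...
    app = build(version)  # ...and build again.

print(version)  # 3 iterations before the app passes its test
```

The point of the pattern is that each cycle produces something testable, so flaws surface early rather than at the end.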

Once the coding process is complete, the apps are tested to make sure they function as designed and the final product is user-friendly. Young people research potential distribution partners and co-create promotion plans to market their product.

De Rubira described the tediousness and attention to detail required of this process. “I’ve learned that an app, if any part of it isn’t good, it’s not going to be a good app. If it doesn’t have good graphics, you won’t be interested,” said De Rubira. “If it’s a bad concept, obviously that won’t work. The interface is really important too. If it’s not intuitive to work with, the functionality has to be such that–in addition to being fun to play with, it has to do something that’ll make you interested in making you come back.”

App projects

For each of these app projects, Mobile Action Lab will work with students to track statistics relating to each app’s usage, such as the number of installs, new and returning visitors, page views and the duration of time spent using the app. Additional app development is underway, including an app that strives to increase communication and understanding between youth and the police, and to educate youth about their rights and responsibilities as they relate to the legal system.
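
As a rough illustration of how such statistics might be aggregated from raw usage logs, here is a minimal sketch; the field names and sample data are invented for the example.

```python
# Hypothetical session log for one app; each entry is a single visit.
sessions = [
    {"user": "a", "first_visit": True,  "page_views": 4, "minutes": 6},
    {"user": "b", "first_visit": False, "page_views": 2, "minutes": 3},
    {"user": "a", "first_visit": False, "page_views": 5, "minutes": 8},
]

new_visits = sum(s["first_visit"] for s in sessions)       # proxy for installs
returning_visits = len(sessions) - new_visits
total_page_views = sum(s["page_views"] for s in sessions)
avg_minutes = sum(s["minutes"] for s in sessions) / len(sessions)

print(new_visits, returning_visits, total_page_views, round(avg_minutes, 1))
```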

Forage City: The goal of Forage City is to create an app that mediates the process of gathering and redistributing excess food from backyard trees to people in need. This way, food that would otherwise rot and go to waste can reach those who need it. The app uses crowdsourcing, which enables the public to collectively complete specific tasks that traditionally may have been performed by specific employees or contractors. Users include residents and non-profit organizations, such as homeless shelters, food banks, youth organizations and afterschool programs across the United States.

“The goal is to improve food equity by enabling users to have access to fresh produce that is lacking in poor communities,” explained Soep.

The Forage City app is being developed in collaboration with Forage Oakland’s Asiya Wadud and designers from UC Berkeley’s Information School. This app will be accessed through the Web and smartphones, and the Beta is anticipated for June 2011.

VoxPop: Soep explained that the goal of VoxPop is to provide mobile and interactive radio that is “glocal” (global and local). It enables “users to share stories and report news from around the world, including ‘hot spots,’ such as Japan and Egypt, and regions that often get partial and/or distorted treatment in mainstream media,” said Soep. VoxPop is being developed in collaboration with Youth Speaks, the nation’s leading producer and presenter of youth spoken word, and developers from Stanford University. This app will be accessed through the iTunes store. The Beta is anticipated for July 2011.

All Day Play: All Day Play is Youth Radio’s online radio station and music site that streams hip-hop/eclectic music, including music chosen by some of the area’s hottest DJs, new and established artists and Youth Radio interns and graduates who are pursuing music careers. The initial Beta version of the app was created at one of Mobile Action Lab’s “App Inventor” workshops and is anticipated to be released in May 2011. It will be accessible through the Android marketplace.

STEM careers and developing career-focused skills

Another goal of Mobile Action Lab is to show young people that STEM subjects can be fun to study and can lead to career paths that are exciting and fulfilling, as well as practical.

“I hope we can get more young people excited about coding, math, designing and science,” said Richardson, who explained why she thought that many young people fail to find these subjects interesting or appealing.

“It doesn’t look as glorifying or amazing to be a computer scientist as it would to be a lawyer or doctor,” said Richardson. “The stereotype is people in cubicles who aren’t very social and lead awkward lifestyles. That’s how it’s presented.”

These stereotypes may be reinforced because some young people might not be fully aware of the issues addressed by different scientific fields. For instance, Richardson noted that for a long time, she didn’t know what was meant by the term “engineering.” “You weren’t shown that in school,” she said.

By contrast, the Mobile Action Lab engages young people in hands-on activities that address relevant, real-world issues. In addition to teaching young people the technical side of app programming, these activities show them that scientific careers involving the application of science, math and business skills outside the lab can be exciting.

“We’re meeting people in the field, developers and designers. And they’re outgoing people! They like to talk about their work and they travel … they get to do all these amazing things that you wouldn’t think a computer scientist does,” said Richardson. “It gives you a face to counteract the images out there.”

“We’re not just teaching science and math. We’re teaching how you develop something, how you sell it–even if it’s free, we want a lot of people to use it,” said Richardson about the app development process. “So now, even when I’m sitting down with something I’ve purchased, I look at how it works and why I like it. It makes us better designers of what we want to do. Whenever I download new apps, I’m looking at the color schemes, how many buttons it has, how it is actually working and whether it’s intuitive.”

“It makes me a more critical thinker all around,” said Richardson.

Ellen Ferrante, National Science Foundation

Image Caption: Asha Richardson takes a photo for a paper prototype video produced to demonstrate the key functions of the Forage City app. Credit: Youth Radio

On the Net:

Is Happiness Coded Into Our Genes?

Is our general outlook on life coded into our DNA at birth? Researchers at the London School of Economics and Political Science have discovered that those of us with a functional variant of the 5-HTT gene tend toward general happiness about life, The Telegraph reports.

The 5-HTT gene comes in long and short versions and is involved in the transport of serotonin, a feel-good chemical in the brain. The longer variant allows more efficient release and recycling of the neurotransmitter, creating a sense of well-being.

In a study of more than 2,500 Americans, the variants of the gene influenced how satisfied or dissatisfied people were with their lives, The Guardian reports. People born with two long versions of the gene were more likely to describe themselves as “very satisfied” with life than those who had two short versions. One copy of the gene is inherited from each parent.

The study marks a tentative step towards explaining the mystery of why some people seem naturally happier than others.

“This gives us more insight into the biological mechanisms that influence life satisfaction,” said researcher Jan-Emmanuel De Neve. “If you’re feeling down, you can say it’s your biology telling you life is less rosy than it is,” he added.

Genetic coding is only one of the many factors behind people’s outlook. Research with twins suggests that genes account for roughly a third to a half of the variation in happiness between people. How many genes it takes to affect how cheerful we are has not yet been determined.

De Neve analyzed the genetic makeup of 2,574 people chosen as representative of the general population. Their medical histories had been recorded for the US National Longitudinal Study of Adolescent Health.

Among the records were answers to a question about life satisfaction that participants were asked in their early 20s. They chose from a list of six answers ranging from “very satisfied” to “very dissatisfied.”

De Neve’s results, published in the Journal of Human Genetics, show that roughly 40 percent of participants described themselves as “very satisfied” with life; among these, 35.4 percent had two long variants of the gene and only 19.1 percent had two short versions.

Of those who were “dissatisfied” with life, 26.2 percent had two long variants of the gene, while 20 percent had two short versions, indicating a slight over-representation of the long variants in happier people.
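
A quick back-of-the-envelope check, using only the percentages reported above, makes the long variant’s over-representation among the happier respondents easy to see:

```python
# Percentages reported in the study (share of each group carrying each genotype).
very_satisfied = {"two_long": 35.4, "two_short": 19.1}
dissatisfied = {"two_long": 26.2, "two_short": 20.0}

def long_to_short_ratio(group):
    """How over-represented the two-long genotype is relative to two-short."""
    return group["two_long"] / group["two_short"]

print(round(long_to_short_ratio(very_satisfied), 2))  # 1.85
print(round(long_to_short_ratio(dissatisfied), 2))    # 1.31
```

The ratio is markedly higher in the “very satisfied” group, which is the pattern the study describes.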

Everything else being equal, De Neve concluded that having one long version of the gene increased the number of people claiming to be “very satisfied” by around 8.5 percent. Having two long versions raised the number by 17.3 percent.

De Neve emphasized, however, that having two short versions of the gene did not mean a person was destined to lead a miserable life, any more than two long versions would make someone impervious to a pessimistic outlook.

“This gene has an important influence, but you cannot say it causes happiness. Happiness is hugely complex and your experiences throughout the course of your life will remain the dominant force on that,” he said.


The TeraGrid Community Steps Up To Help Japan In Crisis

Advanced computation enabled by supercomputers enhances understanding of earthquake and tsunamis and their impacts

Summary: Earthquake and tsunami. March 11, 2011. Japan.

  • More than 26,000 people are dead or missing and an estimated 400,000 are homeless.
  • An estimated 25 trillion yen, or $330 billion, in damage makes it the most costly natural disaster on record. The estimate is more than three times that of the second most costly natural disaster, also an earthquake in Japan.
  • The ruptured Fukushima Daiichi power plant threatens people in northeastern Japan and may have an impact on the ocean and atmosphere far beyond Japanese shores.
  • A fractured power grid and rolling blackouts adversely affect essential services that rely on digital resources.
  • Nearly a quarter of Japan’s total geography has been altered.

Experts think it could take years before Japan’s basic resources are back online, and as spending is prioritized for more urgent humanitarian needs, restoring some resources may be prolonged even more. Digital resources fall into that category.

Essential services that depend on computing power, such as security, transportation, education and building services like air conditioning and elevators in skyscrapers, will probably be given lower priority than services involving health care.

Moreover, many of Japan’s industrial and research communities have been severely impacted. A great number of the products used in daily life come from factories that were destroyed.

So researchers, and particularly members of the National Science Foundation’s TeraGrid community, have stepped up to offer help in the form of short-term and some long-term solutions.

What contributions can be made by the science community?

TeraGrid is the world’s most comprehensive cyberinfrastructure in support of open scientific research. The researchers who support and use this resource form a peerless, multidisciplinary fraternity of innovators and problem solvers.

The following are a few ways the TeraGrid community has begun to help the people of Japan–gestures that have minimal cost to the U.S. research community, while proving to be extremely beneficial to researchers in Japan in the wake of this global tragedy:

1. The Keeneland Project at Georgia Tech has collaborated closely with Tokyo Tech over the past two years on developing innovative computer architectures and software that use graphics processors. Georgia Tech’s Keeneland Initial Delivery system’s combination of architecture and software is nearly identical to Tokyo Tech’s TSUBAME2.0. Currently in preproduction mode, the Keeneland team is working with a select group of early adopters to develop programming tools and libraries for applications on graphics processing units. As a result of the disaster, the Keeneland team is exploring ways to provide cycles and storage from Keeneland to colleagues at Tokyo Tech, so that Japanese researchers can continue their important work during the summer, when the demand for power will exceed the available supply and force the temporary shutdown of TSUBAME2.0.

2. Indiana University (IU) provided assistance to the international emergency response community via the U.S. National Aeronautics and Space Administration (NASA)-funded E-DECIDER and QuakeSim projects in the weeks following the disaster. IU staff assisted with the creation of Level 0 satellite data products for the International Charter, which provides a unified system of space data acquisition and delivery to those affected by natural or man-made disasters. IU also made a global analysis image from the very coarse-grained (250-meter resolution) satellite data of the NASA MODIS Rapid Response System that revealed the tsunami inundation area, using change-detection algorithms that compared before and after images. Early analysis showed damage more than three kilometers inland in places, which was later confirmed with higher resolution images. Earthquakes are an inevitable threat to many areas of the U.S., not just the West Coast. The Great Central U.S. ShakeOut is attempting to highlight this danger and help emergency responders, public health officials, government agencies and the general public prepare. Understanding the recent major earthquakes in Haiti, Chile and Japan is a worthy goal that will directly benefit United States citizens.
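
The change-detection step can be illustrated with a toy example: compare a “before” and “after” image pixel by pixel and flag large differences. The numbers below are invented, and real MODIS processing is far more involved.

```python
# Toy 3x3 "before" and "after" images (e.g. surface reflectance values).
before = [[0.8, 0.8, 0.2],
          [0.8, 0.7, 0.2],
          [0.9, 0.8, 0.3]]
after  = [[0.8, 0.2, 0.2],
          [0.8, 0.2, 0.2],
          [0.9, 0.3, 0.3]]

# Flag any pixel whose value changed by more than a threshold.
threshold = 0.3
changed = [[abs(a - b) > threshold for a, b in zip(row_a, row_b)]
           for row_a, row_b in zip(after, before)]

flagged = sum(sum(row) for row in changed)
print(flagged)  # 3 pixels flagged (the middle column, a toy "inundation zone")
```

The same idea scales to full satellite scenes, where thresholds are typically chosen per band and the differencing is done with array libraries rather than nested lists.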

3. A Louisiana State University professor is collaborating with Japanese colleagues and with researchers from the University of Massachusetts, Woods Hole Oceanographic Institution and others on a large-scale tsunami simulation. With help from volunteers, they quickly prepared a highly accurate global ocean model, using six different observations provided by Japanese collaborators.

LSU and the Louisiana Optical Network Initiative helped in the hours following Hurricane Katrina, by providing the National Oceanic and Atmospheric Administration with emergency access to its high-speed, high-bandwidth networking connections so they could share and transfer critical data quickly from New Orleans. This gesture helped first-responders provide rapid aid.

4. San Diego Supercomputer Center is providing cycles and storage on its Triton and Data Oasis resources to colleagues from the National Institute of Advanced Industrial Science and Technology and Tokyo Institute of Technology. These resources have enabled researchers to continue their Global Earth Observation (GEO) Grid activities, including generation of ground motion maps and analyzing satellite data related to the disaster (some results at disaster.geogrid.org were generated with Triton).

5. The Texas Advanced Computing Center (TACC), which regularly provides a portion of its supercomputer cycles for emergency applications, recently provided Lonestar4 cycles to Japanese researchers from the University of Tokyo and other Japanese universities to model the March 2011 earthquake and tsunami, as well as the path of radioactive material dispersed from the Fukushima Daiichi nuclear plant into the ocean and atmosphere.

TACC helped in the months following the Deepwater Horizon oil rig explosion by donating 6.5 million service units on Ranger to generate a simulation of the anticipated path of the oil spill, which significantly aided mitigation efforts.

Recognizing TACC’s impact, Dell contributed technology to further expand the organization’s efforts to support emergency response. The TACC and Dell teams have since worked to bring together U.S. and Japanese universities in the wake of the earthquake and tsunami.

TeraGrid Forum Chair John Towns is pleased with the immediate response from TeraGrid partners so far, and hopes to see more. “We will work together to develop a more organized and integrated plan to assist Japanese researchers while minimizing the impact to the resources needed by the U.S. research community,” he said. “All requests for TeraGrid resources and services are received via the online system called POPS. Urgent requests are always considered separate and apart from our regular quarterly process,” he added.

The NSF encourages the community to apply for funds that will enable more support through the NSF RAPID grant program. RAPID grants are typically $50,000 to $200,000 for the most relevant projects. The program funds urgent proposals that address the availability of or access to data, facilities, or specialized equipment, including quick-response research on natural or anthropogenic disasters and similar unanticipated events. Applications are accepted via NSF Fastlane through April 29, 2011.

“This isn’t the first time our TeraGrid family took the initiative to help in a crisis,” said NSF’s Barry Schneider, TeraGrid Program Director. “Hopefully their efforts will help Japanese researchers return to some sense of normality, allow the world to gain a better understanding of earthquakes and tsunamis in general, and prevent future loss. It’s a great example of how the U.S. investment in science contributes to global scientific, social, and economic progress,” he added.

Image Caption: The magnitude-8.9 quake that struck Japan at 2:46 p.m. local time on Friday, March 11, spawned coast-slamming tsunamis that crossed the Pacific in less than 21 hours. The tsunami first reached a monitoring buoy just minutes after the quake occurred, and soon thereafter scientists released a forecast of wave heights and arrival times. Colors in this image depict peak wave heights. Near the undersea source of the temblor, about 375 kilometers north-northeast of Tokyo, and southeast of that epicenter, where much of the quake’s energy was focused, the height of the tsunami wave likely exceeded 2.5 meters (depicted in black). But across most of the Pacific, the open-ocean height of the waves, which race across the sea at jetliner speeds, probably remained less than 20 centimeters (yellow and orange). Credit: NOAA


Nicotine and Cocaine: Similar Addiction?

(Ivanhoe Newswire) — New research from the University of Chicago Medical Center has given new insight into just what makes cocaine and nicotine so addictive. According to the research, the effects of nicotine on the regions of the brain associated with addiction are similar to those of cocaine: both create lasting changes in a person’s brain by affecting similar mechanisms of memory on first contact.

Scientists have already established that learning and memory are connected via “synaptic plasticity,” the long-term strengthening and weakening of connections between neurons. The more often two neurons are activated together, the stronger the bond between them becomes, increasing the ability of one neuron to excite the other.
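
This “fire together, wire together” rule can be sketched as a minimal Hebbian weight update; the starting weight and learning rate below are arbitrary choices for the illustration, not values from the study.

```python
def hebbian_update(weight, pre_active, post_active, lr=0.1):
    """Strengthen the connection only when both neurons fire together."""
    if pre_active and post_active:
        weight += lr
    return weight

w = 0.5  # initial connection strength (arbitrary)

# Five co-activations: the bond between the two neurons grows each time.
for _ in range(5):
    w = hebbian_update(w, pre_active=True, post_active=True)

# An activation of only one neuron leaves the weight unchanged.
w = hebbian_update(w, pre_active=True, post_active=False)

print(round(w, 1))  # 1.0
```

Real synaptic plasticity involves far richer dynamics (timing, saturation, weakening), but the core idea is this conditional strengthening.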

Previous research by Daniel McGehee, Ph.D., at the University of Chicago Medical Center has also shown that nicotine can promote plasticity in the region of the brain called the ventral tegmental area, or VTA. Neurons originating from the VTA release dopamine, a neurotransmitter that plays a role in rewards, such as food and sex, and addictions. In a series of new experiments, Danyan Mao, Ph.D., also of the University of Chicago Medical Center, monitored the electrical activity of VTA dopamine neurons in slices of brain taken from adult rats. Every slice was soaked for 15 minutes in a concentration of nicotine that equaled the amount that would reach the brain after smoking one cigarette. After 3 to 5 hours, experiments were conducted in order to detect the presence of synaptic plasticity and find out which neurotransmitter receptors were involved in the development of the synaptic plasticity.

As a result of the experiment, Mao discovered two receptors necessary for synaptic plasticity in the VTA: the acetylcholine receptor located on the dopamine neurons, and the D5 dopamine receptor, a component previously associated with the consumption of cocaine. If either of these receptors was blocked during nicotine exposure, the drug’s ability to cause lasting changes in excitability was eliminated.

The findings from the experiment suggest a reason for why both cocaine and nicotine are such highly addictive substances. Daniel McGehee, Ph.D., neuroscientist and associate professor in the Department of Anesthesia & Critical Care at the Medical Center, was quoted as saying,

“We know without question that there are big differences in the way these drugs affect people. But the idea that nicotine is working on the same circuitry as cocaine does point to why so many people have a hard time quitting tobacco, and why so many who experiment with the drug end up becoming addicted.”

While the results of the experiment also suggest possible strategies for preventing or treating cocaine and nicotine addictions, the use of D5 blockers to treat addiction may be further in the future: currently, all known blockers of the D5 receptor also block the D1 dopamine receptor, which is important for healthy motivation and movement.

Source: Journal of Neuroscience, May 4, 2011

Climate Researchers Urged To Use ‘Plain Language’

Climate scientists gathering at a conference on Arctic warming were asked Wednesday to explain the dramatic melting in the region in layman’s terms, the Associated Press (AP) reports.

An authoritative report released at the meeting in Copenhagen showed melting ice in the Arctic could result in global sea levels rising 5 feet within this century, much higher than previous forecasts.

James White of the University of Colorado at Boulder told fellow researchers to use plain language when describing their research to a general audience, since focusing on a report’s technical details can obscure the basic science. To put it bluntly, “if you put more greenhouse gases in the atmosphere, it will get warmer,” he said.

US climate scientist Robert Corell said it was important to reach out to all members of society to spread awareness of the Arctic melt and the impact it has on the whole world.

“Stop speaking in code. Rather than ‘anthropogenic,’ you could say ‘human-caused,’” Corell said at the conference of nearly 400 scientists.

The Arctic has been warming at twice the global average in recent decades, and the latest five-year period is the warmest since measurements began more than 100 years ago, according to the report by the Arctic Monitoring and Assessment Program.

The report highlighted “the need for greater urgency” in reversing global warming. But stalemates among nations over how to reduce emissions of carbon dioxide and other greenhouse gases have persisted for the past two decades.

Andrew Steer, envoy on climate change for the World Bank, said the new findings “are a case for great concern.” Rising sea levels will affect millions of people in both wealthy and poor countries, but would especially affect the poor, because “they tend to live in the lowest lying land and have the fewest resources to adapt,” he said.

Studies on the topic showed that the costs of major flooding events on infrastructure and the economy could easily soar into billions of dollars, Steer said.

“It is clear that we are not on track in the battle against climate change,” he said.

Ocean currents expert Bogi Hansen said one problem is that scientists can come across as unsure of their conclusions because they hesitate to report anything with 100 percent certainty.

White agreed. At a news conference later Wednesday, he told AP’s Karl Ritter that those opposed to reducing the use of fossil fuels “sow the seeds of doubt that give the people the impression that … unless every single one of us lines up behind an idea, that decisions can’t be taken.”

The AMAP report will be delivered to U.S. Secretary of State Hillary Clinton and the foreign ministers of Canada, Iceland, Norway, Denmark, Sweden, Finland and Russia, at an Arctic Council meeting in Greenland next week.


Controlling Brain Circuits With Light

F1000 Biology Reports takes a look at the story behind the invention of optogenetics

Commenting on Edward Boyden’s article, Ben Barres, Head of the Neuronal & Glial Cell Biology Section of Faculty of 1000 and Professor at Stanford University School of Medicine said: “There will probably be a Nobel prize for optogenetics someday as it has revolutionized our attempts to understand how the brain works. This article provides a fascinating insight into the birth of optogenetics and the roles of the major players.”

The invention of optogenetics literally sheds light on how our brains work. Published in the May 2011 issue of F1000 Biology Reports, Edward Boyden’s revealing article gives a unique perspective on the birth of optogenetic tools, new resources for analyzing and engineering brain circuits. These ‘tools’ take the form of genetically encoded molecules that, when targeted to specific neurons in the brain, enable their activity to be driven or silenced by light, thus revealing how entire neural circuits operate.
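
The logic of these tools can be caricatured in a few lines: only neurons engineered to express a light-sensitive molecule (an opsin) respond when the light is switched on. This is a conceptual sketch, not a model of real neural dynamics.

```python
def responds_to_light(expresses_opsin, light_on, opsin="excitatory"):
    """Hypothetical neuron: returns True if light drives it to fire."""
    if not (expresses_opsin and light_on):
        return False  # no opsin, or light off: the neuron is unaffected
    return opsin == "excitatory"  # excitatory opsins drive; inhibitory ones silence

print(responds_to_light(True, True))                # targeted neuron is driven
print(responds_to_light(False, True))               # untargeted neighbor ignores the light
print(responds_to_light(True, True, "inhibitory"))  # silencing opsin suppresses firing
```

The genetic targeting is what gives the technique its precision: the light floods a region, but only the opsin-expressing cells answer.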

By driving or quieting the activity of defined neurons embedded within an intact neural network, Boyden and his colleagues are able to determine which behaviors, neural computations or pathologies those neurons are sufficient to cause, and which brain functions or pathologies they are necessary for.

These tools are also being explored as components of neural control prosthetics capable of correcting neural circuit computations that have gone awry in brain disorders. Part of a systematic approach to neuroscience that is empowering new therapeutic strategies for neurological and psychiatric disorders, optogenetic tools are widely accepted as one of the technical advances of the decade, and could one day be used to treat neurological disorders such as Parkinson’s disease.

Using primary sources and his own experiences at Stanford, Boyden reconstructs a compelling case study of the development of optogenetic tools, providing an insight into the hard work and serendipity involved.


Massachusetts Courtroom Gets Media Upgrade

Starting today, Quincy District Court in Massachusetts is allowing laptops, iPads and smartphones in the courtroom, and is encouraging live blogging, Tweeting and Facebooking.

Court officials say this experiment will help establish suggested guidelines for courts as they grapple with how to use digital technology and how to accommodate citizen journalists and bloggers.

The pilot project is believed to be one of the broadest experiments in the country for using new media in the courts. 

The Quincy project is unusual because it will continuously stream live, unedited court proceedings all day. 

“In the past, reporters were the connection to the nation’s courts, but with the changes in the media landscape, there are just less and less journalists who are that bridge to the public,” John Davidow, executive producer of the “OpenCourt” project, told The Associated Press (AP).

“At the same time, there’s been the proliferation of reporting tools that are in the hands of all citizens, including iPhones and other smartphones that can record. People can Tweet, blog, report. The idea is to bring the courts and what goes on in the courts closer to the people so they understand how the law and the justice system work in this country,” he said.

The new tools are not widely embraced by the nation’s courts, where judges, jurors and lawyers are restricted in their use of digital technology and social media.

Jurors using portable electronic devices in some publicized cases have caused mistrials and overturned convictions. A judge in San Francisco dismissed 600 potential jurors after several acknowledged going online to research the criminal case they were called to consider.

The Quincy project is funded by a $250,000 grant through Knight News Challenge, which is a contest that encourages media innovation.

Davidow and others met regularly for months with court staff and lawyers to work out rules for the project.

Some defense attorneys and prosecutors in Massachusetts have not embraced the idea.

The court has held training sessions to show lawyers dead zones in the courtroom where they can have conversations that will not be picked up by microphones.

“I’m not overly fond of the idea,” Richard Sweeney, a Quincy defense attorney who regularly defends criminal clients in the courtroom, now newly wired, told AP.

“I think there are a lot of pitfalls. I understand and respect the concept: they want an open court. In this era of everyone having cellphones and videos, I can understand that, but it’s fraught with perils for attorneys, with conversations that can be picked up.”

Court officials around the U.S. are watching the Quincy experiment as they try to come up with policies on dealing with live streaming, citizen journalists and bloggers.

“There’s no firm national standard on how to do this,” Gregory Hurley, an analyst for the National Center for State Courts in Williamsburg, Va., told AP.

“I do think this is the wave of the future. More courts are going to want to experiment with this and see if they want to make this available to the public.”


Infants Grow Taller After A Good Nap

Increased bursts of sleep among infants are linked to infant growth spurts in body length, a study published in the journal Sleep found.

Instead of relying on parental recall of infant sleep patterns and growth, the study recorded real-time data over spans of four to 17 months. The 23 parents involved in the study recorded their infants’ daily sleep patterns, yielding 5,798 daily recordings for analysis. The study included 14 girls and nine boys, with a median age of 12 days at enrollment, all healthy at birth and free of colic or medical complications during their first year.

Mothers were asked to keep daily diaries detailing sleep onset and awakening, as well as noting whether their babies were breastfeeding, formula feeding, or both, and whether their infant showed signs of illness, such as vomiting, diarrhea, fever or rash.

Growth in body length was assessed using the maximum stretch technique, performed semi-weekly for 18 infants, daily for three infants and weekly for two infants. The monitoring lasted from four to 17 continuous months.

The results revealed that infants had irregular bursts of sleep, with total daily sleep duration increasing at irregular intervals by an average of 4.5 extra hours a day for two days.

Sleep episodes per day also increased in intermittent bursts of an average of three extra naps per day for two days, the study showed.

“These peaks in total daily sleep duration and number of sleep episodes were significantly associated with measurable growth spurts in body length, which tended to occur within 48 hours of the recorded bursts of sleep,” the study says.

The probability of a growth spurt increased by a median of 43% for every additional sleep episode and 29% for each additional hour of sleep, the study found after further analysis.

“The results demonstrate empirically that growth spurts not only occur during sleep but are significantly influenced by sleep,” says principal investigator and lead author Dr. Michelle Lampl, Samuel Candler Dobbs Professor in the department of anthropology at Emory University in Atlanta, Ga.

“Longer sleep corresponds with greater growth in body length.”

In addition, the study showed that the sex of the baby made a difference in sleep patterns relating to growth.

“Growth spurts were associated with increased sleep bout duration in boys compared with girls and increased number of sleep bouts in girls compared with boys,” says Lampl.

Boys, in general, had more and shorter sleep bouts than girls, but neither the sex of the infant nor breastfeeding had a significant effect on total daily sleep, the study says.

However, the study found that breastfeeding, as opposed to formula feeding, was associated with more and shorter sleep bouts.

Parents frustrated by varying and unpredictable infant sleep patterns can take comfort in the results, Lampl adds.

“Sleep irregularities can be distressing to parents,” says Lampl. “However, these findings give babies a voice that helps parents understand them and show that seemingly erratic sleep behavior is a normal part of development. Babies really aren’t trying to be difficult.”

The exact nature of the relationship between sleep biology and bone growth is still unclear, according to Lampl and co-author Michael Johnson, PhD, professor of pharmacology in the University of Virginia Health System.

But they do know that the secretion of growth hormone is known to increase after sleep onset and during the stage of slow wave sleep, which could help to stimulate bone growth. These hormonal signals could help support anecdotal reports of “growing pains,” the aching limbs that can wake children at night.

In some cases, Lampl and Johnson speculate, other parts of the body could be growing as well. The research also found that longer sleep bouts in both girls and boys predicted an increase in weight and body-fat composition tied to the increase in length, implying an anabolic process, or growth.

Even with a statistically significant link between bursts of sleep and growth spurts, the relationship was not absolute: the study noted that some sleep alterations occurred without a growth spurt, and that not every spurt was preceded by a burst of sleep.

Another new study being published this month found that infant head circumference grows in intermittent, episodic spurts, suggesting that sleep may be only one component of an integrated physiological system that underlies growth timing.

“It opens another door to understanding why we sleep,” Lampl said. “We now know that sleep is a contributing factor to growth spurts at the biological level.”

Mud Is More Than Cooling For Pigs

To find out what motivates pigs to frolic in the mud, a scientist in the Netherlands looked at the wallowing behavior of their wild relatives.

Marc Bracke from Wageningen University and Research Centre carried out the study that suggests a pig’s love of mud is not just a way to keep cool, but is vital for the animals’ well-being.

Pigs are known to wallow in order to keep cool because they do not have normal sweat glands to regulate their body temperature.

Bracke searched the scientific literature for evidence of what might motivate other animals to carry out similar behavior.

He looked at animals such as the hippo, a close relative of the pig, which spends much of its time in the water to keep cool, and deer, which roll around on the ground not to cool off but to “scent mark” and attract mates.

From these analyses, Bracke proposed that mud wallowing, similar to rolling, may be hardwired in pigs and could play a role in their reproduction.

“If so, wallowing could be an important element of a good life in pigs,” says Bracke.

Bracke also suggests that the wallowing behavior of pigs may have evolved from their ancient relatives.

“Pigs are genetically related to particularly water-loving animals such as hippos and whales,” Bracke says.

“It seems to me that this preference to be in shallow water could have been a turning point in the evolution of whales from land-dwelling mammals,” he adds.

Watering holes are ideal places for predators to ambush their prey, and for many animals wallowing would be dangerous.

“But pigs, like many carnivores, are relatively large animals with enlarged canine teeth, so they would be better able to fend off an attack,” says Bracke.

In conclusion, Bracke suggests the causality runs the other way: pigs “did not evolve functional sweat glands like other ungulates because they liked wallowing so much,” rather than wallowing because they lack functional sweat glands.

The study is published in the journal Applied Animal Behaviour Science.

Secondhand Smoke Can Lead To High Risk Of Stillbirth

A new study adds more evidence to the case that even secondhand smoke can harm unborn babies and could lead to a higher risk of having a stillbirth.

Canadian researchers also found that newborns weighed less and had smaller heads if their mothers were passive smokers.

“This information is important for women, their families and healthcare providers,” Dr. Joan Crane of Eastern Health in St. John’s and colleagues wrote in BJOG: An International Journal of Obstetrics and Gynaecology.

Secondhand smoke is thought to expose people to about one percent of the smoke that active smokers inhale.

According to the researchers, “undiluted side-stream smoke contains many harmful chemicals and in greater concentration than cigarette smoke inhaled through a filter.”

Those chemicals may harm the fetus by restricting blood flow and possibly damaging the placenta.

The researchers used a database of pregnant women from the Canadian provinces of Newfoundland and Labrador to shed light on the question.

They also looked at other birth outcomes, like head circumference.

Eleven percent of the 12,000 women in the database said they had been exposed to secondhand smoke.

The rate of stillbirth was 0.83 percent in passive smokers and 0.37 percent in women who did not breathe tobacco fumes.

The researchers also accounted for several risk factors, including age and the women’s drinking and drug habits. When these factors were accounted for, passive smokers had over three times the odds of stillbirth.
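As a back-of-the-envelope check on those figures, the crude (unadjusted) odds ratio can be computed directly from the two stillbirth rates reported above. The sketch below uses only the rates from the article; the study’s “over three times” figure is the adjusted estimate after controlling for age, drinking and drug use, so it is larger than this crude value.

```python
# Crude odds ratio from the two reported stillbirth rates.
# This is illustrative arithmetic, not the study's adjusted analysis.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

rate_exposed = 0.0083    # stillbirth rate among passive smokers (0.83%)
rate_unexposed = 0.0037  # rate among unexposed women (0.37%)

crude_or = odds(rate_exposed) / odds(rate_unexposed)
print(round(crude_or, 2))  # 2.25
```

The gap between this crude ratio (about 2.25) and the adjusted odds ratio (over 3) reflects the confounders the researchers controlled for.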

“This is huge,” Dr. Hamisu Salihu, an expert on stillbirth at the University of South Florida in Tampa, told Reuters. “We can now inform patients that exposure to secondhand smoke means they can lose their baby.”

The researchers also found that babies born to passive smokers weighed nearly 2 ounces less than babies whose mothers lived and worked in smoke-free households.

According to the study, the babies’ heads were an average of about 0.1 inch smaller as well.

“Policy makers should really take this matter seriously,” Salihu told Reuters. “We need to enact laws to protect these babies.”

Newer Meds Better Than Diuretics For High Blood Pressure?

An analysis of 15 past studies has revealed that when it comes to treating high blood pressure, patients may be more likely to stay on certain types of medications than others, researchers say.

Researchers found that, on average, people were less likely to remain on prescription diuretics than on relatively newer medications.

They found that patients were most likely to stick with angiotensin II receptor blockers, or ARBs — a group of drugs that includes names like valsartan (Diovan), candesartan (Atacand) and losartan (Cozaar).

ACE inhibitors had the second-highest adherence. These include ramipril (Altace), lisinopril (Prinivil, Zestril) and captopril (Capoten).

Diuretic users were about twice as likely to stop taking their medication as ARB users, researchers reported in the medical journal Circulation.

Dr. Ian M. Kronish of Mount Sinai School of Medicine in New York, leader of the new research, said that various classes of blood pressure medicines differ in how they work and in their side effects.

However, the research findings do not necessarily mean that a person will stick with an ARB longer, or that those drugs should be a first choice for treating high blood pressure, said Dr. Niteesh K. Choudhry, of Brigham and Women’s Hospital and Harvard Medical School in Boston, who was not involved in the study.

In an interview with Reuters, Choudhry said national guidelines recommend diuretics as a “first-line” medication for high blood pressure, based on clinical trials showing their effectiveness.

“To me, that data still reigns supreme,” said Choudhry, who wrote an editorial published with the study. “These findings don’t support abandoning what we’re doing now,” he added.

Kronish and his colleagues combined the results of 15 studies looking at people’s adherence to their blood pressure prescriptions. Most of those studies defined “adherence” according to whether people persistently filled their prescriptions over the course of the study, which was a year in most cases.

On average, 65 percent of ARB users adhered to their medication, versus 58 percent of ACE inhibitor users and 51 percent of diuretic users. Beta-blockers had the lowest adherence: 28 percent.

When other factors were weighed in — including patients’ age, race and income — both diuretic and beta-blocker users were nearly twice as likely to stop taking their medications as ARB users were.

Choudhry said it is reasonable to believe that diuretic users could be more likely to quit because of the medications themselves.

It is likely that some people are bothered by the frequent bathroom trips diuretics cause, since the drugs help rid the body of water and salt.

Also, Choudhry said diuretics are older blood pressure drugs, and some patients perceive them as “not as good” as newer ARBs and ACE inhibitors.

“But we know from large-scale studies that people do just as well on diuretics as they do on other drugs,” Choudhry said.

A decade-old clinical trial known as ALLHAT found that diuretics were more effective than other blood pressure medications at preventing heart failure, a finding that was key to making them the recommended first choice in the first place.

A range of factors, from price to potential side effects, can affect a person’s decision to adhere to any one type of blood pressure medication, said Choudhry. ARBs are more expensive than other drugs, while ACE inhibitors can cause a chronic cough.

But despite those factors, people should not stop taking medication on their own. “It’s very important to take the medication you are prescribed,” said Choudhry. “If you are having trouble, talk with your doctor.”

Concern Over ‘Excessive’ Doses Of Thyroid Drugs For Older Patients

Research: Levothyroxine dose and risk of fractures in older adults: Nested case-control study

Many older adults may be taking “excessive” doses of thyroid drugs, putting them at increased risk of fractures, finds a study published on bmj.com today.

The study raises concern that treatment targets may need to be modified in the elderly and that regular dose monitoring remains essential even into older age.

Levothyroxine is a synthetic form of thyroxine (thyroid hormone) and is widely used to treat an underactive thyroid gland (hypothyroidism).

Most hypothyroid patients are diagnosed in early or middle adulthood but, as people age, their thyroxine requirements fall. Although regular monitoring of patients on levothyroxine is recommended, doses often remain unchanged into old age.

This can lead to excess thyroid hormone levels (hyperthyroidism) which can increase the risk of fractures, particularly in older women.

Previous studies of the association between levothyroxine and fractures have had mixed results, so a team of researchers in Toronto, Canada set out to measure the effect of levothyroxine dose on the risk of fractures in older adults.

Using population-based data from Ontario, Canada, the study included 213,511 people aged 70 years or older with at least one levothyroxine prescription dispensed between April 1, 2002 and March 31, 2007. Hospital records were used to identify fractures and each case was matched with up to five controls from within the group who had not yet fractured.

Cases and controls were defined as current users, recent past users (discontinued 15-180 days prior to study) or remote users (discontinued more than 180 days prior to study) of levothyroxine.

A total of 22,236 (10.4%) individuals experienced at least one fracture during the study period.

Compared with remote use, current and recent past levothyroxine use was associated with a significantly higher fracture risk. Among current users, high and medium doses of levothyroxine were associated with a significantly higher risk of fractures compared with low dose levothyroxine.

Even after taking account of other fracture risk factors, a dose-related association was seen in both men and women, for hip fractures as well as for any fracture.

The authors conclude: “Our findings provide evidence that levothyroxine treatment may increase the risk of fragility fractures in older people even at conventional dosages, suggesting that closer monitoring and modification of treatment targets may be warranted in this vulnerable population.”

This view is supported in an accompanying editorial by Professor Graham Leese at Ninewells Hospital in Dundee, who warns that ideal thyroxine doses may vary with age and be unexpectedly low in elderly people.

It is 120 years since the effect of excess thyroid hormone on bone was first described, he writes, yet research in this area still lacks funding. “With the prevalence of treated hypothyroidism increasing, and the annual economic burden of fractures in the United Kingdom currently estimated at €5.8bn (£5.1bn; $8.4bn), such research warrants a higher priority.”

Pediatric Flu Vaccination: Understanding Low Acceptance Rates Could Help Increase Coverage

A study of H1N1 and seasonal influenza vaccination in a sample of black and Hispanic children in Atlanta found a low rate of vaccine acceptance among parents and caregivers. Only 36 percent of parents and caregivers indicated they would immunize children against H1N1, and 22 percent indicated their children received the seasonal influenza vaccine in the previous three months. The majority of children in the sample (71 percent) were from households with less than $40,000 in annual income.

Researchers say this low level of vaccine coverage and acceptance highlights the importance of understanding individual and community concerns that influence parents’ decisions to have their children vaccinated.

The study is published in the Vaccine Safety Supplement of the April issue of the journal Pediatrics.

Children aged six months through 18 years, and caregivers of children younger than six months, were among the stated high-priority groups for the 2009 H1N1 vaccine. More recently, the Advisory Committee on Immunization Practices (ACIP) has recommended that all persons older than six months be vaccinated annually against influenza.

The study found that parents who said they were concerned about influenza, were concerned about H1N1 disease, and had confidence in vaccines and their preventive abilities were more likely to accept vaccination.

Although income did not correlate with vaccine acceptance, parents without health insurance were more likely to say they would vaccinate their children against H1N1 than were parents with insurance. The authors speculated this is due to concern related to treatment cost among parents without insurance.

Safety issues were generally not cited as a factor influencing decisions, but a perceived greater risk of exposure and illness for children from the H1N1 virus was cited as a reason by those accepting vaccination. Other factors contributing to acceptance included a lack of confidence that hand washing, masks and quarantine were as effective as the H1N1 vaccine at prevention, and a desire to promote influenza vaccination in the community.

“The well-publicized risks to children of contracting the H1N1 virus may have outstripped vaccine safety concerns in this case,” notes lead author Paula Frew, PhD. “This shows that more comprehensive education of minority parents with regard to disease risk may provide a boost to vaccination rates.” Frew is assistant professor of medicine and director of community research in Emory University School of Medicine.

“Physicians have a central community leadership role in educating parents about the importance of influenza vaccination…” the authors write. “Moreover, our study results show parental confidence in health departments to provide influenza vaccination compared with other community-based venues.”

“Physician support of vaccination can help increase vaccine coverage, and community health departments are ideal locations for vaccine administration,” says Frew.

Jump In Communication Skills Led To Species Explosion In Electric Fishes

A novel way to ramp up biodiversity

Bruce Carlson stands next to a fish tank in his lab, holding a putty-colored Radio Shack amplifier connected to two wires whose insulation has been stripped. At the bottom of the tank, a nondescript little fish lurks in a sawed-off section of PVC pipe.

Carlson sticks the two bare wires into the tank. Suddenly we hear a rapid-fire pop, pop, pop, pop, pop, pop. The pops, which are surprisingly loud, sound rather like the static on an old-fashioned tube radio tuned between stations.

“When there are many fish in a tank,” Carlson says, “it sounds like a frying pan.”

Carlson, PhD, assistant professor of biology in Arts & Sciences, is studying the African family of weakly electric fishes called the Mormyridae, or mormyrids.

Each fish in this family has an electric signal distinctive not only to its species, but also to its sex, dominance status and even its individual identity.

The shape of the discharge is the fish’s “face,” says Carlson. “It’s how they recognize one another.”

The sensory pathway that detects and analyzes the electric discharges had been well studied in the Mormyridae, but only in two or three species, Carlson says, and the family has more than 200. Given this diversity, Carlson asked whether changes in electrical communication might have influenced rates of speciation.

Three anatomical advances underlie the ability to send and receive diverse electrical signals: cells able to produce different discharges, a global distribution of the sensors that detect the discharges’ shape, and a more complex signal-processing area of the brain to analyze them.

In 2008 the National Science Foundation awarded Carlson a grant to travel to Gabon (where many mormyrid species are found) to study the mormyrid brain, and how brain anatomy maps onto the evolutionary tree of the fishes.

His team found that changes in brain anatomy and the resulting ability to fully exploit electric signal space did indeed lead to rapid speciation, a result published in the April 29 issue of Science.

The electric organ

Each pop is one discharge of an electric organ located at the base of a fish’s tail. The organs consist of stacks of disk-like cells called electrocytes, “pretty much like watch batteries in series,” says Carlson.

The electrocytes all fire action potentials simultaneously, and their tiny action potentials sum to produce a discharge of typically a few volts.
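The watch-battery analogy can be made concrete with a toy calculation: cells stacked in series add their voltages. The per-cell potential and the cell count below are illustrative assumptions, not measurements from the study.

```python
# Toy model of an electric organ discharge: electrocytes stacked in
# series sum their action potentials, like batteries in series.
# Per-cell voltage and cell count are invented for illustration.

n_electrocytes = 40   # number of disk-like cells in the stack (assumed)
v_per_cell = 0.1      # volts per electrocyte action potential (assumed)

discharge_v = n_electrocytes * v_per_cell
print(f"summed discharge: {discharge_v:.1f} V")  # 4.0 V, i.e. "a few volts"
```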

“These signals don’t propagate as electromagnetic waves,” Carlson explains. “Instead they exist as an electrostatic field, just like you’d get by sticking a battery in the water.

“That’s why these fish are so good at recognizing pulses with different shapes,” he says. Waves are distorted during transmission, so that their fine temporal structure is smeared.

“The discharges are not distorted. They get weaker with distance, but their temporal structure stays the same. That’s one reason mormyrids evolved to be exquisitely sensitive to small timing differences in electric signals,” Carlson explains.

Detecting the pulse

Weakly electric fish have several types of electroreceptors but the ones important for communication are called knollenorgans, from the German word “Knolle,” or tuber, because they consist of bulbous cells buried just under the fish’s skin.

The knollenorgans respond to a voltage rise, firing a time-locked spike in response to outside positive-going voltage changes.

The knollenorgans on one side of a fish’s body respond to the start of a discharge and those on the opposite side respond to the end of a discharge. This lets a fish recognize a species-specific discharge by comparing the intervals between spikes coming from opposite sides of its body.

The spike time comparison occurs within the central nervous system, in a part of the brain called the extero-lateral nucleus, or EL.
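The interval comparison described above can be sketched as a toy decoder. A knollenorgan fires a time-locked spike on a positive-going voltage rise; receptors on the far side of the body see the field inverted, so they fire at the pulse’s end, and the interval between the two spikes recovers the pulse duration. The waveform, sampling and thresholds here are invented for illustration, not taken from the study.

```python
# Toy sketch: recover a discharge's duration from spike timing on
# opposite sides of the body. All numbers below are illustrative.

def first_rise(signal, threshold):
    """Index of the first sample where the signal rises past threshold."""
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i]:
            return i
    return None

pulse = [0, 0, 1, 1, 1, 0, 0]     # a square-ish discharge, 3 samples long
inverted = [-x for x in pulse]    # the field as seen from the opposite side

onset_spike = first_rise(pulse, 0.5)       # near side fires at pulse onset
offset_spike = first_rise(inverted, -0.5)  # far side fires on the rebound rise

duration = offset_spike - onset_spike
print(duration)  # 3 samples, matching the pulse duration
```

Comparing intervals like this, rather than the pulses’ amplitudes, is what lets the fish recognize a species-specific discharge shape even as the signal weakens with distance.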

Signal processing

When we began our work, says Carlson, the “standard anatomy” for the mormyrid brain, what you’d find if you looked in a textbook, was a two-part EL, with separate nuclei, or clumps of cells, in its anterior and posterior portions.

“We collected lots of brains in Gabon, and two collaborators, Saad Hasan, a former undergraduate at Washington University, who is now a medical student at Cornell, and Derek Miller, who is an undergraduate at Washington University, did all the histology on the brains.

“In addition to the standard anatomy, we were amazed to see another anatomy, where the EL is substantially smaller and not split into two portions.

“All the fish we looked at either had the large EL that was divided into anterior and posterior halves, or they had the small undifferentiated EL.”

Working with collaborator Matthew Arnegard, PhD, a postdoctoral fellow at the Fred Hutchinson Cancer Research Center in Seattle, WA, the scientists mapped the brain anatomy onto a phylogenetic tree (an evolutionary tree based on the similarity of DNA sequences), and they could see that there were two equally parsimonious ways to reconstruct the fishes’ evolutionary history.

Either the complex brain was ancestral and the simpler brain evolved twice or the simpler brain was ancestral and the complex brain arose twice. To solve this riddle, they did what evolutionary biologists do, which is look at the “next outgroup member,” the closest related fish that’s not part of the Mormyridae family.

This fish has an area in the midbrain similar to a small, undifferentiated EL. This suggested the small EL was probably the ancestral form and that the more complex, divided EL (anterior ELa plus posterior ELp) evolved twice, once within the subfamily Mormyrinae and once within the subfamily Petrocephalinae.

Did fancy anatomy lead to rapid diversification?

If a communication system is to promote species diversity it must have both the capacity to create new signals (flexible stalk morphology) and the ability to distinguish those new signals from other signals (the broad distribution of knollenorgans and the complex brain).

“The only fish that have all three is a group of mormyrids we ended up calling Clade A for simplicity’s sake,” Carlson says.

“To test the importance of these traits for signal divergence, we analyzed the discharges of fish collected in two locales: the Ivindo River of Gabon, home to the largest known assemblage of the subfamily Mormyrinae, and Odzala National Park in the Republic of the Congo, home to the largest known assemblage of the subfamily Petrocephalinae.

“Statistical analysis showed us that the rate of signal divergence in Clade A was 10 times higher than among other fish within the Mormyridae,” Carlson says. Further analysis by collaborator Luke Harmon, PhD, assistant professor of biology at the University of Idaho, revealed that the number of species in clade A has been increasing three to five times faster than the number of species in other mormyrid lineages.

In other words, the fancier the fish’s communication kit, the more likely it was to come up with new electric discharges, and new species that identified one another by those discharges.

Putting it to the test

It all worked out statistically and logically, but was it what the fish actually experienced?

“After all,” says Carlson, “this sensory world is totally foreign to us. I’ve worked with these fish a long time, so I can tell a few of the discharges apart by ear. But for the most part I need an oscilloscope to see the differences.

Can the Clade A fish tell the difference between discharges? To test them, Carlson ran behavioral playback experiments on fish caught in Gabon.

“A fish would be going pop, pop, pop and we’d pulse it. Depending on the fish, it would either discharge more rapidly, brrrrrrrrr, or stop discharging altogether.

“But if we repeated the stimulus again and again, the fish would stop responding. Once it stopped responding, we hit it with a phase-shifted version of the same pulse. If the fish could tell the difference, the discharge rate or pause duration would increase. If it couldn’t tell the difference, there would be no change.”

The experiments showed that mormyrid fish in Clade A were able to distinguish among pulses, but other mormyrids (those with the EL brain) were not.

Did the evolution of a fancy signal-processing brain drive speciation in the Mormyridae? “It’s always difficult with evolutionary studies to say that any one trait is the cause or the trigger for another,” Carlson says. “But in this case we were able to show that the complex signal-processing brain evolved before a burst of speciation, that signal variation was higher among fishes with that brain, and that these fishes could distinguish among subtly different pulses, whereas others could not.

“Together it adds up to a strong case for brain evolution triggering increased diversification.”

Mystery Solved: How Sickle Hemoglobin Protects Against Malaria

Researchers unravel the molecular mechanism whereby sickle cell hemoglobin confers a survival advantage against malaria

The latest issue of the journal Cell carries an article that is likely to help solve one of the long-standing mysteries of biomedicine. In a study that challenges currently held views, researchers at the Instituto Gulbenkian de Ciência (IGC), in Portugal, unravel the molecular mechanism whereby sickle cell hemoglobin confers a survival advantage against malaria, the disease caused by Plasmodium infection. These findings, by the research team led by Miguel P. Soares, open the way to new therapeutic interventions against malaria, a disease that continues to inflict tremendous medical, social and economic burdens on a large proportion of the human population.

Sickle cell anemia is a blood disease in which red blood cells reveal an abnormal crescent (or sickle) shape when observed under a conventional microscope. It is an inherited disorder, the first ever to be attributed to a specific genetic modification (mutation), in 1949 by Linus Pauling (two-time Nobel laureate, for Chemistry in 1954 and for Peace in 1962). The cause of sickle cell anemia was attributed unequivocally to a single base substitution in the DNA sequence of the gene encoding the beta chain of hemoglobin, the protein that carries oxygen in red blood cells.

Only those individuals that inherit two copies of the sickle mutation (one from the mother and one from the father) develop sickle cell anemia. If untreated, these individuals have a shorter than normal life expectancy, so the mutation would be expected to be rare in human populations. This is, however, far from the case. Observations made during the mid-20th century, building on Pauling’s findings, revealed that the sickle mutation is in fact highly selected in populations from areas of the world where malaria is very frequent, with sometimes 10-40% of the population carrying it. Individuals carrying just one copy of the sickle mutation (inherited from either the father or the mother) were known not to develop sickle cell anemia, leading rather normal lives. However, these same individuals, said to carry the sickle cell trait, were found to be highly protected against malaria, explaining the high prevalence of the mutation in geographical areas where malaria is endemic.

These findings led to the widespread belief in the medical community that understanding the mechanism whereby sickle cell trait protects against malaria would provide critical insight into developing a treatment or possible cure for this devastating disease, responsible for over a million premature deaths in sub-Saharan Africa. Despite several decades of research, the mechanism underlying this protective effect remained elusive. Until now.

Several studies suggested that, in one way or another, sickle hemoglobin might get in the way of the Plasmodium parasite infecting red blood cells, reducing the number of parasites that actually infect the host and thus conferring some protection against the disease. The IGC team’s results challenge this explanation.

In painstakingly detailed work, Ana Ferreira, a post-doctoral researcher in Miguel Soares’ laboratory, demonstrated that mice obtained from Prof. Yves Beuzard’s laboratory, genetically engineered to produce one copy of sickle hemoglobin (mimicking the sickle cell trait), do not succumb to cerebral malaria, thus reproducing what happens in humans.

When Prof. Ingo Bechman observed the brains of these mice, he confirmed that the lesions associated with the development of cerebral malaria were absent, despite the presence of the parasite.

Ana Ferreira went on to show that the protection afforded by sickle hemoglobin in these mice acts without interfering directly with the parasite’s ability to infect the host’s red blood cells. As Miguel Soares describes it, “sickle hemoglobin makes the host tolerant to the parasite.”

Through a series of genetic experiments, Ana Ferreira was able to show that the main player in this protective effect is heme oxygenase-1 (HO-1), an enzyme whose expression is strongly induced by sickle hemoglobin. This enzyme, which produces the gas carbon monoxide, had previously been shown by Miguel Soares’ laboratory to confer protection against cerebral malaria. In the process of dissecting this mechanism further, Ana Ferreira demonstrated that, when produced in response to sickle hemoglobin, the same gas, carbon monoxide, protected the infected host from succumbing to cerebral malaria without interfering with the life cycle of the parasite inside its red blood cells.

Miguel Soares and his team believe that the mechanism they have identified for sickle cell trait may be a general one, acting in other red blood cell genetic diseases that are also known to protect against malaria in human populations: “Due to its protective effect against malaria, the sickle mutation may have been naturally selected in sub-Saharan Africa, where malaria is endemic and one of the major causes of death. Similarly, other clinically silent mutations may have been selected throughout evolution for their ability to provide a survival advantage against Plasmodium infection.”

Labor Induction

Labor induction is the process of artificially starting birth through medical intervention or other methods. When an induction is not performed for emergency or other medical reasons, it is considered elective. The decision to induce labor has become more common in recent years because of its convenience and its ability to accommodate busy schedules.

The American College of Obstetricians and Gynecologists, however, says that labor should only be induced when it is riskier for the baby to remain in the mother’s uterus than to be born.

There are several reasons why labor induction should be performed. These include:

-Pregnancy lasting more than 42 weeks. After 42 weeks, the placenta may no longer function well enough for the baby to receive adequate nutrition and oxygen.
-Pregnancy lasting more than 38 weeks when having twins.
-The pregnant woman has high blood pressure caused by pregnancy.
-The pregnant woman has an infection in her womb.
-The woman’s water has broken, but contractions have not begun.
-The woman has health problems, such as diabetes.
-There are health risks to the woman if pregnancy is continued.
-A growth problem is causing the baby to be too small or too big.
-Intrauterine fetal growth retardation (IUGR).
-Premature rupture of the membranes (PROM). This occurs when the membranes have ruptured, but labor does not start within a certain amount of time.
-Premature termination or abortion.
-Fetal death.

If an induction causes complications, a Caesarean section is almost always performed instead. An induction is most likely to succeed when a woman is close to or in the early stages of labor. Signs of impending labor may include softening of the cervix, dilation and increasing frequency or intensity of contractions. The Bishop score may be used to assess how suitable induction would be.

The Bishop score, which is also used to assess the odds of spontaneous preterm delivery, grades patients who would be most likely to achieve a successful induction. The duration of labor is inversely correlated with the Bishop score; a score that exceeds 8 describes the patient most likely to achieve a successful vaginal birth. Bishop scores of less than 6 usually require that a cervical ripening method be used before other methods.
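As a rough illustration, the thresholds described above can be expressed as a simple lookup. This is a sketch for clarity only, not clinical guidance; the function name and labels are invented here, and a real Bishop assessment scores five cervical and fetal factors before producing the total.

```python
# Illustrative only: map a total Bishop score to the induction outlook
# described in the text (score > 8 favorable, score < 6 needs ripening).

def interpret_bishop_score(score: int) -> str:
    """Return the induction outlook for a total Bishop score (0-13)."""
    if score > 8:
        return "favorable: successful vaginal birth likely after induction"
    elif score < 6:
        return "unfavorable: cervical ripening usually needed first"
    else:
        return "intermediate: clinical judgment required"

print(interpret_bishop_score(9))   # favorable outlook
print(interpret_bishop_score(4))   # ripening usually needed
```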

Use of medication is a common method in labor induction.

-Intravaginal, endocervical or extra-amniotic administration of prostaglandin, such as dinoprostone or misoprostol. Extra-amniotic administration has appeared to be more efficient than intravaginal or endocervical administration in the few controlled studies that have been done.
-Intravenous administration of synthetic oxytocin preparations, such as Pitocin.
-Natural Induction. Natural induction includes the use of herbs, castor oil or other medically unconventional agents to stimulate or advance a stalled labor.
-Mifepristone use has been described.
-Relaxin has been studied, but is not a commonly used medication.

There are also other processes and methods for inducing labor besides the use of medication.

-Stripping the membranes (separating the amniotic sac from the wall of the uterus): The amniotic sac is the lining inside the uterus that contains the baby. The doctor gently puts a gloved finger through the woman’s cervix. Using the finger, the doctor separates the sac from the uterine wall. The woman may feel some cramping or spotting with this method.
-Ripening the cervix: The doctor places a small tablet or suppository in the vagina up against the cervix. This helps to soften and thin the cervix. After receiving the suppository, the woman may start to have gentle contractions.
-Nipple Stimulation: This is a natural form of labor induction that can be done manually or with an electric breast pump. The hormone oxytocin is naturally produced, causing contractions. The concept works the same as when a baby nurses right after birth, stimulating contractions, which slows bleeding.
-Artificial rupture of the membrane (AROM): When the amniotic sac breaks or ruptures, production of the hormone prostaglandin increases, speeding up contractions. A doctor may suggest rupturing the amniotic sac artificially. A sterile, plastic hook is brushed against the membrane just inside the cervix. The baby’s head will move down against the cervix, which usually causes the contractions to become stronger. This method releases a gush of warm amniotic fluid from the vagina.

AROM has advantages and disadvantages.

Advantages include shortening labor by an hour or so, allowing the amniotic fluid to be examined for the presence of meconium (which can be a sign of fetal distress), and giving doctors direct access to the baby’s scalp to monitor heart rate.

Disadvantages include the possibility of the baby turning to a breech position if the membranes are ruptured before the baby’s head is engaged, making birth more difficult, and the possibility of the umbilical cord slipping out before the baby. Infection can occur if too much time passes between the rupture and the birth.

When to Induce

Until recently, the most common practice has been to induce labor by the end of the 42nd week of pregnancy. While this practice is still very common, recent studies have shown an increasing risk of infant mortality for births in the 41st and 42nd weeks of gestation, as well as a higher risk of injury to the mother and child. The recommended date for induction of labor has now been moved to the end of the 41st week of gestation in many countries, including Canada and Sweden.

Risks of Induction

Like any medical procedure, labor induction has potential side effects and health risks to both the mother and the child. Some common ones include:
-Oxytocin can make contractions quite strong and lower the baby’s heart rate. Throughout the induction process, it is important for the baby’s heart rate to be monitored. Adjusting the dose of the drug can control the strength of the contractions and lessen the effect on the baby’s heart.
-Women who have inductions are at an increased risk of having an infection, and so are their babies.
-The umbilical cord may slip out into the vagina before the baby does. This is more likely to occur if the baby is breech. Also, the cord may become compressed, decreasing the baby’s oxygen supply.
-The treatment may not work, and the mother may need an emergency cesarean delivery.

A less common complication with induction is uterine rupture, which can cause severe bleeding. Women who have previously had a C-section are at an increased risk of uterine rupture, as cesarean deliveries leave a scar in the uterus.

There is also a risk of babies being born “late preterm.” Inductions may contribute to the growing number of “late preterm” births between 34 and 36 weeks gestation. While babies born at this time are usually considered healthy, they are more likely to have medical problems than babies born a few weeks later at full term (37-42 weeks).

While induction has risks, it is sometimes needed to protect the health of the mother and the baby. The pregnant woman needs to understand both benefits and risks of labor induction.

In most cases, labor induction goes well, and the woman can deliver her baby through the birth canal normally. An induction can take anywhere from two or three hours to as long as two or three days, depending on how the woman’s body responds to the treatment she is receiving. An induction may take longer if the woman is pregnant for the first time or if the baby is not full term.

Every pregnancy is different. Having an induction is not a sign of failure, and it may be the best thing for the health of both the baby and the mother. Medicines used for inducing labor may upset a woman’s stomach, so it is normally recommended that she eat lightly before going to the hospital; foods such as Jell-O and soup are good choices. Medicines may also cause strong contractions. The woman should know that she can always ask for help with her pain.

As induced labor tends to be more intense and painful for women, it can lead to increased use of analgesics and other pain-relieving medications. These medications have been said to increase the likelihood that the pregnancy will end in a cesarean delivery.

However, some studies into the issue indicate that labor induction has no effect on the rates of cesarean deliveries. Two recent studies have shown that induction may increase the risk of C-section if performed before the 40th week of gestation, but has no effect, or actually lowers the risk, if performed after the 40th week.

At least one study has indicated that cesarean delivery rates increase with induction. Research published in the Journal of Perinatal and Neonatal Nursing showed induction increased a woman’s likelihood of having a C-section by two to three times.

Versatility Of Stem Cells Controlled By Alliances, Competitions Of Proteins

Like people with a big choice to make, stem cells have a process to “decide” whether to transform into a specific cell type or to stay flexible, a state that biologists call “pluripotency.” Using a technology he invented, Brown researcher William Fairbrother and colleagues have discovered new molecular interactions in the process that will help regenerative medicine researchers better understand pluripotency.

In a paper published in advance online in the journal Genome Research, Fairbrother’s team showed that different proteins called transcription factors compete and cooperate in the cells to produce complex bindings along crucial sequences of DNA. This game of molecular “capture the flag,” played in teams and amid shifting alliances, appears to be a necessary part of what determines whether stem cells retain their pluripotency and whether specialized, or differentiated, cells can regain it.

In recent years scientists have reported spectacular successes in turning fully differentiated cells back into pluripotent stem cells, a process called reprogramming. But the animals derived from these cells often suffer higher rates of tumors and other problems, Fairbrother said. The reason may be because the complex details of the reprogramming process haven’t been fully understood. He said there are many misconceptions about how reprogramming transcription factors interact with DNA.

“Most people think of a protein binding to DNA as a single, surgical thing where you have this isolated binding event,” Fairbrother said. “But in fact we show that sometimes these binding events occur over hundreds of nucleotides, so they seem more like great greasy globs of proteins that are forming. In addition, the proteins interact with each other, diversifying their function by appearing in complexes with different partners at different places.”

By employing a high-throughput, high-resolution binding assay that he’s dubbed MEGAShift, Fairbrother and his colleagues, who include pathology researchers from the University of Utah School of Medicine, were able to analyze the interactions of several key transcription factors across a 316,000-letter region of DNA with a resolution as fine as 10 base pairs. Through hundreds of thousands of array measurements, lead authors Luciana Ferraris and Allan Stewart, along with Fairbrother, Alec DeSimone and the other authors, uncovered previously unseen patterns of protein interactions.

“How do stem cells stay in the state where they can keep their options open?” Fairbrother said. “A key player is POU5F1. But what are the key players that could interact with it and modulate its function? We’ve developed technology to look at that question.”

One of several findings in the paper concerned POU5F1 and its archrival, POU2F1, which binds to exactly the same eight-letter DNA sequence. Which protein binds to the sequence first influences whether a stem cell specializes or remains pluripotent. Experiments showed that a determining factor was a third protein called SOX2. SOX2 helped both proteins bind, but it helps POU2F1 more than POU5F1. In contrast, the team found that another player, NANOG, exclusively helps POU5F1.

“Who binds next to a protein is a determinant of who ends up binding to a sequence,” Fairbrother said.

With support from the National Institutes of Health, Fairbrother’s group is also applying MEGAShift to other questions, including how protein-protein interactions affect the formation of RNA-protein complexes, which can be even more complicated than binding DNA.

They will also look at the problem of narrowing the field of hundreds of genomic sequence variations that exist naturally in the population down to the real genetic “causal variants” of disease risk. MEGAShift can sort through which variants associated with disease result in an altered binding event that results in a clinical manifestation, such as diabetes or lupus.


Five-minute Screen Can Catch Signs Of Autism

A checklist given to parents to fill out in pediatrician waiting rooms may help early detection of autism spectrum disorder (ASD), according to a study published in the Journal of Pediatrics.

Funded by the National Institutes of Health (NIH), the five-minute questionnaire identifies children with autism at an early age to allow them to start treatment sooner, which can greatly improve their development and learning.

“Beyond this exciting proof of concept, such a screening program would answer parents’ concerns about their child’s possible ASD symptoms earlier and with more confidence than has ever been done before,” notes Thomas R. Insel, M.D., director of the National Institute of Mental Health (NIMH), part of NIH.

The study also found that a significant delay exists between the time parents first report concerns about their child’s behavior and the ASD diagnosis. Some children do not receive a diagnosis until well after starting school.

“There is extensive evidence that early therapy can have a positive impact on the developing brain,” says Karen Pierce, PhD, assistant professor in the UC San Diego Department of Neurosciences. “The opportunity to diagnose and thus begin treatment for autism around a child’s first birthday has enormous potential to change outcomes for children affected with the disorder.”

Researchers at the UC San Diego Autism Center of Excellence (ACE), led by Dr. Pierce, gathered together 137 pediatricians in the San Diego area to help initiate a systematic screen program for all infants at their one-year checkup.

More than 10,000 one-year-old infants were screened with the new approach. Parents were given a brief questionnaire called the Communication and Symbolic Behavior Scales Developmental Profile Infant-Toddler Checklist, which asked questions about a child’s use of eye contact, sounds, words, gestures, object recognition and other forms of age-appropriate communication.

Of these infants, 184 failed the initial screening and were referred to the ACE for further testing and re-evaluation every six months until they turned 3 years old.

So far, of the 184, the study has reported that 32 children have received a provisional or final diagnosis of ASD, 56 have been diagnosed with language delay, 9 with developmental delay and 36 with “other.” Using the simple five-minute screening technique, an accurate diagnosis was given 75% of the time.

The screening allowed toddlers with ASD or developmental delay, as well as 89% of those with language delay, to be referred for behavioral therapy by an average age of 19 months, says the study.

In comparison, the average age of ASD diagnosis found in a 2009 study conducted by the Centers for Disease Control and Prevention was around 5.7 years, and treatment did not start until sometime later, the press release cited.

In addition to tracking infant outcomes, researchers in the study also surveyed the participating pediatricians.

“When we started giving parents the survey, I found that they listened more carefully to what I had to share with them and paid more attention to their child’s development,” says pediatrician Chrystal E. de Freitas, MD, FAAP, who participated in the study.

“In addition to giving me the opportunity to do a more thorough evaluation, it allowed parents time to process the information that their child might have a developmental delay or autism, a message no parent wants to hear. But by addressing these concerns early, the child can begin therapy that much sooner.”

After the study, 96% of the pediatricians rated the program positively, and 100% have continued to use the screening tool, the study reports.

Dr. Pierce points out that “Given lack of universal screening of infants for such disorders at 12 months, this program could be adopted by any pediatric office, at virtually no cost, to aid in the identification of children with developmental delays.”

And she says, “Importantly, parents will be able to get help for their children at a much earlier age than before.”

The researchers say that future studies are needed to further validate and refine the screening tool, track children until a much older age and assess any barriers to treatment follow ups.


Tired Brains Take Short ‘Naps’ While Awake

A new study suggests that parts of the brain can actually fall asleep for a fraction of a second when tired, despite the fact that the organ as a whole is awake at the time.

The implications of the study, which was conducted on rats, are far-reaching, particularly for people performing tasks where sleep deprivation could be dangerous, the scientists said.

“Even before you feel fatigued, there are signs in the brain that you should stop certain activities that may require alertness,” said psychiatry professor Chiara Cirelli at University of Wisconsin at Madison.

The researchers found that the more the rats were sleep-deprived, the more some of their neurons took mini-naps, with consequent declines in task performance.  Even though the animals were awake and active, brainwave measures revealed that scattered groups of neurons in the thinking part of their brain, or cortex, briefly fell asleep.

“Such tired neurons in an awake brain may be responsible for the attention lapses, poor judgment, mistake-proneness and irritability that we experience when we haven’t had enough sleep, yet don’t feel particularly sleepy,” said Dr. Giulio Tononi of the University of Wisconsin at Madison.

“Strikingly, in the sleep-deprived brain, subsets of neurons go offline in one cortex area but not in another, or even in one part of an area and not in another.”

Previous studies had suggested that such local snoozing with prolonged wakefulness might be occurring, but little was known about how the underlying neuronal activity might be changing.

To further study the issue, Tononi and colleagues tracked electrical activity at multiple sites in the cortex as they kept rats awake for several hours. They put novel objects into their cages (colorful balls, boxes, tubes and odorous nesting material from other rats) and found that the sleepier the rats became, the more the cortex neurons switched off, seemingly randomly, in various localities.

These tired neurons’ electrical profiles resembled those of neurons throughout the cortex during NREM, or slow wave, sleep.  However, the rats’ overall EEG, a measure of brain electrical activity at the scalp, along with the rats’ behavior, confirmed they were indeed awake.

Neuronal tiredness differs from more overt microsleep that is sometimes experienced with prolonged wakefulness.  Instead, neuronal tiredness is more analogous to local lapses seen in some forms of epilepsy, the researchers said.

However subtle, having tired neurons did interfere with task performance. If neurons switched off in the motor cortex within a split second before a rat tried to reach for a sugar pellet, its likelihood of success decreased by 37.5 percent. The overall number of such misses also increased significantly with prolonged wakefulness. This suggests that tired neurons, and the accompanying increases in slow wave activity, might account for some of the impaired performance of sleep-deprived people who may seem behaviorally and subjectively awake.

The pattern of neuron subsets going offline with longer wakefulness is, in many ways, the mirror image of the progressive changes that occur during recovery sleep following a period of sleep deprivation.

Tononi suggests that both serve to maintain equilibrium, and are parts of the compensatory mechanisms that regulate sleep need. Just as sleep deprivation produces a brain-wide state of instability, it may also trigger local instability in the cortex, possibly by depleting levels of brain chemical messengers. So, tired neurons might nod off as part of an energy-saving or restorative process for overloaded neuronal connections.

“Research suggests that sleep deprivation during adolescence may have adverse emotional and cognitive consequences that could affect brain development,” said NIMH Director Dr. Thomas Insel.

“The broader line of studies to which this belongs is, in part, considering changes in sleep patterns of the developing brain as a potential index of the health of neural connections that can begin to go awry during the critical transition from childhood to the teen years.”

The researchers report their findings online in the April 28, 2011 issue of the journal Nature.


Standardized Heart Attack Treatments Improve Cardio Health

An increasing number of heart attack victims survive longer when doctors follow guidelines for treating patients after a heart attack, according to a new study from Sweden published in the Journal of the American Medical Association (JAMA), reports Reuters Health.

“Things that we test in clinical trials do work in real life (and) make huge impacts on mortality,” said Dr. Debabrata Mukherjee, a cardiologist at the Texas Tech University Health Sciences Center in El Paso who wrote a commentary published with the study in JAMA.

Despite better adherence to guidelines, there is still room for improvement, both in Sweden and abroad, the researchers said.

In the current study, Dr. Tomas Jernberg of the Karolinska Institutet in Stockholm and his colleagues consulted a database of over 61,000 patients treated in Sweden between 1996 and 2007 for a certain kind of heart attack that damages much of the heart muscle.

Sweden introduced new guidelines during that period, outlining the best way to treat patients who had just had a heart attack. Over those 12 years, treatments that have been proven to help heart attack patients, including prescribing drugs to break up blood clots and procedures to open arteries, became commonplace.

Statins, for example, were prescribed to 23 percent of heart attack patients in 1996 and 1997, compared to 83 percent in 2006 and 2007. Only 12 percent of patients underwent an angioplasty after a heart attack at the beginning of the study, compared to 61 percent by the end.

In the study’s later years, fewer patients died after having a heart attack and the chance of dying in the year following a heart attack dropped from 21 percent to 13 percent.
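For context, the reported drop from 21 percent to 13 percent can be read two ways, as an absolute change in percentage points or as a relative reduction. The figures below are a quick illustrative calculation from the numbers in the article, not from the study itself:

```python
# Illustrative arithmetic on the reported one-year mortality figures:
# 21% at the start of the study period, 13% by the end.

before, after = 0.21, 0.13

absolute_drop = before - after            # change in percentage points
relative_drop = (before - after) / before # proportional reduction

print(f"Absolute reduction: {absolute_drop * 100:.0f} percentage points")
print(f"Relative reduction: {relative_drop:.0%}")
```

So the 8-percentage-point fall corresponds to roughly a 38 percent relative reduction in the chance of dying in the year after a heart attack.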

Standardizing treatments in all hospitals has been slow to take hold however, Jernberg explains. “There are variations (in) how quickly they adopt new treatments. These variations are not as large as in previous years, but there is still room for improvement.”

Mukherjee tells Reuters Health that the story is similar in the United States: hospitals understand the positive results of standardizing treatments, but not everyone is fully on board with implementing those changes. “We’ve certainly gotten better, (but) we’re not at 100 percent. And even now, hospitals differ.”

Patients can help ensure they get the best treatment after a heart attack by educating themselves on different procedures and medications, and by understanding that treatment continues even after they leave the hospital.

“We see a lot of problems with patients who stop taking their medication when they start feeling better but those medications need to be taken long-term to prevent future heart problems.”

Researchers also proposed quicker feedback for doctors and “decision support” – which requires doctors to answer questions about the treatment they are giving a patient as they give it – to improve adherence to guidelines.

Improvements in treatment need to happen faster, said Dr. Eric Peterson, a cardiologist at the Duke University Medical Center in Durham, North Carolina, who is also an editor at JAMA. “The bad news is that was a 12-year period of time to do something that we needed to do,” Peterson told Reuters Health of the new Swedish findings.

“That’s way too long. Think about how many patients along the way … didn’t get treated before we got to those high rates of adherence.”


Scientists Create Stable, Self-renewing Neural Stem Cells

Abundant precursor cells can become many types of neurons without introducing tumor risk

In a paper published in the April 25 early online edition of the Proceedings of the National Academy of Sciences, researchers at the University of California, San Diego School of Medicine, the Gladstone Institutes in San Francisco and colleagues report a game-changing advance in stem cell science: the creation of long-term, self-renewing, primitive neural precursor cells from human embryonic stem cells (hESCs) that can be directed to become many types of neuron without increased risk of tumor formation.

“It’s a big step forward,” said Kang Zhang, MD, PhD, professor of ophthalmology and human genetics at Shiley Eye Center and director of the Institute for Genomic Medicine, both at UC San Diego. “It means we can generate stable, renewable neural stem cells or downstream products quickly, in great quantities and in a clinical grade (millions in less than a week) that can be used for clinical trials and, eventually, for clinical treatments. Until now, that has not been possible.”

Human embryonic stem cells hold great promise in regenerative medicine due to their ability to become any kind of cell needed to repair and restore damaged tissues. But the potential of hESCs has been constrained by a number of practical problems, not least among them the difficulty of growing sufficient quantities of stable, usable cells and the risk that some of these cells might form tumors.

To produce the neural stem cells, Zhang, with co-senior author Sheng Ding, PhD, a former professor of chemistry at The Scripps Research Institute and now at the Gladstone Institutes, and colleagues added small molecules in a chemically defined culture condition that induces hESCs to become primitive neural precursor cells, but then halts the further differentiation process.

“And because it doesn’t use any gene transfer technologies or exogenous cell products, there’s minimal risk of introducing mutations or outside contamination,” Zhang said. Assays of these neural precursor cells found no evidence of tumor formation when introduced into laboratory mice.

By adding other chemicals, the scientists are able to then direct the precursor cells to differentiate into different types of mature neurons, “which means you can explore potential clinical applications for a wide range of neurodegenerative diseases,” said Zhang. “You can generate neurons for specific conditions like amyotrophic lateral sclerosis (ALS or Lou Gehrig’s disease), Parkinson’s disease or, in the case of my particular research area, eye-specific neurons that are lost in macular degeneration, retinitis pigmentosa or glaucoma.”

The new process promises to have broad applications in stem cell research. The same method can be used to push induced pluripotent stem cells (stem cells artificially derived from adult, differentiated mature cells) to become neural stem cells, Zhang said. “And in principle, by altering the combination of small molecules, you may be able to create other types of stem cells capable of becoming heart, pancreas, or muscle cells, to name a few.”

The next step, according to Zhang, is to use these stem cells to treat different types of neurodegenerative diseases, such as macular degeneration or glaucoma in animal models.

Funding for this research came, in part, from grants from National Institutes of Health Director’s Transformative R01 Program, the National Institute of Child Health and Development, the National Heart, Lung, and Blood Institute, the National Eye Institute, the National Institute of Mental Health, the California Institute for Regenerative Medicine, a VA Merit Award, the Macula Vision Research Foundation, Research to Prevent Blindness, a Burroughs Wellcome Fund Clinical Scientist Award in Translational Research and the Richard and Carol Hertzberg Fund.

Co-authors of the study include Wenlin Li, Yu Zhang, Wanguo Wei, Rajesh Ambasudhan, Tongxiang Lin, Janghwan Kim, Department of Chemistry, The Scripps Research Institute; Woong Sun, Xiaolei Wang, UCSD Institute for Genomic Medicine and Shiley Eye Center, Department of Anatomy, Korea University College of Medicine, Seoul, Korea; Peng Xia, Maria Talantova, Stuart A. Lipton, Del E. Webb Center for Neuroscience, Aging and Stem Cell Research, Sanford-Burnham Medical Research Institute; Woon Ryoung Kim, Department of Anatomy, Korea University College of Medicine, Seoul, Korea.

Image 1: This image depicts cultured, self-renewing primitive neural precursors derived from human embryonic stem cells using molecule inhibitors. Credit: UC San Diego School of Medicine

Image 2: This image depicts stained mature neurons, derived from precursor cells, expressing the neurotransmitter dopamine. Credit: UC San Diego School of Medicine


Low IQ Could Result From Lack Of Motivation

According to new research, intelligence tests are as much a measure of motivation as they are of mental ability.

Researchers from Pennsylvania found that a high IQ score required both high intelligence and high motivation, but a low IQ score could be the result of a lack of either.

The team also found that incentives helped increase IQ scores by a noticeable margin.

The researchers analyzed previous studies of how material incentives affected the performance of over 2,000 people in intelligence tests.

Researchers from the University of Pennsylvania, Philadelphia, found that incentives increased all IQ scores, but particularly for those with lower baseline IQ scores.

The team tested how motivation affected the results of IQ tests, as well as predictions of intelligence and performance later in life.

They were able to conclude that some individuals try harder than others in conditions where the stakes are low.

The study says, “relying on IQ scores as a measure of intelligence may overestimate the predictive validity of intelligence.”

Achieving a high score on an IQ test requires high intelligence and competitive tendencies.

Dr James Thompson, senior honorary lecturer in psychology at University College London, said in a statement that it has always been known that IQ test results are a combination of innate ability and other variables.

“Life is an IQ test and a personality test and an IQ result contains elements of both (but mostly intelligence). If an IQ test doesn’t motivate someone then that is a good predictor in itself.”

The study was published in the Proceedings of the National Academy of Sciences.

Angela Lee Duckworth, a psychologist who led the study, said in a statement: “IQ scores may predict various outcomes in life, but in part for reasons that intelligence tests weren’t designed for.”

“I hope that social scientists, educators, and policy-makers turn a more critical eye to any kind of measure, intelligence or otherwise, as how hard people try could be as important to success in life as intellectual ability itself.”


Electronic Cigarettes To Get FDA Regulation

Electronic cigarettes that are marketed for non-therapeutic use will be regulated by the U.S. Food and Drug Administration (FDA) as a tobacco product, the agency announced after a federal court ruled that the products were not a drug or medical device, and cannot be regulated as such.

The e-cigarette maker Sottera Inc., based in Scottsdale, Arizona, and doing business as Njoy, argued in the federal lawsuit that its products are tobacco products and not drugs. These battery-powered devices generate a nicotine vapor instead of smoke and are marketed as a tobacco alternative for “smoking pleasure” rather than therapeutic use, the company says.

E-cigarette supporters say that these devices provide an alternative to traditional cigarettes and may be less harmful due to the lack of smoke produced.

On the other side, the FDA argued that e-cigarettes might work on smokers in much the same way methadone clinics wean heroin addicts: by giving them a less harmful form of an addictive substance, reports Bloomberg.

And the American Lung Association, which opposes smoking, has urged the FDA to suspend sales of e-cigarettes until the manufacturers prove in clinical trials that their products are safe and effective.

In December, the U.S. Court of Appeals in Washington ruled that the FDA can regulate e-cigarettes only as a tobacco product as long as these were not marketed for therapeutic purposes.

The decision allows the FDA to review new e-cigarette products before they are sold, but it cannot require the manufacturers to conduct the animal and human studies required for FDA approval of drugs and medical devices, reports Bloomberg.

“The government has decided not to seek further review of this decision, and FDA will comply with the jurisdictional lines established” by the ruling, a letter by Lawrence Deyton, director of the agency’s Center for Tobacco Products, and Janet Woodcock, director of the Center for Drug Evaluation and Research stated.

The Campaign for Tobacco-Free Kids said that it was disappointed the U.S. government would not appeal the federal appeals court ruling.

According to the group, the ruling has opened a loophole that lets manufacturers add nicotine to products, allowing them to bypass the regulations that traditionally apply to smoking cessation medications and other non-tobacco products that include nicotine, reports the Telegraph.

Deyton and Woodcock say the agency is considering whether to issue rules or industry guidance on what types of marketing would qualify as “therapeutic.”

In a separate summary of its letter, the FDA says that e-cigarettes that it determines to be marketed for therapeutic purposes “will continue to be regulated as drugs and/or devices.”

Five e-cigarette companies were warned last year that they were illegally marketing their products as smoking-cessation aids because they had not obtained approval from the FDA as drug-delivery devices, reports Bloomberg.

Deyton and Woodcock do not say whether the e-cigarette companies are still making therapeutic claims, and an agency spokesman did not immediately respond to an email from Bloomberg seeking comment.

Authority to regulate tobacco products that are not marketed as drugs or medical devices was given to the FDA in 2009.

The FDA will soon be subjecting e-cigarette companies to the same rules that already exist for makers of regular cigarettes. For example, they will be required to provide the government agency with a list of their product ingredients, the agency says on its website.

“We look forward to working with the FDA toward the creation of a regulatory framework that we can all work under together,” Sottera President Craig Weiss told Bloomberg in a phone interview.

The company makes no health or therapeutic claims and will “flourish in a regulatory environment,” Weiss says.

The Tobacco Vapor Electronic Cigarette Association (TVECA), a group that represents companies that make such products, said in an email to the Telegraph that it always wanted the electronic cigarettes to be regulated as tobacco products.

The chief executive of TVECA says the electronic cigarette is 14,000 times less harmful than a regular cigarette and does not alter mind or body functions.

“This product delivers five ingredients. All five are approved by the FDA,” he says.

These battery-operated devices consist of a heating element and a cartridge containing nicotine in a liquid suspension. When the user inhales, the liquid is heated and a vapor is emitted. The nicotine is extracted from tobacco plants.

According to the U.S. Centers for Disease Control and Prevention, smokers spend $1.2 billion on smoking-cessation products and $80 billion on cigarettes a year.

Some smoking-cessation products on the market include prescription nasal sprays and over-the-counter gums, patches and lozenges that have been approved as drugs by the FDA.

Genetically Modified Virus Helps Power Solar Cell

MIT reported on Monday that researchers have used a genetically modified virus to produce structures that improve solar-cell efficiency by about one-third.

MIT researchers said they have found a way to make significant improvements to the power-conversion efficiency of solar cells by using a tiny virus to perform detailed assembly work at the microscopic level.

Sunlight hits a light-harvesting material in a solar cell, which releases electrons that can produce an electric current. 

The new study builds on findings that carbon nanotubes can enhance the efficiency of electron collection from a solar cell’s surface.

However, previous attempts to use the nanotubes had been thwarted by two problems.

The first problem is that creating carbon nanotubes generates a mix of two types, some of which act as semiconductors and some as metals.

The new research has shown that the effects of these two types tend to be different. 

The second problem is that nanotubes clump up together, which reduces their effectiveness.

Graduate students Xiangnan Dang and Hyunjung Yi – working with Angela Belcher, the W. M. Keck Professor of Energy, and several other researchers – found that a genetically engineered version of a virus called M13 can be used to control the arrangement of the nanotubes on a surface, keeping the tubes separate so they cannot short out the circuits.

The system the researchers tested used a type of solar cell called dye-sensitized solar cells, which is a lightweight and inexpensive type where the active layer is composed of titanium dioxide.

The researchers said the same technique could be applied to other types of solar cells as well.

During the study the team enhanced the power conversion efficiency to 10.6 percent from 8 percent by adding the virus-built structures.

This improvement takes place even though the viruses and the nanotubes make up just 0.1 percent by weight of the finished cell.
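The “about one-third” improvement reported above follows directly from the quoted efficiencies; a quick arithmetic check using only the figures in the article:

```python
# Quick check of the reported efficiency gain, using the article's
# quoted figures: 8 percent before, 10.6 percent after.
before, after = 8.0, 10.6  # power-conversion efficiency, percent

relative_gain = (after - before) / before
print(round(relative_gain, 3))  # 0.325, i.e. roughly a one-third improvement
```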

“A little biology goes a long way,” Angela Belcher said in a press release.

The first step is for the energy of the light to knock electrons loose from the solar-cell material.  After this, those electrons need to be funneled toward a collector, from which they can form a current that flows to charge a battery or power a device.

The team has previously used differently engineered versions of the same virus to enhance the performance of batteries and other devices.  However, Belcher said that the method used to enhance solar cell performance is different.

The research was funded by the Italian company Eni, through the MIT Energy Initiative’s Solar Futures Program. 

It was published online this week in the journal Nature Nanotechnology.

Image Caption: In this diagram, the M13 virus consists of a strand of DNA (the figure-8 coil on the right) attached to a bundle of proteins called peptides, the virus coat proteins (the corkscrew shapes in the center), which attach to the carbon nanotubes (gray cylinders) and hold them in place. A coating of titanium dioxide (yellow spheres) attached to dye molecules (pink spheres) surrounds the bundle. More of the viruses with their coatings are scattered across the background. Image: Matt Klug, Biomolecular Materials Group

UK Planning To Rid Itself Of Invasive Parakeet

The British government said this week that a species of parakeet which threatens wildlife and crops will be removed from the wild.

The Department for Environment, Food and Rural Affairs (DEFRA) said in a statement that the monk parakeet was an invasive species.

It announced measures to rehouse the birds, remove their nests or shoot them.

DEFRA said there are about 100 of the birds in the U.K., mainly in the southeast of England.

DEFRA said the birds have the potential to threaten “national infrastructure,” even though they have not caused any damage yet.

It said extensive damage to crops had been reported in both North and South America, and the birds could cause power cuts when their nests were built on electricity pylons.

A spokesman for DEFRA said it would try to rehouse the birds and, if that fails, their nests would be moved.

“In extreme cases, it could mean we have to shoot some, but we haven’t tried that yet,” he said.

The 1ft-tall parakeets build huge communal nests and are identified by their green bodies, yellowish bellies, pale grey faces and breasts, and pale bills.

“Control work is being carried out as part of a DEFRA initiative to counter the potential threat monk parakeets pose to critical national infrastructure, crops and native British wildlife,” a DEFRA spokesman said in a statement.

“This invasive species has caused significant damage in other countries through nesting and feeding activity and we are taking action now to prevent this happening in the UK.”

A spokesman for the RSPB bird conservation group said: “Our understanding is that they are going to be brought into captivity; we don’t see it’s necessary for them to be culled.

“We’re happy action is taking place in that they’re being removed from the wild.

“It’s a small population at large, and the birds are colonial and concentrated in one or two sites, so it will be possible to deal with them if we think it could be a problem.”

Anti-helium Discovered In The Heart Of STAR

Berkeley Lab nuclear scientists join with their international colleagues in the latest record-breaking discovery at RHIC

Eighteen examples of the heaviest antiparticle ever found, the nucleus of antihelium-4, have been made in the STAR experiment at RHIC, the Relativistic Heavy Ion Collider at the U.S. Department of Energy’s Brookhaven National Laboratory.

“The STAR experiment is uniquely capable of finding antihelium-4,” says the STAR experiment’s spokesperson, Nu Xu, of the Nuclear Science Division (NSD) at Lawrence Berkeley National Laboratory (Berkeley Lab). “STAR already holds the record for massive antiparticles, last year having identified the anti-hypertriton, which contains three constituent antiparticles. With four antinucleons, antihelium-4 is produced at a rate a thousand times lower yet. To identify the 18 examples required sifting through the debris of a billion gold-gold collisions.”

Collisions of energetic gold nuclei inside STAR briefly recreate conditions in the hot, dense early universe only millionths of a second after the big bang. Since equal amounts of matter and antimatter were created in the big bang they should have completely annihilated one another, but for reasons still not understood, only ordinary matter seems to have survived. Today this excess matter forms all of the visible universe we know.

Roughly equal amounts of matter and antimatter are also produced in heavy-ion (gold nuclei) collisions at RHIC. The resulting fireballs expand and cool quickly, so the antimatter can avoid annihilation long enough to be detected in the Time Projection Chamber at the heart of STAR.

Ordinary nuclei of helium atoms consist of two protons and two neutrons. Called alpha particles when emitted in radioactive decays, they were found in this form by Ernest Rutherford well over a century ago. The nucleus of antihelium-4 (the anti-alpha) contains two antiprotons bound with two antineutrons.

The most common antiparticles are generally the least massive, because it takes less energy to create them. Carl Anderson was the first to find an antiparticle, the antielectron (positron), in cosmic ray debris in 1932. The antiproton (the nucleus of antihydrogen) and the antineutron were created at Berkeley Lab’s Bevatron in the 1950s. Antideuteron nuclei (“anti-heavy-hydrogen,” made of an antiproton and an antineutron) were created in accelerators at Brookhaven and CERN in the 1960s.

Each extra nucleon (called a baryon) increases the particle’s baryon number, and in the STAR collisions every increase in baryon number decreases the rate of yield roughly a thousand times. The nucleus of the antihelium isotope with only one neutron (antihelium-3) has been made in accelerators since 1970; the STAR experiment produces many of these antiparticles, which have baryon number 3. The antihelium nucleus with baryon number 4, just announced by STAR based on 16 examples identified in 2010 and two examples from an earlier run, contains the most nucleons of any antiparticle ever detected.

“It’s likely that antihelium will be the heaviest antiparticle seen in an accelerator for some time to come,” says STAR Collaboration member Xiangming Sun of Berkeley Lab’s NSD. “After antihelium the next stable antimatter nucleus would be antilithium, and the production rate for antilithium in an accelerator is expected to be well over two million times less than for antihelium.”

NSD’s Maxim Naglis adds, “Finding even one example of antilithium would be a stroke of luck, and would probably require a breakthrough in accelerator technology.”
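The rarity described above can be sketched as rough back-of-the-envelope arithmetic, assuming (as the article states) roughly a factor-of-1000 suppression per additional unit of baryon number. These are order-of-magnitude illustrations, not STAR data:

```python
# Illustrative scaling: each extra unit of baryon number costs roughly
# a factor of 1000 in production rate (figure quoted in the article).
SUPPRESSION = 1000.0

def relative_yield(baryon_number: int) -> float:
    """Yield relative to a baryon-number-1 antiparticle (antiproton)."""
    return SUPPRESSION ** -(baryon_number - 1)

# Antihelium-4 (baryon number 4) versus antihelium-3 (baryon number 3):
ratio = relative_yield(4) / relative_yield(3)
print(round(ratio, 6))  # 0.001 -- "a thousand times lower yet"

# Antilithium is expected to be well over 2 million times rarer still.
# At 18 antihelium-4 nuclei per billion collisions, even one antilithium
# would take on the order of 10**14 collisions:
collisions_per_antilithium = 1e9 / 18 * 2e6
print(f"{collisions_per_antilithium:.1e}")  # ~1.1e+14
```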

If antihelium made by accelerators is rare, and heavier antiparticles rarer still, what of searching for these particles in space? The Alpha Magnetic Spectrometer (AMS) experiment, scheduled to be launched on one of the last space-shuttle missions to the International Space Station, is an instrument designed to do just that. A principal part of its mission is to hunt for distant galaxies made entirely of antimatter.

“Collisions among cosmic rays near Earth can produce antimatter particles, but the odds of these collisions producing an intact antihelium nucleus are so vanishingly small that finding even one would strongly suggest that it had drifted to Earth from a distant region of the universe dominated by antimatter,” explains Hans Georg Ritter of Berkeley Lab’s NSD. “Antimatter doesn’t look any different from ordinary matter, but AMS finding just one antihelium nucleus would suggest that some of the galaxies we see are antimatter galaxies.”

Meanwhile the STAR experiment at RHIC, which has shown that antihelium does indeed exist, is likely to hold the world record for finding the heaviest particle of antimatter for the foreseeable future.

This work was supported by the DOE Office of Science.

NOTES:

STAR’s Time Projection Chamber is a cylinder filled with dilute gas and placed in a uniform (solenoidal) magnetic field (STAR stands for Solenoidal Tracker at RHIC). Charged particles created in beam collisions ionize the gas as they streak through it; how much the trails bend in the magnetic field reveals their momentum. Detecting antihelium also depends on measuring particle mass, which can be learned from how much energy the particles lose in flight and how long it takes them to reach the sides of the cylinder.

A series of “event triggers” picks out particles matching the specifications of those the researchers are looking for. The data generated by STAR travel to the National Energy Research Scientific Computing Center (NERSC), managed by Berkeley Lab, via DOE’s high-bandwidth Energy Sciences Network (ESnet).

Time projection chambers were invented by David Nygren of Berkeley Lab’s Physics Division and are used at accelerators around the world; the time-projection-chamber principle inspired the original proposals for the STAR experiment within Berkeley Lab’s NSD. STAR’s Time Projection Chamber was designed by NSD’s Howard Wieman, built by Berkeley Lab physicists and engineers, and shipped to Brookhaven on a C-5 cargo plane.

The installation of a large time-of-flight detector in 2009 added measurement capabilities to STAR that were vital to the identification of antihelium-4. It was constructed jointly by U.S. and Chinese institutions and jointly funded by DOE’s Office of Science and the National Natural Science Foundation of China, China’s Ministry of Science and Technology, and the Chinese Academy of Sciences.

Image 1: Roughly equal amounts of matter and antimatter are created in the collision of energetic gold nuclei inside STAR, but because the fireball expands and cools quickly, antimatter can survive longer than that created in the big bang. In this collision an ordinary helium-4 nucleus (background) is matched by a nucleus of antihelium-4 (foreground). Credit: STAR Collaboration and Lawrence Berkeley National Laboratory

Image 2: The track of an antihelium-4 nucleus, highlighted in red, appears in a myriad of tracks produced by a gold-gold collision inside the STAR detector at RHIC. STAR’s Time Projection Chamber measures the momentum and mass of the collision events, which are sifted to select particles with specific characteristics. Credit: STAR Collaboration

Probing The Impact Of Climate Change

What do polar bears, hummingbirds, clams, bowhead whales and invasive plant species have to do with Earth science spacecraft orbiting overhead 24/7? Soon observations from NASA’s Earth-observing satellites of our planet’s climate will be brought to bear on understanding how different species and ecosystems respond to climate changes and developing tools to better manage wildlife and natural resources.

NASA has joined with the U.S. Geological Survey, National Park Service, U.S. Fish and Wildlife Service and Smithsonian Institution to initiate new research and applications efforts that will bring the global view of climate from space down to Earth to benefit wildlife and key ecosystems.

This is the first time NASA has targeted research investigating the intersection of climate and biological studies. The projects are sponsored by the Earth Science Division in NASA’s Science Mission Directorate.

The wildlife species to be studied include polar bears in Greenland, bowhead whales in the Arctic Ocean, and migratory birds and waterfowl in the United States. Other studies will focus on species of commercial interest such as clams, oysters and other bivalves in U.S. coastal waters, and Atlantic bluefin tuna in the Gulf of Mexico.

To learn more about climatic effects on plants, researchers will focus on the loss of cordgrass marshes in coastal wetlands of the southeastern states. They also will examine the stresses to native tree species, many of commercial value, across the western states and Canada.

“We know very little about how the majority of species and ecosystems will respond to environmental changes related to changing climates,” said Woody Turner, manager of NASA’s Ecological Forecasting program in Washington. “These projects bring together NASA’s global satellite data of the physical environment with ground-based data on specific species and ecosystems and computer modeling to detect and understand biological responses to climate.”

The studies will use long-term observations of Earth from space, including data on sea surface temperature, vegetation cover, rainfall, snow cover, sea ice and the variability in the microscopic marine green plants that form the base of ocean food chains.

Below are the 15 new projects and their principal investigators:

Bird Populations and Extreme Climate Events
Patricia Heglund, U.S. Fish and Wildlife Service, La Crosse, Wisconsin
Project title: “Effects of extreme climate events on avian demographics: The role of refugia in mitigating climate change”

Bowhead Whales
Elizabeth Holmes, National Oceanic and Atmospheric Fisheries Service, Seattle, Wash.
Project title: “Forecasting changes in habitat use by bowhead whales in response to Arctic climate change”

Clams, Oysters and Other Bivalves
David Wethey, University of South Carolina, Columbia
Project title: “Physiological impacts of climate change using remote sensing: An integrative approach to predicting patterns of species abundance and distribution and thresholds of ecosystem collapse”

Coastal Salt Marshes
Ilka Feller, Smithsonian Environmental Research Center, Edgewater, Md.
Project title: “Sensitivity of coastal zone ecosystems to climate change”
Website: http://www.serc.si.edu/labs/animal_plant_interaction/Trail/VirtualTour.html

Elk and Caribou
Mark Hebblewhite, University of Montana, Missoula
Project title: “Global population dynamics and climate change: Comparing species-level impacts on two contrasting large mammals”

Global Biodiversity of Land Vertebrates
Walter Jetz, Yale University, New Haven, Conn.
Project title: “Integrating global species distributions, remote-sensing information and climate station data to assess recent biodiversity response to climate change”

Habitat Modeling
Jeff Morisette, U.S. Geological Survey Fort Collins (Colo.) Science Center
Project title: “Using the U.S. Geological Survey’s ‘Resources for Advanced Modeling’ to connect climate drivers to biological responses”
Website: http://www.fort.usgs.gov/RAM/

Hummingbird Diversity
Catherine Graham, Stony Brook University, Stony Brook, N.Y.
Project title: “Combining remote-sensing and biological data to predict the consequences of climate change on hummingbird diversity”
Website: http://www.hummonnet.org/index.html

Migratory Species
Gil Bohrer, Ohio State University, Columbus
Project title: “Discovering relationships between climate and animal migration with new tools for linking animal movement tracks with weather and land surface data”
Website: http://www.movebank.org

Migratory Fish Habitat
Mitchell Roffer, Roffer’s Ocean Fishing Forecasting Services, Inc., Melbourne, Fla.
Project title: “Management and conservation of Atlantic bluefin tuna (Thunnus thynnus) and other highly migratory fish in the Gulf of Mexico under International Panel on Climate Change (IPCC) climate change scenarios: A study using regional climate and habitat models”
Website: http://www.roffs.com/NASA_NMFSBluefinTuna.html

Native Tree Species
Richard Waring, Oregon State University College of Forestry, Corvallis
Project title: “Mapping of stress on native tree species across the western United States and Canada: Interpretation of climatically induced changes using a physiologically based approach”
Website: http://www.pnwspecieschange.info/

Plants, Prey and Predators
David Mattson, U.S. Geological Survey Colorado Plateau Research Station, Flagstaff, Ariz.
Project title: “Spatial responses to climate across trophic levels: Monitoring and modeling plants, prey, and predators in the intermountain western United States”

Polar Bears
Kristin Laidre, University of Washington, Seattle
Project title: “Climate change, sea ice loss, and polar bears in Greenland”

“WhaleWatch”
Helen Bailey, University of Maryland Center for Environmental Science, Solomons, Md.
Project title: “WhaleWatch: A tool using satellite telemetry and remote-sensing environmental data to provide near real-time predictions of whale occurrence in the California Current System to reduce anthropogenic impacts”

Wildlife and Ecosystem Management
Andrew Hansen, Montana State University, Bozeman
Project title: “Using NASA resources to inform climate and land use adaptation: Ecological forecasting, vulnerability assessment, and evaluation of management options across two U.S. Department of Interior Landscape Conservation Cooperatives”

Image Caption: Satellite observations of vegetation on land and microscopic marine plants that form the base of ocean food chains are some of the NASA datasets that will be used in the new studies. (Credit: NASA)

Behavior Problems In Childhood Start With ‘Cry-babies’

Infants who cry excessively and have problems sleeping and feeding face a greater risk of serious behavior problems later in life, scientists reveal in a study to be published in the journal Archives of Disease in Childhood.

Persistent crying, sleeping and/or feeding problems are known as regulatory problems.

Researchers found that around 20% of all infants show some sign of these regulatory problems in their first year of life. These problems can disrupt families and increase costs for health services, say the researchers.

Previous studies have suggested that these regulatory problems affect behavioral or cognitive development later in childhood, but no conclusive evidence had been found, the study says.

Researchers from the University of Basel in Switzerland, the University of Warwick in the U.K. and the University of Bochum in Germany analyzed 22 studies conducted between 1987 and 2006.

Collectively the analysis involved 16,848 children, of whom 1,935 had regulatory problems.

The researchers analyzed possible associations between regulatory problems in early infancy and childhood behavioral problems.

Of the 22 studies, ten examined the consequences of excessive crying, four looked at sleeping problems, three at feeding problems and five at multiple regulatory problems.

The researchers divided childhood behavior problems into four categories: internalizing (anxiety, depression or withdrawal); externalizing (aggressive or destructive behavior, conduct problems or temper tantrums); attention deficit/hyperactivity (ADHD); and general behavior problems.

The study could not determine whether these issues in infancy cause behavioral problems later in life; they may instead be an early symptom of those later problems.

However, infants with regulatory problems were more likely to have behavior problems as children than infants without any of these problems, say the researchers.

The analysis by researchers found that children who had regulatory problems in infancy were more likely to have externalizing problems and ADHD.

A major reason why parents seek professional help is due to their concerns about their baby’s crying, sleeping, or feeding problems.

Professor Mitch Blair, officer for health promotion at The Royal College of Pediatrics and Child Health, points out that this “is an important study.”

He told BBC News that the study “really reinforces the need for attention at an early stage to prevent issues later in childhood,” and that parents are good at knowing when something is wrong with their children.

However, “It would be wrong for people to get overly alarmed. I don’t think on the basis of this report people should be going to their GPs,” Jane Valente, a consultant pediatrician at Great Ormond Street Hospital, told BBC News.

She says, “If a baby is not behaving like other babies it is probably worth discussing with a midwife or health visitor.”

A baby with multiple risk factors is even more likely to develop behavior problems, the research shows.

Clinically referred children often also come from families that have a range of risk factors such as obstetric, interactional, or psychosocial problems.

“It is about a 100% increase in risk, a doubling of risk of behavioral problems with excessive crying, sleeping and eating problems,” Professor Dieter Wolke, from the University of Warwick, told BBC News.

Professor Wolke says that while there are treatments for crying, feeding and sleeping problems in babies, there is no research assessing their impact later in life.

He adds, “If you could prevent behavioral problems with an early intervention, in a public health-sense it could be very important.”

Scientists Tracking Black Carbon In The Arctic

Researchers from six countries are in the Arctic studying the potential role that soot, or black carbon, has on the rapidly changing Arctic climate.

Although the Arctic is typically viewed as a vast white wasteland, scientists believe a thin layer of soot is causing it to absorb more heat. They want to find out if that is the main reason for the recent rapid warming of the Arctic, which could have an impact on the world’s climate for years to come.

Black carbon is produced by vehicle engines, aircraft emissions, burning forests and wood- and coal-burning stoves.

“Carbon is dark in color and absorbs solar radiation, much like wearing a black shirt on a sunny day. If you want to be cooler, you would wear a light-colored shirt that would reflect the sun’s warmth,” said Tim Bates, a research chemist at NOAA’s Pacific Marine Environmental Laboratory (PMEL) in Seattle and co-lead of the U.S. component of the study.

“When black carbon covers snow and ice, the radiation is absorbed, much like that black shirt, instead of being reflected back into the atmosphere,” explained Bates.

Scientists from the United States, Norway, Russia, Germany, Italy and China are participating in the Coordinated Investigation of Climate-Cryosphere Interactions (CICCI) project. The team’s goal is to coordinate more than a dozen research activities so they are done in tandem, providing, for the first time, a vertical profile of black carbon’s movement through the atmosphere, its deposition on snow and ice surfaces, and its effect on warming in the Arctic.

“The Arctic serves as the air conditioner of the planet,” said Patricia Quinn, a research chemist at the National Oceanic and Atmospheric Administration’s (NOAA) Pacific Marine Environmental Laboratory (PMEL).

Heat from other regions of the Earth moves to the Arctic via circulating air and ocean currents, where some of that warmth does radiate into space along its journey. At the same time, some of the incoming heat from the sun that tends to be absorbed in other regions is reflected by ice and snow, allowing the polar regions to help cool the planet.

But soot in the Arctic could be changing that game.

In recent years, the Arctic has been warming more rapidly than other areas of the planet and the “warming of the Arctic has implications not just for polar bears, but for the entire planet,” Quinn told the Associated Press (AP).

“We need to better understand the behavior of black carbon in the Arctic,” said Quinn. “This coordinated study will give us a snapshot so we can see all of it at once.”

For the NOAA part of the study, called Soot Transport, Absorption, and Deposition Study (STADS), conducted between April 7 and May 6, researchers are using two small unmanned aircraft outfitted with sensors to take samples of the air in the Arctic. 

The aircraft will measure aerosol size, number, light absorption and chemical composition. A PMEL sensor will also be outfitted on the aircraft for the first time.

Also, a Norwegian unmanned aircraft will measure incoming radiation from the sun and the reflectivity of snow and ice covered surfaces.

NOAA will also collect falling and newly fallen snow to check levels of black carbon and for chemical tracer analysis. Chemical tracers yield information on the source of the black carbon, which is necessary for creating strategies on how best to deal with the impact black carbon has on the Arctic climate.

Also, Robert Stone of the Cooperative Institute for Research in Environmental Sciences will lead a study to measure reflectivity using a snowmobile-pulled sled on Svalbard at the Holtadalfohna Plateau, Kongsfjord fast ice, and the sea ice north of Spitsbergen.

Cutting carbon dioxide and other greenhouse gases is the mainstay of any effort to fight global warming, especially within the Arctic, said Quinn.

But studies suggest that cutting the concentration of short-lived pollutants, such as soot, will reduce the rate of warming in the Arctic faster than cuts in carbon dioxide and other greenhouse gases, which last much longer in the atmosphere, she said.

The UN Environment Program urged cuts in soot emissions in February. The call came for a variety of reasons, including the threat to human health from inhaling soot and the potential warming of the polar regions.

The Arctic Council, which represents eight countries that border the Arctic, is deciding whether to seek reductions in soot from other nations and will be using data from the new research project to help make its decisions.

The surface air temperature in the Arctic has increased about twice as fast as the global average rate over the past 100 years, Quinn told AP in an interview by email. “Over the past 50 years, annual average surface air temperatures have increased from 2 to 3 degrees Celsius (3.6 to 5.4 degrees Fahrenheit) in Alaska and Siberia. The annual average temperature globally has increased by about 0.7 degree C (1.3 F) over the same time period.”

The Arctic warming has resulted in an earlier spring melt, a longer melting season, and a decrease in the extent of sea ice, she said. That raises concerns for polar bears, which depend on sea ice to hunt for food. When the highly reflective snow and ice melt, the darker surfaces of land and water that are beneath absorb more heat, adding to the growing global warming problem.

Soot’s greatest effect in the Arctic is likely to be warming caused by the particles floating in the atmosphere, according to Jack Dibb, an atmospheric chemist at the University of New Hampshire. It may also reduce the reflectivity of the snow and ice enough to cause warming, but that is going to be harder to document, he said in a telephone interview with AP.

The NOAA team includes leaders Bates and Quinn, as well as PMEL engineers Scott Stalin, Nick Delich, and Dirk Tagawa; Joint Institute for the Study of Atmosphere and Ocean scientists Jim Johnson and Drew Hamilton, and NOAA scientist Derek Coffman.

Participating institutions and research centers include the Arctic and Antarctic Research Institute, the Alfred Wegener Institute, the Institute of Atmospheric Sciences and Climate of the Italian National Research Council, the Chemistry Department of Florence University, the Norwegian Institute for Air Research, the Northern Research Institute, the Norwegian Institute for Polar Research, and the Chinese Academy of Meteorological Sciences.

Image 2: Trish Quinn shows off the first snow pit. Credit: NOAA-STAD

Prospects Grow For Blindness Recovery

Prospects for recovery of lost vision have brightened with the release of new scientific findings showing that the use of gentle near infra-red light can reverse damage caused by exposure to bright light, up to a month after treatment.

The Vision Centre’s Dr Krisztina Valter and doctoral researcher Rizalyn Albarracin have successfully demonstrated recovery of vision cells in the retina following near infra-red treatment applied after damage was sustained.

Their advance has raised hopes for the development of a practical, low-cost and painless treatment for damaged eyes – including for patients suffering from dry macular degeneration (dry AMD), now the most common cause of blindness in developed countries.

The finding, made using an animal model, builds on the evidence the team has established showing that pre-treatment of eyes with near infra-red can help to minimise damage caused by bright light and enhance recovery.

“Macular degeneration is responsible for around a half of the cases of blindness in Australia. The dry form, for which there is still no cure, accounts for 80-90 per cent of cases,” says Dr Valter, of The Vision Centre and Australian National University. “Our research shows clear evidence of recovery of vision cells from light damage, a good model for what happens in dry AMD.”

“Given the very high costs of blindness to any economy, it is encouraging to know that there is a simple, affordable technology in prospect which could help to reduce it.”

Ms Albarracin said that treating the retina with just a few minutes' exposure to soft near infra-red light a day, for less than a week, produced a remarkable recovery in damaged photoreceptors (vision cells) that would ordinarily have died.

“You only have one set of vision cells, so if you lose them they can never be replaced. When they are damaged or stressed, they shut down and gradually die or kill themselves. You get a horrible ‘hot spot’ of dying cells in your retina, which gradually spreads out in a sort of domino effect until your vision is gone,” she explains.

“We have found that treating the cells before, during or even after light damage raises their protective factors and resistance to stress, and slowly allows their vision function to return. The retina looks really sick – but then it just bounces back. It’s almost a kind of a resurrection.”

Since few people know in advance that they may suffer vision damage from bright light and can be pre-treated, the finding that near infra-red treatment soon after injury also helps the cells heal is an important step towards a practical therapy for people who are losing their sight, whether from injury or from slow-onset conditions.

The technique could potentially be used to treat a wide range of forms of vision loss, including dry AMD, retinitis pigmentosa, inflammation of the retina and some diseases of the optic nerve, the researchers say.

“We’re using an array of small LEDs (light emitting diodes) that have been tuned to produce near infra-red light at a particular wavelength. These are fairly cheap, making a potential treatment very affordable – especially when you consider the overall costs of blindness,” Dr Valter says.

She says that the evidence yielded by the latest research is now so persuasive that the team could move to human trials this year, if they can secure a clinical partner.

“Near infra-red therapy is very benign and involves no discomfort to the patient. It is already approved by the US Food and Drug Administration for use in sports medicine, for hair loss and so on – so developing a novel therapeutic application for the eyes is likely to be less complex and protracted than, say, developing a new drug,” she adds.

Their paper, “Photobiomodulation protects the retina from light-induced photoreceptor degeneration,” by Rizalyn Albarracin, Janis Eells and Krisztina Valter, appears in the latest issue of the journal Investigative Ophthalmology & Visual Science.

The Vision Centre is funded by the Australian Research Council as the ARC Centre of Excellence in Vision Science.

Some Parents Not Happy With McDonald’s Kids Meals

The McDonald’s hamburger chain has responded to a lawsuit by suggesting that parents just tell their children “no” when it comes to buying Happy Meals if they do not want their children to have them. The lawsuit accuses McDonald’s of unfairly using toys to lure children into its restaurants, Reuters is reporting.

The plaintiff, Monet Parham, a Sacramento, California mother of two, charges that the company’s advertising violates California consumer protection laws.

The long-available Happy Meal has been a huge hit for McDonald’s, in effect making the burger-and-fries seller one of the world’s largest toy distributors as well.

The use of toys in the Happy Meal has come under fire as well from public health officials, parents and lawmakers who are frustrated with rising childhood obesity rates and weak, mostly self-regulated, anti-obesity efforts from restaurant operators.

Parham, who filed suit last December, is represented by the Center for Science in the Public Interest, a nutrition advocacy group. In the lawsuit, Parham admits she frequently tells her children “no” when they ask for Happy Meals, McDonald’s said in Monday’s court filing.

McDonald’s, in asking for a dismissal, claims: “She was not misled by any advertising, nor did she rely on any information from McDonald’s.”

The suit was moved to a federal court at McDonald’s request, but the plaintiff insisted on seeing the lawsuit proceed before a California state judge. Should Parham’s lawsuit be allowed to proceed, it would spawn a host of other problematic legal proceedings, McDonald’s said.

“In short, advertising to children any product that a child asks for but the parent does not want to buy would constitute an unfair trade practice.”

Stephen Gardner, litigation attorney for the public interest group, said McDonald’s is using a cookie cutter approach to dismissing the lawsuit, with one key difference. “What is different about this motion is that McDonald’s has chosen to blame the victim — saying that it’s all Monet Parham’s fault if she doesn’t force her daughter to ignore the onslaught of McDonald’s marketing messages.”

“McDonald’s makes a lot of money by going around parents direct to kids, and it wants to continue with that strategy,” Gardner concluded.

Advertising to children is self-regulated in the food industry, with mixed results. The industry has also argued that government attempts to limit advertising infringe on free-speech protections. Restaurants and food manufacturers have successfully fended off obesity-related lawsuits for years and have even pushed through state laws that ban obesity-related lawsuits.

Charging Electric Cars At Night Better For The Ozone

Charging electric vehicles at night can reduce the harmful effects of ozone found at low levels of the Earth’s atmosphere, a new study has found.

In the stratosphere, ozone becomes a protective layer of film that filters out ultraviolet rays that can cause skin cancer and DNA mutations in plants, reports AFP.

However, ozone formed by the reaction of hydrocarbons and nitrogen oxides with sunlight can irritate the airways of people with cardiac or respiratory conditions, in addition to harming sensitive plants.

The popularity of Plug-in Hybrid Electric Vehicles (PHEVs) has been driven by high gasoline costs, increased efficiency, and a positive environmental impact due to the lack of exhaust fumes.

It is already known that charging PHEVs at night is much more cost-effective and reliable, and research has also found that nighttime charging can lead to lower levels of pollution on average.

Researchers at the Massachusetts Institute of Technology (MIT) and the University of Texas published their study in IOP Publishing’s journal Environmental Research Letters. They modeled the effects of replacing 20% of the vehicle miles travelled (VMT) by gasoline-run cars with PHEVs.

In the study, their computer model predicted the emissions of nitrogen oxides for 2018 in three different scenarios and in four major Texas cities: Dallas/Ft. Worth, Houston, Austin and San Antonio. Nitrogen oxides are the basic ingredient of ground-level ozone.

AFP reports that power generation for the Texas grid in 2009 was provided by 46% gas, 35% coal, 13% nuclear and 4.5% wind.

The first scenario involved charging the car at off-peak times during the night. The second was based on charging to maximize battery life (charging just before use, and only by the amount of charge needed to complete the trip); the third was charging the battery whenever it was convenient for the driver, usually just after vehicle use.

As a result, the study showed that the overall levels of pollution were lower from the electricity generating unit emissions associated with charging than the level of pollution that resulted from emissions associated with 20% of gasoline VMT.

Although nighttime charging yielded the highest amount of nitrogen oxides, it produced the least amount of ozone on average across all the cities because there was no sunlight to react with the emissions, the study showed.

By morning, pollutants are dispersed and diluted by wind and other processes.

“The results in general show positive air quality results due to the use of PHEVs regardless of charging scenario with the nighttime charging scenario showing the best results on average by a small margin,” says lead author Dr. Tammy Thompson of MIT.

“This further supports efforts to develop regulation to encourage nighttime charging; an example would be variable electricity pricing. As more of the fleet switches over to PHEVs and a larger demand is placed on the electricity grid, it will become more important that we design and implement policy that will encourage charging behaviors that are positive for both air quality and grid reliability.”

The researchers hope that the study’s findings will guide policies on how to encourage cleaner cars.

Brazilian Forests Win Big Just Before Earth Day

Lee Rannals for RedOrbit.com

Just days before Earth Day, tropical dry forests in the Brazilian state of Minas Gerais received good news as the Superior Court of Minas Gerais overturned a state law that had altered the protected status of the 6,000-square-mile forest.

The tropical dry forests can thank a project called Tropi-Dry for the success of overturning the law that left the area unprotected from logging.

The effort taken by these scientists shows that people can still make a difference on this planet.  The state law that was overturned would have allowed 70 percent of Minas Gerais’ forests to be cleared.

However, despite the victory in Minas Gerais, many of the world’s forests still face the threat of deforestation.

A National Geographic report said that land the size of Panama is stripped of its trees each year.

According to NASA, about half of all species on Earth live in only about 7 percent of the Earth’s tropical forests.

Experts believe that in less than 40 years, our planet has lost 8 percent of its rainforests.

Brazil led the pack of countries between the years of 1990 and 2005 by clearing over 42 million hectares of tropical forest regions. 

Let us all take an extra step this Earth Day by learning a little bit more about what we can do to help stop deforestation.

How can I help?

The simplest step to take for most would be to recycle the paper we use on a daily basis. 

According to A Recycling Revolution, if every American recycled just one-tenth of their newspapers, we would save about 25,000,000 trees a year.

The same source also said that about 1 billion trees worth of paper are thrown away every year in the U.S.

One step further would be to research which companies plant trees to replace the ones cut down to print that receipt you threw away.

Though deforestation remains a problem, acts taken by projects like Tropi-Dry deserve a big thumbs-up for helping this Earth Day look a little greener.

Image Caption: A Brazilian tropical dry forest during the rainy season, which can last 4-6 months. Credit: Diego Brandao

Researchers Halt AIDS Pill Study

Researchers are stopping tests of a daily pill to prevent infection with the AIDS virus in thousands of African women.

The study was stopped because partial results showed no sign that the drug was preventing infection.

The study found that women taking Truvada, made by Gilead Sciences Inc., were just as likely to get HIV as women who had been given dummy pills.  Researchers said that even if the study were to continue, it would not be able to determine whether the pills help prevent infection, since the results were already even this far along.

Another study last fall concluded that Truvada did help prevent infections in gay and bisexual men when given with condoms, counseling and other prevention services.  Many AIDS experts view that as a breakthrough that might help slow the epidemic.

Family Health International announced the new results on Monday.  The nonprofit group launched the study two years ago and had enrolled about half of the 3,900 women treated in Kenya, Tanzania and South Africa.  As of last week, 56 new HIV infections had occurred, half in each group.

No safety problems were seen with Truvada, but women taking it were more likely to become pregnant.

“That’s both a surprising finding and one that we can’t readily explain” by what is known so far about Truvada’s effects on women using hormonal contraceptives, Dr. Timothy Mastro of Family Health International told The Associated Press (AP).

The study was sponsored by the U.S. Agency for International Development and the Bill & Melinda Gates Foundation. Gilead provided the drugs for the study.

A study performed last year in South Africa found that a vaginal gel spiked with tenofovir cut a woman’s chance of getting HIV from an infected partner in half.  Protection was greater for those who used it most faithfully.

A similar effect was seen in the study of Truvada in gay men.  The drug lowered the chances of infection by 44 percent, and by 73 percent or more among men who took their pills faithfully.

Dr. Robert M. Grant of the Gladstone Institutes, a private foundation affiliated with the University of California, San Francisco, told AP that in the new study, “it’s difficult to understand why they did not see protection.”

Grant led the study of Truvada in gay men and said “we are very confident that this approach is useful” for them.

The new study’s result “must be seen as what it is – the closure of a single trial in a field that has generated exciting results in the recent past,” Mitchell Warren, head of the AIDS Vaccine Advocacy Coalition, told AP.

Truvada costs $5,000 to $14,000 a year in the U.S. but as little as $140 a year in some poor countries where it is available in generic form.

Researchers Link Alcohol-Dependence Impulsivity To Brain Anomalies

    * Alcohol dependence (AD) is strongly associated with impaired impulse control.
    * A new study used functional magnetic resonance imaging to examine impulsive choices among people with a range of alcohol use disorders (AUDs).
    * Findings suggest that impulsive choice in AD may be the result of functional anomalies in widely distributed but interconnected brain regions that are involved in cognitive and emotional control.

Researchers already know that alcohol dependence (AD) is strongly associated with impaired impulse control or, more precisely, the inability to choose large, delayed rewards rather than smaller but more immediate rewards. Findings from a study using functional magnetic resonance imaging (fMRI) to investigate the neural basis of impulsive choice among individuals with alcohol use disorders (AUDs) suggest that impulsive choice in AD may be the result of functional anomalies in widely distributed but interconnected brain regions that are involved in cognitive and emotional control.

Results will be published in the July 2011 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.

“Individuals with AD score higher on questionnaires that measure impulsivity – for example, ‘I act without thinking’ – are less able to delay gratification, and are less able to inhibit responses,” said Eric D. Claus, a research scientist with The Mind Research Network and first author of the study.

Given that impulsive choice in AUDs has been associated with impairment of frontal cortical systems involved in behavioral control, Claus explained, this study was designed to examine the neural correlates of one specific aspect of impulsivity: the ability to delay immediate gratification and instead choose rewards in the future.

“We investigated this choice process in individuals with alcohol use problems ranging from alcohol abuse to severe AD that required treatment,” said Claus. “This is the largest study to date that has investigated the neural correlates of impulsive choice in AD, which enabled us to examine the full range of AUDs instead of only examining extreme group differences.”

Claus and his colleagues examined 150 individuals (103 males, 47 females) with various degrees of alcohol use. All of the participants completed a delay discounting task – during which two options were presented, a small monetary reward (e.g., $10) available immediately or a larger monetary reward (e.g., $30) available after a delay (e.g., two weeks) – while undergoing fMRI. Impulsive choice was defined as selection of the more immediate option.
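Delay discounting tasks like this one are commonly modeled with a hyperbolic discount function, in which a delayed reward's subjective value is V = A / (1 + kD) and a larger k means steeper devaluation of future rewards. A minimal sketch of that standard model (the formula and the k values here are illustrative conventions from the discounting literature, not parameters reported in this study):

```python
def discounted_value(amount, delay_days, k):
    """Hyperbolic discounting: subjective value of a reward after a delay."""
    return amount / (1 + k * delay_days)

def choose(immediate, delayed, delay_days, k):
    """Which option does a chooser with discount rate k prefer?"""
    if immediate > discounted_value(delayed, delay_days, k):
        return "immediate"
    return "delayed"

# $10 now vs. $30 in two weeks, for a patient vs. a steeply discounting chooser
print(choose(10, 30, 14, k=0.01))  # low k: $30 in 14 days is still worth ~$26
print(choose(10, 30, 14, k=0.50))  # high k: $30 in 14 days is worth only $3.75
```

The steeper the discounting curve, the sooner the small immediate reward wins out, which is why a greater tendency to pick the immediate option is read as greater impulsivity.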

“We showed two things,” said Claus. “We replicated previous research by showing that AUD severity was associated with a greater tendency to discount future rewards. In addition, we showed that when individuals with more severe AUDs did delay gratification, they engaged the insula and supplementary motor area – regions involved in emotional processing and response conflict – to a greater degree than individuals with less severe AUDs. In summary, these findings suggest that the dysfunction in these regions is graded and increases as a function of AUD severity, rather than operating as an all-or-none function.”

“This work showed that the brains of alcoholics don’t behave all that differently from the brains of non-alcoholics during delay discounting but that the alcoholic brain had to work harder when they chose the delayed reward,” said Daniel W. Hommer, chief of the Section of Brain Electrophysiology & Imaging at the National Institute on Alcohol Abuse and Alcoholism. “Many different studies have shown similar results, that is, alcoholics have a greater increase in brain blood flow to perform the same task as non-alcoholics.”

“The current study suggests that the neural dysfunction underlying impulsive choice seems to increase with AD severity,” added Claus. “Now that we know that this neural dysfunction is associated with impulsivity, the next steps are to determine whether this impulsivity predates the onset of AD and whether neural measures of impulsivity can predict who will respond best to particular types of treatment. Further, the particular neural dysfunction that we observed indicates that individuals with more severe AD may be more impulsive because delaying gratification is aversive to their brain, and not because it is rewarding to be impulsive. Clinicians might need to deal directly with the aversion of choosing future benefits over immediate ones.”

“The most important thing about this paper is that it leads you to question what people mean by impulsive behavior and how should it be measured,” said Hommer. “The field has defined increased discounting of time – failure to delay gratification – as a good measure of impulsiveness, but the results reported in this paper say ‘Wait a minute, delay discounting does not correspond to what is usually meant by impulsiveness.’ Rather, brain activity during a delay discounting task looks more like how the brain responds during conflicted decision-making than it does during rapid, unconflicted choice of a highly valued goal.” Hommer added that this sort of debate is important to researchers, forcing them to think more carefully about what they mean by impulsive choice.

Cursing During Pain Does Make You Feel Better

Stubbed your toe lately, or smashed a finger? Then you very likely uttered an expletive or two and felt better. Researchers have found that letting it out – at least for those who don’t use expletives in normal, everyday speech – can lessen sudden and unexpected pain, the Daily Mail reports.

Researchers at Keele University wanted to find out whether cursing in response to pain had any actual benefits. Lead researcher Dr. Richard Stephens said the results show that swearing can release pain-killing endorphins – in other words, it really does make us feel better.

The 171 students involved in the study were divided into two groups – those who routinely keep their language socially acceptable, uttering fewer than 10 swear words a day, and those who had fewer qualms about speaking out, swearing up to 40 times daily.

Students were asked to dip their hands into ice water and hold them there as long as possible – first while repeating a non-swear word, then again while repeating a swear word of their choosing. The students were able to keep their hands submerged in the icy water for longer when repeating the swear word, establishing a link between swearing and increased pain tolerance.
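Because each student was measured twice, once per word type, this is a within-subject design, and the natural analysis compares each person's two submersion times as paired differences. A sketch of that paired comparison using entirely made-up times (the study's actual measurements are not published here):

```python
import math
from statistics import mean, stdev

def paired_t(condition_a, condition_b):
    """Paired t-statistic for within-subject differences (b - a)."""
    diffs = [b - a for a, b in zip(condition_a, condition_b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical seconds of hand submersion per student, by condition
neutral_word = [45, 60, 52, 70, 40, 55, 65, 48]
swear_word   = [75, 85, 70, 95, 60, 80, 90, 66]

t = paired_t(neutral_word, swear_word)
print(round(t, 2))  # a large positive t suggests longer tolerance while swearing
```

Pairing matters here: it removes each student's baseline pain tolerance from the comparison, so only the change between the two word conditions drives the statistic.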

The team believes the pain-lessening effect occurs because swearing triggers the “fight or flight” response. Accelerated heart rates of the students repeating the swear word may indicate an increase in aggression, in a classic fight or flight response of “downplaying feebleness in favor of a more pain-tolerant machismo.”

Swearing in this study was found to not only trigger an emotional response, but a physical one too, which may explain why cursing developed centuries ago and why it still persists today almost without thought.

Dr. Richard Stephens, who worked on the project, tells The Telegraph, “Swearing has been around for centuries and is an almost universal human linguistic phenomenon. It taps into emotional brain centers and appears to arise in the right brain, whereas most language production occurs in the left cerebral hemisphere of the brain.”

“Our research shows one potential reason why swearing developed and why it persists.”

However, Stephens adds that to maintain the analgesic qualities of swearing folks should save it up for when it really matters, when they are in genuine pain. “I think the benefit of swearing as a response to pain lies in the field either before medical intervention arrives or for minor injuries.”

“You stub your toe, you let fly with some expletives and you move on. But as our new study shows – if you overdo casual everyday swearing, then it seems that you would not get the benefit of letting fly with an expletive at that moment when you injure yourself,” Stephens concluded.

The research will be presented at the British Psychological Society’s Annual Conference in Glasgow (4-6 May).

Treating Tourette Syndrome Without Drugs

(Ivanhoe Newswire) — A new study shows the use of cognitive-behavioral therapy to treat tics in patients with Tourette syndrome may be as effective as using medication in certain cases.

Tourette syndrome is a neuropsychiatric disorder characterized by motor and vocal tics that worsen during childhood and peak around age 11. It affects up to 3 percent of school-age children and can persist into adulthood.

For the study, the research team looked at one group of 10 adults with Tourette syndrome and another group of 14 adults with no neurological or psychiatric problems. Participants were asked to perform a series of tasks to stimulate specific regions in the brain. An electroencephalogram was recorded with each task.

After six months of therapy, the participants performed the same tests again. Results showed a significant reduction in tics. After the behavioral treatment, researchers also observed a quantifiable normalization of brain activity, which is linked to improvement of symptoms in Tourette syndrome.

“This discovery could have major repercussions on the treatment of this illness. In some cases, the physiological measures could allow for the improvement of the therapy in order to tailor it to a specific type of patient,” Dr. Marc Lavoie, certified researcher at Fernand-Seguin Research Centre of the Louis-H. Lafontaine Hospital and with the Psychiatry Department of Université de Montréal, was quoted as saying.

SOURCE: International Journal of Cognitive Therapy, April 14, 2011

Radio Telescopes Could Help Find Exoplanets

Detecting exoplanets that orbit at large distances from their star remains a challenge for planet hunters.  Now, scientists at the University of Leicester have shown that emissions from the radio aurora of planets like Jupiter should be detectable by radio telescopes such as LOFAR, which will be completed later this year. Dr Jonathan Nichols will present results at the RAS National Astronomy Meeting in Llandudno, Wales, on Monday 18th April.

“This is the first study to predict the radio emissions by exoplanetary systems similar to those we find at Jupiter or Saturn.  At both planets, we see radio waves associated with auroras generated by interactions with ionized gas escaping from the volcanic moons, Io and Enceladus.  Our study shows that we could detect emissions from radio auroras from Jupiter-like systems orbiting at distances as far out as Pluto,” said Nichols.

Of the hundreds of exoplanets that have been detected to date, less than 10% orbit at distances where we find the outer planets in our own Solar System.  Most exoplanets have been found by the transit method, which detects a dimming in light as a planet moves in front of a star, or by looking for a wobble as a star is tugged by the gravity of an orbiting planet. With both these techniques, it is easiest to detect planets close in to the star and moving very quickly.
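The transit method's bias toward close-in planets follows from how small the signal is: the fractional dimming is roughly the square of the planet-to-star radius ratio, so rare, brief transits of distant planets are easy to miss. A quick sketch using standard approximate radii (the values below are textbook figures, not numbers from this article):

```python
def transit_depth(r_planet, r_star):
    """Fractional dimming when a planet crosses its star's disk: (Rp/Rs)^2."""
    return (r_planet / r_star) ** 2

# Approximate radii in kilometers (standard reference values)
R_SUN_KM = 695_700
R_JUPITER_KM = 69_911
R_EARTH_KM = 6_371

# A Jupiter-sized planet blocks about 1% of a Sun-like star's light;
# an Earth-sized planet blocks less than 0.01%.
print(f"Jupiter: {transit_depth(R_JUPITER_KM, R_SUN_KM):.4%}")
print(f"Earth:   {transit_depth(R_EARTH_KM, R_SUN_KM):.4%}")
```

Even when the dimming is detectable, a Jupiter-analog transits only once per 12-year orbit, which is why complementary techniques such as the radio-aurora detection proposed here are attractive for outer planets.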

“Jupiter and Saturn take 12 and 30 years respectively to orbit the Sun, so you would have to be incredibly lucky or look for a very long time to spot them by a transit or a wobble,” said Dr Nichols.

Dr Nichols examined how the radio emissions for Jupiter-like exoplanets would be affected by the rotation rate of the planet, the rate of plasma outflow from a moon, the orbital distance of the planet and the ultraviolet (UV) brightness of the parent star.

He found that, in many scenarios, exoplanets orbiting UV-bright stars between 1 and 50 Astronomical Units (AU) would generate enough radio power to be detectable from Earth.  For the brightest stars and fastest spinning planets, the emissions would be detectable from systems 150 light years away from Earth.

“In our Solar System, we have a stable system with outer gas giants and inner terrestrial planets, like Earth, where life has been able to evolve.  Being able to detect Jupiter-like planets may help us find planetary systems like our own, with other planets that are capable of supporting life,” said Dr Nichols.

Image Caption: Image of Jupiter’s northern UV auroras obtained using the Advanced Camera for Surveys onboard HST in February 2007. Credit: Nichols/ESA/NASA/HST

Over 1,300 Gray Wolves Taken Off Endangered List

Federal wildlife officials said on Friday that they will take over 1,300 gray wolves in the Northern Rockies off the endangered species list in the next 60 days.

An attachment to the budget bill signed into law Friday by President Barack Obama has taken away the wolves’ protection in five Western states.

This will be the first time Congress has taken a species off the endangered list.

Idaho and Montana are planning public wolf hunts this fall.  The states had similar plans last year, until a judge ruled that the species remained at risk.

Wolves in Wyoming are still at risk because of the state’s shoot-on-sight law for the species.

Oregon and Washington have not announced immediate plans to hunt the wolf populations.

FDA Approves The NovoTTF-100A System For The Treatment Of Patients With Recurrent Glioblastoma Multiforme (GBM) Brain Tumors

First ever medical device therapy indicated as an alternative to chemotherapy for cancer

Novocure today announced that the U.S. Food and Drug Administration (FDA) approved the NovoTTF-100A System (NovoTTF) for the treatment of adult patients with glioblastoma multiforme (GBM) brain tumors, following tumor recurrence after receiving chemotherapy. The portable, wearable device delivers an anti-mitotic, anti-cancer therapy as patients maintain their normal daily activities. The NovoTTF is a novel, first-in-class treatment option for patients and physicians battling glioblastoma.

“Our device provides patients and physicians with a novel, non-invasive alternative to chemotherapy that is safe and effective,” said Eilon Kirson, M.D., Ph.D., Novocure’s Chief Medical Officer. “The device allows for continuous treatment without the usual, debilitating side effects that chemotherapies inflict on recurrent GBM patients and indirectly on their families.”

Results from a 237 patient randomized pivotal trial demonstrated that compared to patients treated with chemotherapy, NovoTTF treated patients achieved comparable median overall survival times, had fewer side effects, and reported improved quality of life scores.

Glioblastoma is the most aggressive and most common form of primary brain tumor in the United States. The disease affects approximately 10,000 Americans each year. The median overall survival time from initial diagnosis is 15 months with optimal therapy, and median survival from the time of tumor recurrence is only three to four months without additional effective treatment. The disease is widely recognized as one of the most aggressive and deadly forms of cancer.

“We move forward from today proud of the efforts and accomplishments of our team, thankful to our investors for their support and guidance, and humbled by the trust of our patients and physicians,” said Asaf Danziger, CEO of Novocure. “Our next task is to make NovoTTF therapy available as a treatment option for all recurrent GBM patients in the US.”

“The FDA approval of the NovoTTF device is the culmination of ten years of research, development and clinical trials conducted by an exceptional team of scientists, engineers, and clinicians, and built on the original insights of our founder and CTO Yoram Palti, M.D., Ph.D.,” said William F. Doyle, Novocure’s executive chairman. “We look forward to bringing this device to recurrent GBM patients and their families, and we look forward to developing NovoTTF therapy for a range of additional solid tumor cancers.”

Pivotal Trial Results

The FDA approval was based on data from a randomized pivotal trial of 237 patients with glioblastoma tumors that had recurred or progressed despite previous surgical, radiation and chemotherapy treatments. Patients treated with the NovoTTF alone achieved an overall survival time comparable to that of patients treated with the physician’s choice of the best chemotherapy. The rate of progression-free survival at six months (PFS6) was 21% in the NovoTTF group compared to 15% in chemotherapy patients. Patients treated with the NovoTTF also had a 14% tumor response rate (RR) compared to 10% in chemotherapy-treated patients, and three complete radiographic responses were observed in the NovoTTF group compared to none in chemotherapy patients. NovoTTF-treated patients reported better quality-of-life scores and fewer side effects during the trial than patients treated with chemotherapy. Specifically, quality of life using the device was better than that of chemotherapy patients in the following subscale domains: vomiting, nausea, pain, diarrhea, constipation, cognitive functioning and emotional functioning, all of which are hallmarks of patient suffering while receiving chemotherapy. The most commonly reported side effect of NovoTTF treatment was a mild-to-moderate rash beneath the electrodes.

Mortality Rate Is Increased In Persons With Autism Who Also Have Epilepsy

Autism Speaks and Miami Children’s Hospital examined co-morbidity of autism and epilepsy in ATP brain donations and CA DDS data to examine differences in mortality rates when both conditions are present

A comprehensive investigation of brain tissue donated to the Autism Speaks Autism Tissue Program (ATP), a postmortem brain tissue donation program, determined that one-third of the brain donors with autism also had epilepsy. In addition, co-morbidity data from the California State Department of Developmental Services revealed a higher-than-expected rate of mortality in individuals with both autism and epilepsy compared with individuals with autism alone.

“Mortality in Individuals With Autism, With and Without Epilepsy,” published today in the Journal of Child Neurology, reported that 39 percent of the confirmed cases of autism from ATP donations also had a confirmed diagnosis of epilepsy, which is significantly higher than the estimated rate of epilepsy among the general autism population. The study also reported that data from the California State Department of Developmental Services demonstrated a higher-than-expected rate of mortality in individuals with both autism and epilepsy than in individuals with autism alone. These data are consistent with past reports. The paper concluded that when epilepsy and autism occurred together, the mortality rate increased by more than 800 percent.

“This study highlights the importance of early identification of epilepsy in children with autism and of autism in children with epilepsy,” said Roberto Tuchman, MD, pediatric neurologist at Miami Children’s Hospital and member of the Autism Speaks Scientific Advisory Council. “The findings of this study should motivate the autism and epilepsy communities to increase their understanding of the risk factors and common mechanisms that can lead to epilepsy, autism, or both epilepsy and autism. Understanding these early determinants will allow for the development of effective interventions and preventive measures and ultimately better outcomes for children with autism and epilepsy.”

It is well established that epilepsy is a major medical disorder that is co-morbid with autism in as many as 30 percent of children. As many as one in 20 children diagnosed with autism by age 3 could either already have epilepsy or develop epilepsy later in life. As noted by the ATP more than a decade ago, sudden unexplained death in epilepsy (SUDEP) has been identified as a cause of death in individuals with autism. Higher mortality rates than in the general population have been reported among individuals with autism; however, relatively little is known about the specific risk factors that account for this higher-than-expected rate of mortality.

“Sudden, unexpected or unexplained death in autism is often, but not always related to epilepsy and we need to use caution when interpreting these data,” explained Autism Speaks Vice President of Clinical Programs Clara Lajonchere, Ph.D. “These findings are important for understanding risk factors that may contribute to early death in individuals with autism and further underscore the need for more accurate and accessible records on cause of death in this population. Furthermore, state surveillance programs should implement better tracking mechanisms to help us better understand mortality for individuals with autism and co-occurring disorders such as epilepsy. Critical initiatives supported by Autism Speaks brain tissue program will help bring these issues to the fore and provide information our community needs to help prevent early death in persons with autism.”

Escherichia coli

Escherichia coli is a Gram-negative, rod-shaped bacterium that is commonly found in the lower intestine of warm-blooded organisms. Most strains are harmless; however, some, such as O157:H7, can cause food poisoning in humans and are often responsible for product recalls. The harmless strains are part of the normal flora of the gut and can benefit their hosts by producing vitamin K2.

The bacteria are not always confined to the intestine and can survive briefly outside the body. E. coli grows easily in the laboratory, and its genetics are simple and easily manipulated, making it an ideal model organism for study.

Theodor Escherich discovered E. coli in 1885. E. coli is Gram-negative, facultatively anaerobic and non-sporulating. Cells are typically rod-shaped, about 2 micrometers (μm) long and 0.5 μm in diameter, with a cell volume of 0.6–0.7 μm³. It can live on a wide variety of substrates and uses mixed-acid fermentation under anaerobic conditions, producing lactate, succinate, ethanol, acetate and carbon dioxide.

Optimal growth occurs at 37°C, but some laboratory strains can multiply at temperatures of up to 49°C. Growth can be driven by aerobic or anaerobic respiration. Strains that possess flagella are motile and can swim.

E. coli possesses the ability to transfer DNA via bacterial conjugation, transduction or transformation, which allows genetic material to spread horizontally through an existing population. This process is how the gene encoding Shiga toxin spread from Shigella to E. coli.

Only about 20% of the genome is common to all strains. From an evolutionary point of view, the members of the genus Shigella are actually E. coli strains “in disguise”. A strain is a sub-group within the species with unique characteristics that distinguish it from other E. coli strains. The differences are often detectable only at the molecular level; however, they may result in changes to the physiology or lifecycle of the bacterium.

Different strains are host-specific, making it possible to determine the source of fecal contamination in environmental samples. Knowing which E. coli strains are present helps researchers decide whether the contamination originated from a human, another mammal, or a bird. Strains of E. coli evolve through the natural biological process of mutation and through horizontal gene transfer.

Some strains can be harmful to a host animal. These strains typically cause a bout of diarrhea that is unpleasant in healthy adults but often lethal to children in the developing world. More virulent strains can cause severe illness or death in the elderly.

E. coli normally colonizes an infant’s gastrointestinal tract within 40 hours of birth, adhering to the mucus of the large intestine. As long as the bacteria do not acquire genetic elements that encode virulence factors, they remain benign commensals. Some strains produce lethal toxins, and food poisoning usually results from consumption of unwashed food or undercooked meat. If the bacteria escape the intestinal tract and enter the abdomen, they usually cause peritonitis, which can be fatal.

Cooking food properly, preventing cross-contamination, instituting barriers such as gloves, and pasteurizing juice or dairy products are all ways to disrupt fecal-oral transmission.

Microscopy shows Gram-negative rods with no particular cell arrangement. Diagnosis has typically been done by culturing on sorbitol-MacConkey medium and then using an antiserum. Other methods include ELISA tests, colony immunoblots, and direct immunofluorescence microscopy of filters.

Bacterial infections are usually treated with antibiotics, but the antibiotic sensitivities of different strains of E. coli vary widely. Some are resistant to many antibiotics that are effective against Gram-positive organisms. Amoxicillin and other semi-synthetic penicillins are often used to treat infections. Antibiotic resistance is a growing problem, due partly to the overuse of antibiotics in humans and partly to the use of antibiotics as growth promoters in food animals.

‘Disfluencies’ Help Toddlers Learn To Communicate

Cognitive scientists conducting a study at the University of Rochester’s Baby Lab suggest that parents who stumble and hesitate with words like “um” and “uh” when talking to their toddlers (hesitations known as disfluencies) are actually helping them learn language more efficiently.

The disfluencies signal to the toddler that something important is about to be said and that he or she should pay closer attention, the researchers found.

Toddlers have a lot of information they have to process while they are listening to an adult talk, including many new words they have never heard before, said Dr. Richard Aslin, a professor of brain and cognitive sciences at the University of Rochester, and co-author of the study.

If a child’s brain waits until a new word is spoken and then tries to figure out what it means afterward, it becomes a much more difficult task and the child may miss what is said next, according to Aslin.

“The more predictions a listener can make about what is being communicated, the more efficiently the listener can understand it,” he told The Telegraph.

The researchers studied three groups of children between 18 and 30 months old. Each child sat in front of a monitor with an eye-tracking device while on his or her parent’s lap. Two images appeared on the screen: one image of a familiar item and one made-up item with a made-up name.

The researchers found that when a recorded voice talked about the images in simple sentences and stumbled, saying “Look at the, uh…”, the child instinctively looked at the made-up image far more often than at the familiar one: about 70 percent of the time.

“We’re not advocating that parents add disfluencies to their speech, but I think it’s nice for them to know that using these verbal pauses is okay; the ‘uhs’ and ‘ums’ are informative,” said Celeste Kidd, a graduate student at the University of Rochester and lead author of the study.

She said the effect was only significant in children over the age of two.

The researchers believe that younger children have not yet learned that disfluencies tend to precede novel or unknown words.

When kids are between the ages of two and three, they are typically at the stage of development where they can construct rudimentary sentences of three or four words and have a vocabulary of a few hundred words.

An earlier study conducted by Jennifer Arnold, a scientist at the University of North Carolina and a former postdoctoral fellow at Rochester, found that adults can also use “ums” and “uhs” to their advantage in understanding language.

And work by Anne Fernald at Stanford University has shown that the quantity of speech a child is exposed to matters more for learning than the quality of the speech.

The current study, which was conducted by Kidd, Aslin, and Katherine White, a former postdoctoral fellow at Rochester who is now at the University of Waterloo, was published online this week in the journal Developmental Science.
