Amsterdam Protesters Gather For Legal Marijuana

Some 150 peaceful protesters gathered in Amsterdam on Saturday as part of a global action for the legalization of marijuana.

They congregated on a public square, listening to pro-legalization speeches as marijuana was smoked and music blasted from speakers around the crowd. T-shirts with pro-cannabis messages, as well as snacks made with marijuana, were sold at stands in the square.

“Prohibiting something that people will always want causes illegality and the emergence of criminal gangs,” Daan Rosenberg Polak, a publisher of pro-legalization literature, said to AFP at the meeting.

“In the Netherlands, we’ve had a good system since the 1970s, but recent governments have been trying to take us back to a more conservative system,” he said, insisting that reasonable use of marijuana poses no danger.

Saturday’s protesters condemned Dutch law as hypocritical for permitting the use of up to five grams of cannabis while outlawing its cultivation and mass retail. Some 700 Dutch coffee shops hold special licenses to sell marijuana but cannot keep more than 500 grams on their premises.

Several Dutch municipalities have recently announced plans to close all or some of their coffee shops, mainly to deter crime and drug tourism.

Rowena Huijbregts of the Association for Cannabis Consumers insists that full legalization is the only solution: “otherwise, people are forced to buy their soft drugs from dealers who also sell hard drugs.”

On Friday, organizers announced that protesters in 250 cities around the world, including Paris, Berlin and Madrid, would take part in the 10th annual Global Marijuana March.

The goal was to set in motion the full legalization of cannabis, from growth to the final sale.

“Prohibiting cannabis has undesirable effects: it promotes trafficking, criminality, a black market economy and a poor quality product,” contended Jacqueline Woerlee, a spokeswoman for the Association for the Abolition of Cannabis Prohibition.

Frogs Rescued From Deadly Fungus Ravaging Montserrat

Scientists are rescuing dozens of mountain chicken frogs, one of the world’s rarest amphibian species.

The frogs are being airlifted to safety from Montserrat in a final attempt to save the species from the deadly chytrid fungus, which is ravaging their shrinking habitat and threatening amphibians with extinction worldwide.

Montserrat is a tiny British Caribbean territory that is one of only two sites where the once-prevalent mountain chicken is found, but hundreds of the frogs have been killed in just the last few weeks by the fungus.

The 2-pound mountain chicken, named because locals claim its flesh tastes like chicken, is threatened by hunting and loss of habitat from a volatile volcano, and most recently by outbreaks of the chytrid fungus.

The volcano has erupted constantly since 1995 and more than half of the island’s 12,000 people have had to evacuate.

Ironically, the volcano might be the frog’s lifeline. Local officials plan to eventually reintroduce the frog to a region cut off by lava and ash that is inaccessible by foot and, hopefully, fungus-free.

Scientists estimate that there are only a few thousand surviving frogs, and the number is decreasing every day.

Chytrid fungus is a disease that infects the skin through which many amphibians drink and breathe. Chytridiomycosis causes lethargy and convulsions, and thickens the skin. It has spread quickly in recent decades, and some scientists believe the situation is becoming more severe as climate change is causing elevated temperatures.

“Its impact has been catastrophic,” Andrew Cunningham, senior scientist with the Zoological Society of London, said about the chytrid fungus. “The mountain chicken frog has been virtually wiped out.”

Gerardo Garcia, director of the herpetology department at the British-based Durrell Wildlife Conservation Trust, says that experts have found 300 dead frogs and have reason to believe that hundreds more have died since the fungus appeared in late February.

Scientists are treating some of the frogs with anti-fungal baths and flying dozens of others to zoos in Britain and Sweden, where they live in temperature-controlled rooms with automatic spray systems. About 50 have been flown off the island at a cost of $14,000.

“We’re in a situation where the species could become extinct forever,” Garcia said.

Andrew Terry, Durrell’s conservation manager, says that the frogs should ultimately be kept in their natural habitat, but flying them out was the only immediate solution available considering the fungus-free areas could not provide enough food.

The other stronghold of the species, nearby island Dominica, saw populations plummet beginning around 2002 as a result of the disease, which is believed to have spread to Montserrat late last year or earlier this year.

Natives on both islands used to love the frog’s meat, but it is mostly tourists that request it now, said director of Montserrat’s Department of Environment, Gerard Gray.

Experts are working hard to find a way to eliminate the fungus, which has killed various frog species from Asia to South America.

The Durrell trust has housed its rescued frogs in a bio-secure unit at its wildlife park in Jersey and is hoping to breed the frogs to create a population that can be reintroduced to their natural habitat in as little as two years.

ZSL London Zoo will now protect mountain chicken frogs from both Dominica and Montserrat in its captive breeding unit which includes temperature controlled rooms, automated spray systems, and areas for growing live food.

Bio-security measures include full paper suits, masks and gloves worn by keepers, to make sure that no pathogens, such as the fungus, can enter.

The large frogs’ croak is described as sounding like a small howling dog.

Gray recounts a night he heard a mountain chicken frog croaking so loudly that he thought it was under his bed.

“My wife laughed at me,” he recalled. “It was in the forest where it was supposed to be.”


Tiny Beetle Threatens Florida’s Avocado Industry

Florida’s thriving avocado industry could be in danger due to the arrival of the redbay ambrosia beetle.

Scientists say the little beetle (Xyleborus glabratus) spreads a fungus that causes laurel wilt disease, which kills avocado trees. The beetle’s arrival could spell danger for Florida, the second-largest source of avocados in the US.

The adult female redbay ambrosia beetle carries the spores of the fungus that causes laurel wilt (Raffaelea lauricola) in a special pouch in its mouth called a mycangium.

University of Florida plant pathologist Randy Ploetz told the Associated Press scientists are monitoring the beetle’s journey throughout the southeastern US. So far, they have only reported infestations in the avocado trees of homeowners, not in commercial farms in South Florida.

The beetle has been discovered in Okeechobee County, which is not too far from Miami-Dade County, where Florida’s largest concentration of commercial avocado trees is located.

“It could wipe out the entire industry,” University of Florida agricultural economist Gilly Evans told the AP.

The beetle, believed to be native to Asia, was brought into the US through a shipping port in Georgia. It naturally expands its range about 20 to 30 miles each year, but can also be transported when its host tree is used for firewood.

“When you’re moving wood around and you have some redbay growing out there and you have some backyard avocados growing as well, you may be providing an easy pathway for that pest to move south,” said Frank Koch, a researcher with North Carolina State University who has projected that the beetle and the disease it carries will hit South Florida around 2020 or sooner.

The avocado industry is a source of about $30 million in profits each year. Evans said a beetle infestation could result in a massive blow to the economy in terms of lost earnings and lost jobs.

He said about $250,000 has already been spent in order to find a treatment that could halt the deadly disease being spread by the beetle, but pesticides or fungicides could be unsafe and too expensive for farmers.

Craig Wheeling, CEO of Brooks Tropicals in Miami-Dade County, told the AP that the beetle issue is reminiscent of a citrus canker scare among orange and lime growers in the early 2000s.

“Having gone through that mess in the early 2000s, we’re very concerned when we see the redbay ambrosia beetle coming down,” he said.

The citrus canker disease resulted in the loss of millions of dollars for citrus farmers.

Most ambrosia beetles target trees that are stressed, dying, or dead, but the redbay ambrosia beetle is known to attack healthy trees.

“Trying to predict what this thing is going to do has been difficult, because this is actually a brand new disease,” Ploetz said. “We knew nothing about this before it showed up.”

Scientists say the beetle’s range currently stretches from southern Delaware to coastal Virginia to eastern Texas and includes coastal areas of North Carolina and South Carolina, Georgia, southern Alabama, Mississippi, parts of Louisiana, and all of Florida.


Research Finds Homicidal Poisoning Rising, More Likely In Infants And Elderly

Homicidal poisonings are rare but on the rise, and infants are the most common victims, according to a new University of Georgia study that aims to raise awareness of this often overlooked crime.

Greene Shepherd, clinical professor in the UGA College of Pharmacy, and recent graduate Brian Ferslew examined seven years of recent federal mortality data and identified 523 deaths due to homicidal poisoning, a figure that corresponds to a rate of 0.26 poisonings per million people per year. The study found that although poisonings account for less than one percent of all homicides, they appear to be on the rise. The study documented a low of 0.20 cases per million in 2000 and a high of 0.35 in 2004. In 2005, the last year for which data is available, the rate was 0.3 per million people.
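The reported figure can be sanity-checked with a quick back-of-envelope calculation; the 523 deaths and the seven-year study window come from the study, while the average U.S. population of roughly 290 million over 1999 to 2005 is an assumption used here for illustration:

```python
# Back-of-envelope check of the reported homicidal-poisoning rate.
# The 523 deaths over a seven-year window (1999-2005) come from the study;
# the ~290 million average U.S. population is an assumed round figure.
deaths = 523
years = 7
population_millions = 290.0

rate_per_million_per_year = deaths / years / population_millions
print(round(rate_per_million_per_year, 2))  # 0.26
```

Dividing the yearly average of roughly 75 deaths by the population in millions reproduces the 0.26-per-million rate the study reports.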

“Homicidal poisoning is rare relative to a lot of other causes of death, but the numbers are trending higher,” said Shepherd, whose results appear in the May issue of the journal Clinical Toxicology. “We may never know the true incidence because some cases undoubtedly evade detection and classification.”

Shepherd is a former poison control center director who had heard several anecdotal accounts of homicidal poisonings but found very little data on their incidence. Because such data is a critical starting point for efforts to reduce the risk of poisonings, he and Ferslew began combing through data compiled by the National Center for Health Statistics.

While books and television dramas often portray homicidal poisoning as a premeditated crime committed against adults, the researchers found that infants are the most common victims. Children less than one year old are approximately nine times more likely to be victims than the general population, the study found. Shepherd said that rather than being premeditated acts, the majority of these poisonings are likely negligent homicides committed by parents or caretakers.

“Anyone who has been a new parent knows about the long hours and the stress of an inconsolable child,” Shepherd says. “In some cases people make bad decisions and try to sedate their children with medication or alcohol. I think there’s a role for pharmacists and other health care workers to better educate new parents about the inappropriateness of sedating newborns.”

Further analysis by race found that African-American infants, who are 21 times more likely to be victims than the general population, are most at risk. Shepherd said this increased risk among African Americans is “a tragic result of socioeconomic status,” as stressful situations and poor coping skills are more common in young parents lacking family support and economic stability.

The study found that older adults also had a significantly higher rate of poisoning than the general population. Older adults who require institutional or home care are particularly susceptible to abuse, Shepherd said, and are at risk of being administered excessive doses of sedatives or other medications.

The study also found that drugs, medications and other biological substances accounted for 65 percent of the poisonings during the 1999 to 2005 study period, while assault by gases and vapors accounted for 28 percent of poisonings. The remaining seven percent of assaults involved other chemicals, corrosive substances or pesticides.

“Though rare, these crimes do happen,” Shepherd said. “Now that we’ve identified at-risk populations, we have the potential to raise awareness and possibly save lives.”


Citizens Frustrated Over Exaggerated Flu-Hype

As the swine-flu fears begin to subside, a number of people have begun questioning whether governments and World Health Organization officials didn’t go too far in hyping the threat that the virus posed.

Even as many have begun to question the organization’s judgment, WHO officials sounded another warning on Thursday that up to 2 billion people could catch the hitherto mild flu if the outbreak turns into a global epidemic.

Many have begun blaming such dramatic alarms for the wearisome media coverage and over-reactive measures of a number of governments that have led to massive disruptions in the lives of private citizens.

Schools have shut down, leaving healthy kids at home losing precious classroom time.  Countless businesses were forced to close, costing private entrepreneurs and small business millions in revenue.  Mexico’s forecasted tourism revenues have plummeted for the coming summer and pork producers worldwide are already reporting falling demand for their products.  Economists even say that the scare may have snipped early buds of recovery in an ailing global economy.

“I don’t know anyone who has it.  I haven’t met anyone who knows anyone who contracted it,” said Carl Shepherd, a resident of Chicago, Illinois, reportedly the worst-hit state in the U.S.  “It’s really frightening more people than it should have.  It’s like crying wolf.”

It is now some two weeks since initial reports about the virus broke, and there have been a total of 46 deaths, 44 of which were in Mexico.  Twenty-four countries have reported a total of more than 2,300 confirmed infections, only a small fraction of which required hospitalization.

Infection statistics are far lower than health officials initially predicted and pale in comparison to the number of illnesses and deaths associated with annual seasonal strains of flu. Worldwide deaths associated with seasonal flu strains typically range from the hundreds of thousands to a million each year.

“It’s been totally overblown,” said Miranda Smith, a former student at Cisco Junior College in central Texas whose graduation ceremony was canceled on account of the swine flu scare.

“Everyone seems to know it’s not going to kill you and it’s not as deadly as they think,” she added.  “Everybody needs to just calm down and chill out.”

Craig Heyl of Decatur, Georgia shared Smith’s sentiments.  “Swine flu is just another strain of flu.  People get the flu.  I guess you have to call it a pandemic when it’s a widespread virus, but I don’t think the severity of it is all that concerning.”

Public health authorities have slowly begun admitting that their worst fears about the virus did not materialize, but continue to maintain that the virus still has the potential to take a turn for the worse.

“People are taking a sigh of relief too soon,” said the Centers for Disease Control and Prevention’s acting director, Dr. Richard Besser.

“The measures we’ve been talking about, the importance of hand-washing, the importance of covering coughs, the real responsibility for staying home when you’re sick… I’m afraid people are going to say, ‘Ah, we’ve dodged the bullet. We don’t need to do that.’”

Concern about the deadly potential of the swine flu is generally shared by the leaders of most health organizations.  Elsewhere however, in chat-rooms, blogs and op-ed pieces, opinions that officials exaggerated the threat are rampant.

“Adults are acting like a bunch of crybabies in a B-rated science fiction germ-outbreak movie, wringing their hands, whining about what to do next,” read an opinion letter in the Dallas Morning News on Wednesday.

In Lake Oswego, Oregon, one reader wrote her local paper in frustration, “Is the daily front page body count really necessary?  In reading the entire content of the collected articles one learns that the H1N1 strain is not likely to be more lethal than its predecessors.  Give it a rest!”

On May 5th, during the peak of the flu scare, a USA Today/Gallup poll showed that only 25 percent of Americans said they were worried about getting the virus.

Dr. Robert Daum, an infectious disease expert at the University of Chicago, says he sees both sides of the issue.

“I think it was right to place everyone on high alert, and now right” to start letting things calm down, said Daum.


FDA Requires Black Box Warning For Testosterone Gel

Food and Drug Administration officials are warning parents to keep prescription testosterone gel far from children due to its serious side effects.

Testosterone can be good in some instances for adults, but the hormone can cause problems in children such as enlargement of the genital organs, aggressive behavior, early aging of the bones, premature growth of pubic hair, and increased sexual drive.

The agency says both boys and girls are at risk, and has ordered its strongest warning on the products. The so-called “black box” warnings will be included on the labels for Solvay’s AndroGel and Auxilium Pharmaceuticals Inc’s Testim.

The problems can arise from simple situations such as adults not washing their hands well.

Testosterone gel is usually applied to the upper arms or shoulders so adults must cover up to keep kids from accidentally touching a spot that has the medicine on it.

Men use testosterone gel when their bodies no longer make the sex hormone, or make very low levels of it. Doctors also prescribe it to women to increase sexual drive, although the FDA has not approved that use.

In 2007, U.S. pharmacies dispensed about 1.8 million prescriptions for testosterone gel.

The leading brand, AndroGel, accounted for about three-fourths of the sales.

The required label changes will provide additional information about the risk of secondary exposure and the steps that should be taken to reduce the risk, the FDA said.

“These drugs are approved for an important medical need, but can have serious unintended side effects if not used properly,” Dr. Janet Woodcock, director of the FDA’s drug division, said in a statement. “We must ensure that the adults using them are well-informed about the precautions needed to protect children.”

Since the beginning of December, the agency has received reports of eight cases in which children were accidentally exposed to testosterone gels. Because only a small fraction of drug problems are ever reported to the FDA, there could be many more.

The kids ranged in age from nine months to five years. Health officials said overall the symptoms went away once testosterone gel was identified as the cause of the problem.

However, for some kids enlarged sex organs did not return to their appropriate size, and bone age remained somewhat higher than the child’s chronological age.

The agency reported that one child even underwent surgery because the link to testosterone gel was not recognized right away.

Health officials recommend that adults who use testosterone gel wash their hands with warm soap and water after each use and cover their skin after the gel has dried.

Pregnant women, and those who may become pregnant, should avoid any exposure, since it could lead to birth defects.


Overweight Young Men Less Likely To Get Married

A researcher at an international conference on obesity in Amsterdam said men who were grossly overweight at the age of 18 were nearly 50 percent less likely to be married by their 30s and 40s, the AFP reported.

The European Association for the Study of Obesity hosted the four-day gathering on Thursday.

The results could suggest that women rank a man’s appearance higher than other traits when choosing a partner, as the data held true regardless of the men’s intellectual performance or socio-economic position.

Researcher Malin Kark of the Swedish Karolinska Institute medical university told AFP that could be one explanation.

Her study was conducted among more than 500,000 Swedish men born between 1951 and 1961. It found that men who had been obese at 18 were 46 percent less likely to be married in 1991, when they were aged between 30 and 40, than men with no weight problem, and 45 percent less likely by 2004.

The chances of marriage were somewhat higher for men who were overweight but not obese at 18, the study found, though still 10 percent lower than for men of normal weight in their 30s and nine percent lower in their 40s.

Kark believes the data shows that there is a stigmatization of obese young men that continues into adulthood and it appears to be evident in their working life as well as in inter-personal relationships.

She added that other studies have found that obese adolescents were likely to become obese adults.

No information was available on the men’s adult weight.

For the purposes of the study, a person with a body mass index (the weight in kilograms divided by the square of the height in meters) of more than 30 was considered obese.
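The study's threshold can be sketched as a small helper; the BMI formula and the cutoff of 30 come from the article, while the function names and sample figures below are purely illustrative:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by the square of the height in meters."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    """The study's cutoff: a BMI of more than 30 counts as obese."""
    return bmi(weight_kg, height_m) > 30

# Illustrative figures: 95 kg at 1.75 m gives a BMI of about 31, over the cutoff.
print(round(bmi(95, 1.75), 1))  # 31.0
print(is_obese(95, 1.75))       # True
```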

Around 1.6 billion adults in 2005 were overweight, of which at least 400 million were obese, according to estimates from The World Health Organization.


Rotator Cuff Tears May Run In Families

People with relatives who have had rotator cuff tears are at greater risk for similar injuries, U.S. researchers say.

The study, published in The Journal of Bone and Joint Surgery, finds that the increased risk of rotator cuff tears among family members of patients extends beyond third-cousin relationships — the great-great-grandchildren of one’s great-great-grandparents.

“Rotator cuff healing is often incomplete, and identifying a possible genetic link to the disease may provide targets for biologic treatments to improve healing rates,” Dr. Robert Tashjian, the study leader, of the University of Utah School of Medicine Orthopaedic Center in Salt Lake City, said in a statement.

“While we have not determined the exact genetic component, our family history data supports that heredity plays a role in the development of rotator cuff tearing,” he said.

Researchers used data from the Utah Population Database combined with the University of Utah Health Sciences Data Warehouse.

Rotator cuff tears — a shoulder injury usually found in people in their 50s and 60s — are believed to have both mechanical and environmental causes, but scientists are unclear why they occur.

The potential impact of this research, says Tashjian, is that knowing about a family history may alert patients to take precautionary measures to protect against injury.

50 Percent Rise In ‘Zombie’ Computers

Anti-virus software and computer protection company McAfee found that millions of computers have been taken over by cyber criminals since January.

The report found a 50% increase in the number of detected so-called “zombie” computers since 2008. The real number is likely much higher than McAfee alone can detect.

A zombie computer is a computer with Internet access that has been compromised by a hacker, a computer virus, or a trojan horse. A compromised machine is usually only one of many in a botnet (a collection of software robots), and will be used to perform malicious tasks under remote direction. The majority of zombie computer owners are oblivious to the fact that their system is being undermined in this way.

These figures, cited in a report from Deloitte, indicate that a global response to cyber security is desperately needed.

“Doing nothing is not an option,” said Mr. Pellegrino, global public sector industry leader at Deloitte Touche Tohmatsu (DTT).

He warned everything that relies on cyberspace is encountering unprecedented risks.

Pellegrino explains, “This issue is moving so quickly, and with so much at stake economically and in terms of safety and security for people, we don’t have 100 years to figure this out.”

McAfee also reported that the United States now has the world’s largest percentage of infected computers at 18%, with China not far behind at just over 13%.

Jeff Green, senior vice president of McAfee, says, “The massive expansion of these bot-nets provides cyber criminals with the infrastructure they need to flood the web with malware.”

“Essentially, this is cyber crime enablement.”

The DTT findings reveal a growing awareness of the importance of the Internet in so many varying aspects of our lives such as security, commerce, transportation and communication.

“We are seeing this change from protecting the Internet to a conversation about how we succeed and prosper in cyberspace,” Mr. Pellegrino noted.

“Security spending is growing at a rate never seen before while the threat environment is growing at a pace of 40% a year.”

Pellegrino continued to say, “In terms of volume and severity of incidents, the math doesn’t work and we have to come up with a different approach that requires public and private sectors working together.”

Co-author Gary McAlum, a retired US Air Force colonel and senior manager of security and privacy services at Deloitte, added, “We are talking about daily living.”

“There is a lot of discussion about the economy and the military and the public and private sector, but we have now reached a sense of urgency about the interconnectedness of all these areas.”

That view was reaffirmed by a member of the US military top brass who recently testified before a branch of the House Armed Services Committee.

“Our economy, the nation’s critical infrastructure, and many of our military operations depend on unfettered access to cyberspace,” contributed Lieutenant General Keith Alexander, the director of the National Security Agency (NSA) who heads the Pentagon’s new Cyber Command.

“Maintaining freedom of action in cyberspace in the 21st century is as inherent to US interests as freedom of the seas was in the 19th century, and access to air and space in the 20th century.”

He is urging the U.S. to reorganize its offensive and defensive cyber operations by creating some kind of digital warfare force.

The Deloitte study involved many interviews conducted with government officials and industry experts from around the world, which revealed that this issue is gaining global prominence.

Pellegrino said his group was very pleased to see that concern, awareness and recognition of the need for leadership were ubiquitous among the various nations.

U.S. President Obama has made the issue of cyber-security a top priority. Shortly after taking office, he ordered a 60-day review, which has now been delivered to his desk.

Though the review has been delayed in its release due to the swine flu crisis, it is expected that the president will announce whom he will choose to lead the cyber-security efforts when the report is made public.

According to the Deloitte research, the United Kingdom is in the process of writing a national cyber strategy with an emphasis on public-private partnership.

On the continent, the European Commission has urged member states to work together on cyber-security measures while Latin America reportedly has a “diversity of approaches.”

Canada has now completed its cyber-security review and will be implementing the National Cyber Security Strategy as well as creating a new Directorate of Cyber Security with a mandate to engage closely with the private sector this year.

Even with all these initial efforts, the Deloitte authors caution that the issue is urgent and that time is running out.

“Not only do we have to take action, we don’t have enough time,” warned Mr. Pellegrino.

Mr. McAlum agreed, adding that devising a clear strategy and acting swiftly is crucial.

“We need to get our house in order first so that we can interact with the rest of the world with one voice, with clear roles and responsibilities aligned.”


Flu Pandemic In Prison – Model For Public Health Preparedness

When pandemics occur, correctional facilities are not immune. With more than 9 million people incarcerated across the globe, including 2.25 million in U.S. jails and prisons alone, it is vital that correctional officials and health professionals be prepared for a worst-case scenario that involves pandemic influenza reaching inmates and staff.

With collaborative planning and training, prison and public health officials can help control influenza outbreaks behind bars, according to an article in the April issue of the Journal of Correctional Health Care (published by SAGE).

A two-day conference on prison pandemic preparedness held in Georgia in 2007 could serve as a model for such training. Administrators, medical doctors, registered nurses, physician assistants, and pharmacists were among the participants, as well as state and local public health officials.

The objectives were to educate participants about pandemic flu issues in prison settings, provide impetus for initial planning in Georgia’s prisons, and elicit ideas about how the prisons could best prepare for and respond to pandemic flu. Topics included nonpharmaceutical interventions, health care surge capacity, and prison-community interfaces.

Effective training about pandemic influenza requires more than just classroom lectures or checklists, the authors write. The conference employed interactive methods and educational games that recent studies have found effective in training “adult learners.” Experiential learning closely resembles the way adults learn on the job and offers a more hands-on approach compared to traditional didactic, classroom-based learning.

The training techniques appeared to be very effective: average scores rose from 42% correct on a pretest to 69% correct on a test given after the training.

Just as important, the conference served to forge new partnerships among correctional health and public health officials responsible for pandemic planning.

The article, “How Public Health and Prisons Can Partner for Pandemic Influenza Preparedness: A Report From Georgia,” appears in the April issue of the Journal of Correctional Health Care.


Estrogen Controls How the Brain Processes Sound

Estrogen Found to Work Within Neurons to Facilitate Hearing and Memory

Scientists at the University of Rochester have discovered that the hormone estrogen plays a pivotal role in how the brain processes sounds.

The findings, published in yesterday’s issue of The Journal of Neuroscience, show for the first time that a sex hormone can directly affect auditory function, and point toward the possibility that estrogen controls other types of sensory processing as well. Understanding how estrogen changes the brain’s response to sound, say the authors, might open the door to new ways of treating hearing deficiencies.

“We’ve discovered estrogen doing something totally unexpected,” says Raphael Pinaud, assistant professor of brain and cognitive sciences at the University of Rochester and lead author of the study. “We show that estrogen plays a central role in how the brain extracts and interprets auditory information. It does this on a scale of milliseconds in neurons, as opposed to days, months or even years in which estrogen is more commonly known to affect an organism.”

Previous studies have hinted at a connection between estrogen and hearing in women who have low estrogen, such as often occurs after menopause, says Pinaud. No one understood, however, that estrogen was playing such a direct role in determining auditory functions in the brain, he says. “Now it is clear that estrogen is a key molecule carrying brain signals, and that the right balance of hormone levels in men and women is important for reasons beyond its role as a sex hormone,” says Pinaud.

Pinaud, along with Liisa Tremere, a research assistant professor of brain and cognitive sciences, and Jin Jeong, a postdoctoral fellow in Pinaud’s laboratory, demonstrated that increasing estrogen levels in brain regions that process auditory information caused heightened sensitivity of sound-processing neurons, which encoded more complex and subtle features of the sound stimulus. Perhaps more surprising, says Pinaud, is that by blocking either the actions of estrogen directly, or preventing brain cells from producing estrogen within auditory centers, the signaling that is necessary for the brain to process sounds essentially shuts down. Pinaud’s team also shows that estrogen is required to activate genes that instruct the brain to lay down memories of those sounds.

“It turns out that estrogen plays a dual role,” says Pinaud. “It modulates the gain of auditory neurons instantaneously, and it initiates cellular processes that activate genes that are involved in learning and memory formation.”

Pinaud and his group stumbled upon these findings while investigating how estrogen may help change neuronal circuits to form memories of familiar songs in a type of bird typically used to understand the biology of vocal communication. “Based on our findings we must now see estrogen as a central regulator of hearing,” he says. “It both determines how carefully a sound must be processed, and activates intracellular processes that occur deep within the cell to form memories of sound experiences.”

Pinaud and his team will continue their work investigating how neurons adapt their functionality when encountering new sensory information and how these changes may ultimately enable the formation of memories. They also will continue exploring the specific mechanisms by which estrogen might impact these processes.

“While we are currently conducting further experiments to confirm it, we believe that our findings extrapolate to other sensory systems and vertebrate species,” says Pinaud. “If this is the case, we are on the way to showing that estrogen is a key molecule for processing information from all the senses.”

International Year of Astronomy Educator Conference

May 30-31, 2009 at the Jet Propulsion Laboratory, Pasadena, CA

The Hubble Space Telescope was launched on a 15-year mission to explore the universe. Now, just past its 19th birthday, it is getting a new lease on life.  Space Shuttle mission STS-125 (scheduled for launch on May 11) is slated to replace and repair science instruments, computers, batteries, gyroscopes and blankets.  This, the last Hubble servicing mission, should allow Hubble to operate as a fully functional, enhanced astronomical observatory for many more years.

At JPL we are taking this opportunity to revisit the Hubble mission and the work of JPL’s Wide Field and Planetary Camera 2, Hubble’s workhorse science instrument.  The camera has taken most of the revolutionary images attributed to Hubble.  We’ll recap the Shuttle mission activities as well.

2009 is also the International Year of Astronomy (IYA).  This event is a global celebration of astronomy and its contributions to society and culture and marks the 400th anniversary of the first use of an astronomical telescope by Galileo Galilei.  The aim of the Year is to stimulate interest, especially among young people, in astronomy and science under the central theme “The Universe, Yours to Discover.” We will discuss IYA2009 events and activities and ways to promote a greater appreciation of the inspirational aspects of astronomy.

Who: All educators (including museum staff) and students in high school and above who are interested in Earth and space science and exploration.  The conference content is generally non-technical but does include some detailed scientific and engineering content.  The objective of the conference is to tell the exciting tale of real-life exploration and new discovery in a way that will excite and inspire students.  Students under 18 years of age must be accompanied by a registered adult.

When: All day Saturday, May 30, and the morning of Sunday, May 31, 2009.  Check-in begins at 7:45 a.m.  On Saturday the conference will conclude by 5:00 p.m.  On Sunday the conference will end at noon, for a total of 12 hours of professional development time.

Where: The Jet Propulsion Laboratory’s von Kármán Auditorium.  JPL is located in the foothills of the San Gabriel Mountains north of the Rose Bowl.  For directions please visit http://www.jpl.nasa.gov/about_JPL/maps.cfm.  Pre-registration is required; walk-up registration will not be possible for this conference.  Note that you will need to show a photo ID at JPL’s security checkpoint upon arrival each day.

How: To register for this conference please send a check postmarked by Friday, May 22, 2009, for $40.00 payable to “Jet Propulsion Laboratory” to:

Hubble Educator Conference
Attn: Mary Kay Kuehn
Jet Propulsion Laboratory
M/S 180-109
4800 Oak Grove Drive
Pasadena CA 91109

Please register by Friday, May 22, 2009.  The $40 registration fee includes continental breakfast and breaks both days and a box lunch on Saturday.  For registration questions please call the JPL Education Office at 818-393-0561.

For updates and information visit the JPL Education Gateway at http://education.jpl.nasa.gov/.

Name________________________________________
Title_________________________________________
Organization/School_________________________________________________
Address_______________________________________ State____ Zip________
Grade(s) Taught/Enrolled_____________________________________________
Subject(s) Taught/Enrolled____________________________________________
Contact info for confirmation & last minute changes:
E-mail: ________________________________
Phone: ________________________________

$40 Registration Fee Enclosed? Check # ____________

Wine Consumption Lengthens Life Expectancy

New research shows that very light wine consumption over the long term appears to lead to a longer life.

As part of the Zutphen Study, a group of randomly selected Dutch men was repeatedly monitored between 1960 and 2000, and long-term light wine consumption was associated with an increase in life expectancy of nearly five years.

Dr. Martinette T. Streppel from Wageningen University, the Netherlands, and colleagues report in the Journal of Epidemiology and Community Health that the benefit was independent of total alcohol consumption.

During the 40-year monitoring period, 1,130 of the 1,373 men died, according to the report.  The average age at death was 77 years.

The findings are consistent with past studies showing that long-term light alcohol intake is significantly associated with lower mortality risk.  Life expectancy of men with a long-term alcohol intake of up to 20 grams a day was 2.3 years longer than that of non-drinkers.

Drinking more than that decreased the benefit: men who consumed over 20 grams of alcohol a day had a life expectancy that was 1.9 years longer than that of the non-drinkers.

A long-term intake of 2 grams of wine per day increased life expectancy by 2.5 years compared to beer and spirit drinkers, and by 4.7 years compared to nondrinkers.

Seventy percent of wine consumed in the study was red wine.

When the analysis factored in socioeconomic status, dietary factors and other lifestyle habits, the link between wine and longer lifespan remained the same.

———-

On The Net:

Wageningen University

JECH

Scientists Say Human Organs Could Be Harvested From Sheep

A Japanese scientist claims that humans may be able to harvest transplant organs from sheep within a decade.

Professor Yutaka Hanazono told London’s Times Online that he has successfully modified a sheep to develop a spare pancreas hidden in its underbelly.

Although the only viable recipient of the spare pancreas would be a diabetic chimpanzee, Professor Hanazono said the work represents an opportunity to grow harvestable human organs in sheep, which could end the ethical debate over the use of human stem cells to create organs.

The pancreas currently growing inside the sheep was created using stem cells from monkeys, Hanazono said, but he believes that technology could one day be used to make sheep into living human organ banks.

Hanazono, who works at Jichi Medical University, foresees a future in which human livers, hearts, pancreases and skin can be grown in sheep. He says it could happen within the next decade or two.

“We have made some very big advances here. There has historically been work on the potential of sheep as producers of human blood, but we are only slowly coming closer to the point where we can harvest sheep for human organs,” Professor Hanazono told The Times.

“We have shown that in vivo (in a living animal) creation of organs is more efficient than making them in vitro (in a test tube) but now we really need to hurry.”

Hanazono told The Times that Japanese law defines death as the point when the heart permanently stops. The concept of brain death, the phase at which organs can most effectively be harvested from donors, does exist, but organs cannot be extracted at that point.

As a result, the organ donor system in Japan is virtually nonexistent, said Hanazono. In Japan the rate of organ donors per million people is less than 0.8, compared to about 27 in the US.

“To avert disaster, say doctors, Japan either needs the science of synthetic organ generation to advance faster than seems possible, or it needs a complete rethink on the Japanese definition of death,” according to the Times.

————

On The Net:

London’s Times Online

Jichi Medical University

Court Case Continues Between Hollywood And DVD Copying Software

A San Francisco court could decide this week if DVD users can make personal backups the way people do with audio, as the six big film studios are claiming that a program called RealDVD violates copyright, BBC News reported.

Bill Hankes of RealDVD said the consumer should have the same fair use rights to copy DVDs just as they have for the last decade with music.

The RealNetworks-produced digital copying program allows DVD owners to create digital copies of their discs for their own personal use without having to pay extra fees.

However, some studios let users make a digital copy of a movie onto a computer by paying more for an “expanded edition” of a DVD, creating the argument that the consumer is being made to pay twice for the same movie.

“For 11 years, since the Digital Millennium Copyright Act (DMCA) made it illegal to bypass any digital rights management protection system, the movie and music industries have fought a war ostensibly against piracy,” said Kevin Hunt, who writes the Electronic Jungle column for the Baltimore Sun.
 
“In reality, it has been a war against the consumer, designed to make people pay more than once for the same song or album or movie,” he added.

But the Motion Picture Association of America (MPAA), which represents the movie studios, claims that RealDVD is illegal under the DMCA, saying the software bypasses the copy protection built into DVDs, meaning users could copy a DVD and share it around.

The studios have described the product as “Steal DVD.” Most fear the technology would enable people to “rent, rip, and return” DVDs without ever having to purchase them from retail stores.

Greg Goeckner, executive vice president and general counsel, MPAA told the BBC in an e-mail statement: “Our objective is to get the illegal choices out of the marketplace and instead focus constructively with the technology community on bringing in more innovative and flexible legal options for consumers to enjoy movies.”

RealNetworks, which makes RealDVD, says the company has in fact enhanced the security of the product.

RealDVD spokesman Hankes said the company had added an extra layer of security encryption to ensure that piracy is not a possibility, and that a digital version of a movie made using RealDVD can only be played on the computer that made the copy.

Hankes, however, said he had not been surprised by Hollywood’s reaction to the product, adding that there has long been tension between Silicon Valley and Hollywood and that this is another example of it.

He said it is not uncommon for content owners to be initially concerned about the manner in which their content will be treated by new technology.

“That is why we went to talk to the studios before we released the product,” he said.

A survey conducted by The National Consumers League, a 100-year-old consumer watchdog group, showed consumers want choice.

Executive director Sally Greenberg said the entertainment industry would be wise to pay attention to the attitudes and purchase desire of the typical American consumer, who, according to the survey, is very interested in being able to back up his or her collection.

But despite the current court case, there are still a number of illegal ways to do what RealDVD does.

Hankes said consumer behavior is going to continue regardless of what happens in this court case, and the real question is whether Hollywood and the technology industry can get out in front of it so that consumers adopt legitimate behavior.

“Hollywood says that without encryption, the DVD market would collapse. I say, the pirates have already won, the software to copy is free and you’re still selling DVDs,” said Fred von Lohmann, a senior lawyer with the Electronic Frontier Foundation.

Judge Marilyn Patel, who presided over the Napster case and eventually shut down the original peer-to-peer music file-sharing service, is hearing the case in the U.S. District Court for the Northern District of California.

The hearing is expected to end this week with closing arguments this Friday or the following week.

Judge Patel is expected to deliver her decision in a written ruling in the coming weeks.

Self-cleaning Materials, Water-Striding Robots

Self-cleaning walls, counter tops, fabrics, even micro-robots that can walk on water — all those things and more could be closer to reality because of research recently completed by scientists at the University of Nebraska-Lincoln and at Japan’s RIKEN Institute.

Humans have marveled for millennia at how water beads up and rolls off flowers, caterpillars and some insects, and how insects like water striders are able to walk effortlessly on water. It’s a property called super hydrophobia and it’s been examined seriously by scientists since at least the 1930s.

“A lot of people study this and engineers especially like the water strider because it can walk on water,” said Xiao Cheng Zeng, Ameritas university professor of chemistry at UNL. “Their legs are super hydrophobic and each leg can hold about 15 times their weight. ‘Hydrophobic’ means water really doesn’t like their legs and that’s what keeps them on top. A lot of scientists and engineers want to develop surfaces that mimic this from nature.”

In a paper to be published in the May 4-8 online edition of the Proceedings of the National Academy of Sciences, Zeng and his Japanese colleagues, Takahiro Koishi of the University of Fukui and RIKEN, Kenji Yasuoka of Keio University, and Shigenori Fujikawa and Toshikazu Ebisuzaki of RIKEN, give engineers and materials scientists important clues in how to develop the long-sought super hydrophobic materials.

In nature, organisms like caterpillars, water striders and the lotus achieve super hydrophobia through a two-level structure — a hydrophobic waxy surface made super hydrophobic by the addition of microscopic hair-like structures that may be covered by even smaller hairs, greatly increasing the surface area of the organism and making it impossible for water droplets to stick.

Using the superfast supercomputer at RIKEN (the fastest in the world when the research started in 2005), the team designed a computer simulation to perform tens of thousands of experiments that studied how surfaces behaved under many different conditions. Zeng and his colleagues used the RIKEN computer to “rain” virtual water droplets of different sizes and at different speeds on surfaces that had pillars of various heights and widths, and with different amounts of space between the pillars.

They learned there is a critical pillar height, depending on the particular structure of the pillars and their chemical properties, beyond which water droplets cannot penetrate. If the droplet can penetrate the pillar structure and reach the waxy surface, it is in the merely hydrophobic Wenzel state (named for Robert Wenzel, who found the phenomenon in nature in 1936). If the droplet cannot penetrate the pillars to touch the surface, the structure is in the super hydrophobic Cassie state (named for A.B.D. Cassie, who discovered it in 1942), and the droplet rolls away.

“This kind of simulation — we call it ‘computer-aided surface design’ — can really help engineers in designing a better nanostructured surface,” Zeng said. “In the Cassie state, the water droplet stays on top and it can carry dirt away. In the Wenzel state, it’s sort of stuck on the surface and lacks self-cleaning functionality. When you build a nanomachine — a nanorobot — in the future, you will want to build it so it can self-clean.”

Zeng said there were three main advantages to performing the experiments on a computer rather than in a laboratory. First, they were able to conduct thousands more repetitions than would have been possible in a lab. Second, they didn’t have to worry about variables such as dirt, temperature and air flow. Third, they could control the size of droplets down to the exact number of molecules, whereas in a laboratory experiment the droplets would unavoidably vary by tens of thousands of molecules.

The idea for the experiment came about in 2005 when Zeng visited RIKEN during his year as a fellow of the John Simon Guggenheim Foundation, which paid for the start-up of the project. Koishi spent the spring of 2005 with Zeng at UNL as they designed the project in detail. Yasuoka and his family spent the 2006-07 academic year in Lincoln during a one-year sabbatical with Zeng, in part because of this project.

“We wanted to design a grand-challenge project so we could take advantage of the RIKEN super computer,” Zeng said. “We thought this was an interesting project and we need a very, very fast computer to deal with it. I also have to acknowledge the Nebraska Research Initiative, the Department of Energy and the National Science Foundation. The NRI is great because it allows me to do highly risky research, to develop this kind of challenging project.”

Image 1:  This image shows a virtual water droplet on “pillars.” Credit: Xiao Cheng Zeng

Image 2: This picture shows members of the research team (left to right) Xiao Cheng Zeng, UNL, Kenji Yasuoka, Keio University, and Takahiro Koishi, University of Fukui. Credit: UNL University Communications

Flu Concerns As Winter Sets In South of the Equator

As the winter approaches for countries south of the equator, health experts caution that those countries that have thus far been spared from outbreaks of the swine flu could be at an elevated risk as the virus continues to mutate and spread.

Until now, the countries most affected by the flu, including Mexico, the U.S., Canada and countries in Europe, have all been located in the northern hemisphere and are just coming out of the last few months of cool winter weather.  For countries in the southern hemisphere, however, autumn has already arrived and winter will be setting in over the coming months.

“The highest peaks of influenza activity occur in winter,” said Raina MacIntyre of the University of New South Wales’ School of Public Health and Community Medicine in Sydney, Australia.  “For us in the southern hemisphere, it’s particularly concerning.”

MacIntyre explains that flu is more easily spread during the winter months in large part because people tend to gather together indoors in confined spaces, making it easier for the virus to jump from person to person over shorter distances.  She also stated that there is some evidence that the colder temperatures may make viruses in general more transmittable.

Health experts have also warned of increased risks in the coming months as the swine flu could genetically recombine with standard winter flu strains to form an even more dangerous and transmissible hybrid.

“Winter is coming in the southern hemisphere and governments have to step up their actions to protect their populations, especially in the absence of a (swine flu) vaccine,” said World Health Organization spokesman Dick Thompson.  “We have a concern there might be some sort of (genetic) reassortment and that’s something we’ll be paying special attention to.”

Another WHO spokesman mentioned in a press statement on Monday that the organization is considering raising the pandemic alert level to a 6, indicating that a global pandemic is already underway.

U.N. Secretary-General Ban Ki-moon, however, stated that the WHO currently “has no plan to raise the alert level to 6.”  Margaret Chan, another WHO official, added that “we are not there yet” in a video address to the U.N. General Assembly.

Though Australia has not yet identified any cases of the H1N1 strain within its borders, neighboring New Zealand confirmed its sixth case on Monday and is still waiting for the results on another 11 probable cases.

While Mexican officials have begun to relax some restrictions and take a few cautious steps toward normalcy in the last few days, South America has confirmed its first case of the virus in Colombia, where the flu season is just beginning.

“Latin American countries may have a somewhat stronger surveillance system than in Africa.  Africa’s going to need some additional support and surveillance,” said Thompson.

A number of experts, however, continue to contend that countries south of the equator should focus their efforts on battling seasonal flu outbreaks rather than the swine flu. 

John Mackenzie, a flu expert at Curtin University in Australia, believes that seasonal flu strains could potentially pose a greater health risk than the swine flu.  Vaccination efforts, he argues, should continue to focus on the elderly and people with chronic illnesses, as the effects of the swine flu have proven to be relatively mild so far.

Because vaccine makers are only able to produce one vaccine at a time and worldwide production capacity is limited, there has been some debate recently regarding whether production efforts should be diverted to provide for a potential swine flu pandemic or remain focused on creating vaccines for traditional seasonal viruses.

MacIntyre says that Australia is well prepared for the possibility of an outbreak.  In recent years, a great deal of planning and resources have gone into the accumulation of a large stockpile of flu treatments and tightly coordinated emergency plans.

Australian officials say that they have enough reserves of Tamiflu and Relenza to treat well over a third of their 22 million citizens.

A number of experts have commented on the benefit that inhabitants of the southern hemisphere will have during the coming flu season, as officials in these countries will have had more time to analyze the virus and strategically prepare for an outbreak.

According to Nikolai Petrovsky, an endocrinologist at Flinders University in Adelaide, Australia, “By the time it comes to Australia and the southern hemisphere, we’ll know more about it than (they) did when it arrived over there.”

Medicare Advantage Plan Extra Payments

$43 billion in extra payments have been made to private Medicare Advantage plans since 2004

Private Medicare Advantage (MA) plans will be paid $11.4 billion more in 2009 than what the same beneficiaries would have cost in the traditional Medicare fee-for-service program, according to a new report released today by The Commonwealth Fund. This new analysis, The Continuing Costs of Privatization: Extra Payments to Medicare Advantage Plans Jump to $11.4 Billion in 2009, estimates that since MA was enacted in 2004, $43 billion in extra payments have been made.

In the report, Brian Biles, professor of health policy at George Washington University and colleagues find that extra payments to MA plans will amount to an average of $1,138, or 13 percent over fee-for-service costs, for each of about 10 million Medicare beneficiaries enrolled in Medicare Advantage plans. The $11.4 billion in extra payments in 2009 represents a 34 percent increase over 2008 payments, which totaled $8.5 billion. According to authors, the steep one-year increase was due to the increase in payment rates and enrollment in the private MA plans.

The bulk of these extra payments were mandated by the Medicare Modernization Act of 2003, which was intended to expand the role of private plans in Medicare in an effort to reduce growth in Medicare spending. Since 2004, MA plan enrollment has increased from 4.8 million to the current 10 million.

“It is clear that private plans are continuing to substantially raise the cost of serving Medicare beneficiaries,” said Commonwealth Fund president Karen Davis. “Modifying these payments in 2010 is an excellent first step, but policymakers should examine whether or not these plans are the best use of Medicare dollars for the beneficiaries they were designed to serve.”

The Congressional Budget Office estimates that bringing MA payments in line with traditional fee-for-service Medicare would save $157 billion over the next 10 years. Recent steps taken by the Centers for Medicare and Medicaid Services that reduce the payments made to private MA plans in 2010 do not address the factors responsible for the $11.4 billion in extra payments, the authors say.

The authors note that funds saved by eliminating extra payments to private plans could be used for other purposes, such as offsetting the costs of Medicare policy improvements, including reducing the Part B premiums that Medicare beneficiaries pay or increasing eligibility for low-income subsidies in Medicare Part D, or offsetting part of the cost of expanding health insurance to the 47 million uninsured.

“Right now we are spending billions of dollars on extra payments for a limited group of Medicare beneficiaries,” said Biles. “These plans haven’t realized the cost savings they were initially intended to create, and the extra spending will continue to increase even with the new CMS payment policies in place in 2010. We have to ask ourselves if this is the best use of our health care dollars or if those dollars could be better spent improving Medicare benefits for all beneficiaries or expanding health insurance coverage.”

This report updates the analysis of Medicare Advantage spending published in the Commonwealth Fund report, The Continuing Cost of Privatization: Extra Payments to Medicare Advantage Plans in 2008, last fall. It uses the most recent data available on actual MA enrollment from February 2009.

Ancient Tsunami Hit New York, New Jersey

Researchers say sedimentary deposits from more than 20 cores in New York and New Jersey indicate that some sort of violent force, such as a huge wave, swept the Northeast coastal region some 2,300 years ago, BBC News reported.

While some experts believe it could have been a large storm, other evidence is increasingly pointing to a rare Atlantic Ocean tsunami.

The size and distribution of material would require a high velocity wave and strong currents to move it, according to Steven Goodbred, an Earth scientist at Vanderbilt University.

He added it is unlikely that short bursts produced in a storm would suffice. “If we’re wrong, it was one heck of a storm,” he said.

While some experts are skeptical that it could have been a tsunami, others believe that an undersea landslide is the most likely source.

However, one research group has proposed that an asteroid impact provided the trigger. At the time, around 300 B.C., barrier beaches and marsh grass bordered the coast, and Native Americans walked the shore.

Neal Driscoll, a geologist from Scripps Institution of Oceanography, who is not associated with the research, said an Atlantic tsunami was rare but not inconceivable and that verifying one that is 2,000 years old is tricky.

Driscoll said the most frequent Atlantic tsunami triggers included earthquakes, underwater landslides, or a combination of the two.

He suggested that the New York wave was on the scale of the 1929 Grand Banks tsunami in Newfoundland, which killed more than two dozen people and snapped many transatlantic cables.

He imagines it would have been big enough to leap over the barrier islands, but that it did not reach the magnitude of the 2004 Sumatran tsunami.

The link between the layers of unusual debris found in sediment cores and a tsunami was first proposed by Goodbred while he was studying shellfish populations in Great South Bay, Long Island.

He found that the age of extracted mud cores with incongruous layers of sand and gravel matched the age of wood deposits buried in the Hudson riverbed and marine fossils in a New Jersey debris flow in cores gathered by other scientists.

Goodbred said the fist-sized gravel he found on Long Island would have required a high velocity of water to land where it did.

Tsunami verification can be challenging due to the age and nature of the material.

Goodbred said the radiocarbon dates of the debris are accurate to within a century, but the only evidence that a dramatic event took place thousands of years ago is common coastal debris like wood, sand, shells and rock.

Driscoll said researchers are left to discern whether it was strewn by a tsunami or a hurricane, or another large storm, such as a “nor’easter”.

He added that understanding the origins of these deposits could be difficult.

Tsunamis are most common in the Pacific and Indian Oceans where continental plates collide and large undersea earthquakes are relatively common, but they can occur in any ocean.

Bruce Jaffe of the United States Geological Survey said in the Atlantic, where the plates spread, tsunamis are rare, which means Atlantic tsunamis are not well studied.

Where the New York debris layers were found, there is little research on tsunami debris in the variety of northeast coastal environments like riverbeds and marine bays. He said there are few modern analogues to compare them with for identification.

Jaffe said Grand Banks is the only unequivocal tsunami in the Atlantic on the Northeast coast since there were eyewitness accounts and the deposits matched that of other modern tsunamis.

But tsunami groups should collect more core samples, according to Driscoll, to see whether the distribution of the debris is consistent enough to rule out the possibility of a severe storm.

However, Goodbred said teams are planning to do just that in order to confirm that the deposits are not quirks of local geology.

He said teams would also repeat carbon dating on cores to verify ages, and he believes the tsunami theory will win out in the end.

“We’re building a case of circumstantial evidence that is getting harder and harder to ignore,” he said.

A group led by Columbia University geologist Dallas Abbott thinks a space impactor may have set off the massive tsunami wave.

Abbott’s researchers discovered material in the New Jersey and Hudson River cores dated to 2,300 years ago, which they believe to be meteoritic in nature.

The team found carbon spherules, shocked minerals, and nanodiamonds, which are produced under extreme pressures and temperatures.

“We didn’t find the typical shocked quartz, but that is usual for a water impact,” said Abbott.

While no crater has yet been found, she theorized that an asteroid landed in the water off the coast of New York and New Jersey, either creating the wave directly or triggering a submarine landslide.

Asteroid evidence, however, is lacking, and many geologists and other scientists are skeptical of the theory. But proof of an asteroid impact is not necessary to build the case for a massive wave.

“The tsunami story stands on its own without the impact,” Goodbred said.

On the Net:

Study: Mexicans more outgoing, social

U.S. researchers found evidence that supports a stereotype held by many Americans that Mexicans are more outgoing, talkative, sociable and extroverted.

The study, published in the Journal of Research in Personality, said that finding also contradicts the way many Mexicans view themselves, as being less extroverted than Americans.

A team of social psychologists had students from Mexico and the United States wear small digital audio recorders the size of a cell phone for two days and then analyzed the interactions.

Mexicans and Americans differed in the way they behaved socially, Nairan Ramirez-Esparza, a post-doctoral researcher at the University of Washington, said in a statement.

The Electronically Activated Recorder worn by 54 U.S. students from the University of Texas and 46 Mexican students at the Universidad Autonoma de Nuevo Leon in Monterrey recorded sounds for 30 seconds every 12.5 minutes. The students could not tell when the device was operating.

Researchers later listened to and coded the recordings to determine what was going on — such as whether a conversation was occurring indoors or outdoors, in a class or hallway, how many people were involved, or whether a person was talking on the phone, using a computer or watching television.

Mexicans spent more time talking in person, in groups and outdoors in public while Americans were more likely to be alone and have remote interactions with people such as talking over the phone.

Better educated people choose better diets

People with higher education levels have higher quality diets, but those better diets are more costly, U.S. researchers said.

Researchers from the University of Washington compared the eating habits and food costs of a sample of 164 adults in the Seattle area.

Pablo Monsivais and Adam Drewnowski, both of the University of Washington, Seattle, said energy density of the diet — i.e., available energy per unit weight — is one indicator of diet quality. Lean meats, fish, low-fat dairy products and fresh vegetables and fruit provide fewer calories per unit weight than do fast foods, sweets, candy and desserts.

Energy dense foods provide more calories per unit weight but tend to be nutrient-poor.

The study, published in the Journal of the American Dietetic Association, said that for both men and women, higher dietary energy density was associated with higher intakes of total fat and saturated fat and with lower intakes of dietary fiber, potassium and vitamins A and C. Daily diet cost was slightly higher for men at $6.72/day than women at $6.21/day — reflecting the fact that men ate more than women.

However, for each 2,000 calories of dietary energy, men spent $7.43 compared to $8.12 spent by women. Diets that were more costly in terms of dietary energy were also lower in energy density and contained higher levels of nutrients.
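The per-2,000-calorie figures above come from normalizing daily food spending by daily energy intake. A minimal sketch of that normalization and of the study's energy-density measure (the intake figures below are hypothetical illustrations, not the study's data):

```python
def energy_density(total_kcal, total_grams):
    """Available energy per unit weight of food (kcal per gram)."""
    return total_kcal / total_grams

def cost_per_2000_kcal(daily_cost_usd, daily_kcal):
    """Normalize daily food spending to a fixed 2,000-kcal intake."""
    return daily_cost_usd / daily_kcal * 2000.0

# Hypothetical example: a larger daily intake at a similar daily cost
# yields a lower cost per 2,000 kcal, which is how men's diets could
# cost more per day yet less per calorie than women's.
print(cost_per_2000_kcal(6.72, 2600))  # man's diet, assumed 2,600 kcal/day
print(cost_per_2000_kcal(6.21, 1900))  # woman's diet, assumed 1,900 kcal/day
```

The intake values (2,600 and 1,900 kcal/day) are assumptions chosen only to show the direction of the effect reported in the study.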

Study Of African Genetics Reveals Source Of Human Evolution

An international team of researchers has reported the largest-ever study of genetics in Africa that helps pinpoint where human evolution began.

The 10-year study combined efforts from African, American, and European researchers who studied 121 African populations, four African American populations and 60 non-African populations to uncover more than four million genotypes.

Teams were looking for patterns of variation at 1,327 DNA markers. They discovered that African Americans trace about 71 percent of their ancestry to West Africa. They also have between 13 percent and 15 percent European ancestry and a smaller amount of other African origins.

“This is the largest study to date of African genetic diversity in the nuclear genome,” said Sarah Tishkoff, a geneticist with joint appointments in the School of Arts and Sciences and the School of Medicine at the University of Pennsylvania.

“This long-term collaboration … has resulted in novel insights about levels and patterns of genetic diversity in Africa, a region that has been underrepresented in human genetic studies.”

Researchers placed the origins of human evolution in southern Africa, near the South Africa-Namibia border. They compiled a map of ancient human migrations showing that modern humans likely left the continent near the middle of the Red Sea in East Africa.

Analysts also uncovered evidence for ancient common ancestry of geographically diverse hunter-gatherer populations in Africa, including Pygmies from central Africa and click-speaking populations from southern and eastern Africa, suggesting the possibility that the original pygmy language may have contained clicks.

“Given the fact that modern humans arose in Africa, they have had time to accumulate dramatic changes” in their genes, Tishkoff told the Associated Press.

She added that there is no single African population that represents the modern diversity on the continent. This suggests that many ethnically diverse African populations should be included in studies of human genetic variation, disease susceptibility, and drug response.

“Our goal has been to do research that will benefit Africans, both by learning more about their population history and by setting the stage for future genetic studies, including studies of genetic and environmental risk factors for disease and drug response,” said Tishkoff.

Scott Williams, Associate Professor of Molecular Physiology & Biophysics at Vanderbilt University, told the AP that the study “provides a critical piece in the puzzle” for determining genes that may predispose certain populations to a particular illness, such as prostate cancer, hypertension or diabetes.

“Everybody’s history is part of African history because everybody came out of Africa,” said Muntaser Ibrahim of the department of molecular biology at the University of Khartoum, Sudan.

“Now we have spectacular insight into the history of the African population … the oldest history of mankind.”

Christopher Ehret of the department of history at the University of California, Los Angeles, compared genetic information to the migration of different languages.

He found that about 2,000 language groups exist in Africa, but they are not always correlated to a specific genetic variation. Movement of a language usually involves arrival of new people, Ehret told the AP. The genetic variations typically transfer along with the movement.

The study, published in the journal Science, was supported by the National Cancer Institute, the National Institutes of Health, the Advanced Computing Center for Research and Education at Vanderbilt University, the L.S.B. Leakey and Wenner-Gren foundations, the National Science Foundation, and the David and Lucile Packard and Burroughs Wellcome foundations.


Bank of Japan lowers recovery expectations

The Bank of Japan predicted the recession would lead to a 3.1 percent gross domestic product contraction in the year ending in March 2010.

The prediction marks a sharp downgrade in the bank’s assessment of the country’s economic troubles. In January, the BOJ had said the economy would decline 2 percent, Kyodo news agency reported Thursday.

Economic conditions in Japan have deteriorated significantly, the bank said in its biannual economic report. The report said “the growth rate is expected, from the latter half of fiscal 2009, to recover at a moderate pace.”

The speed of recovery depends on the restructuring of the U.S. and European financial systems, BOJ Gov. Masaaki Shirakawa said at a news conference.

The BOJ also said it would monitor the swine flu that broke out in Mexico and has spread to Asia, Europe and the United States.

To date, the flu’s impact seems to be limited, Shirakawa said. However, he added, the bank needs to carefully monitor the effects of this potential risk factor.

Scientists Develop New Invisibility Cloak Technology

Two teams of scientists have developed a cloak that renders objects invisible to near-infrared light, BBC News reported.

Unlike previous such “cloaks,” which contained metals and produced imperfect cloaking because of losses of light, the new technology contains no metals.

Researchers say that since the approach can be scaled down further in size, the new technology is a major step towards a cloak that would work for visible light.

John Pendry from Imperial College London first theorized a cloak with a “carpet” design in 2008. One of the research teams describes its miniature “carpet cloak” in the journal Nature Materials.

A team at Cornell University, led by Michal Lipson, demonstrated a cloak based on Pendry’s concept.

Xiang Zhang, professor of mechanical engineering at the University of California, Berkeley, led a separate team.

He explained that his team was essentially transforming a straight line of light into a curved line around the cloak, making it difficult to perceive any change in its pathway.

This is considered the first carpet-based cloak to be built, and it uses a dielectric, or insulating material, that absorbs far less light than previous invisibility cloaks designed using metals.

Zhang said metals introduce a lot of loss, or reduce the light intensity, which can leave a darkened spot in the place of the cloaked object.

He added that since the new design uses silicon, a material that absorbs very little light, it is a “big step forward” in the evolution of invisibility cloaking.

The cloak’s design gives the illusion of a flattened surface by canceling out the distortion produced by the bulge of the object underneath. Therefore, light is bent around the object, like water around a rock.

The cloak changes the local optical density of the object it is covering, Zhang explained.

He told BBC News that when light passes from air into water it is bent, because the optical density, or refractive index, of water is different from that of air.

“So by manipulating the optical density of an object, you can transform the light path from a straight line to any path you want,” he said.

The new material produces this effect through a series of minuscule holes strategically “drilled” into a sheet of silicon.

Zhang’s team was able to “decide the profile” of the cloaked object by altering the optical density with the holes, thereby proving Pendry’s theory.

He explained that the team drilled lots of very densely packed holes in some places and in others they were much more sparse.

“Where the holes are more dense, there is more air than silicon, so the optical density of the object is reduced,” Zhang said.

“Each hole is much smaller than the wavelength of the light. So optical light doesn’t see a hole – it just sees a sort of air-silicon mixture,” he added.

He said this allows the density of the object to be adjusted as far as the light is concerned.

The current demonstration cloak is tiny, just a few thousandths of a millimeter across, but Zhang said there are applications even for such a minuscule cloak.

The electronics industry could use such technology to hide flaws on the intricate stencils or ‘masks’ that are used to cast processor chips.

Zhang said that alone could save the industry millions of dollars.

“It would allow them to fix flaws rather than produce an entirely new mask,” he said.


Media Triggers Short-Term Decline In Trans Fat Sales

A new study suggests that an increase in the number of news stories describing the health risks of trans fats seems to be affecting the nation’s shopping habits.

However, the effect does not appear to last long.

Scientists found that shoppers in Los Angeles had a tendency to purchase fewer food products containing artery-clogging trans fats in the week following media coverage about the fats. The effects waned shortly thereafter.

The study indicates a need for sustained public health initiatives to remind consumers to limit their intake of trans fat.

“While news coverage is a potentially valuable source of information, and one that can help the public to make informed decisions about their health, this study shows that news coverage alone is not enough to sustain changes in consumer behavior,” said Dr. Dominick L. Frosch of the University of California Los Angeles, the study’s co-researcher, in a statement.

Trans fats, formed when hydrogen is added to vegetable oil during processing to make food solidify, not only raise levels of “bad” LDL cholesterol, but can also lower levels of the “good” HDL cholesterol.

Foods labeled as containing “partially hydrogenated vegetable oil” contain trans fat. That has typically included most commercially prepared baked and fried foods, such as crackers, cookies, breads, chips and French fries. However, restaurants and manufacturers have increasingly been removing trans fats from their food.

Since 2006, the U.S. Food and Drug Administration (FDA) has required all food makers to label the amount of trans fat in their products if the amount exceeds 0.5 grams per serving.

However, “there has been no coordinated effort to educate the public about the dangers of trans fat,” since the policy went into effect, Frosch said.

A recent study of U.S. adults found that although most were aware they should avoid trans fats, few could actually name any foods that typically contain them.

In the current study, Frosch and colleague Dr. Jeff Niederdeppe of Cornell University analyzed weekly sales data from a leading Los Angeles grocery chain for a 129-week period between 2005 and 2007.

They examined the relationship between local news coverage of trans fats and sales of several products rich in trans-fat such as hot dogs, buttered popcorn, stick margarine, vegetable shortening, packaged cookies and biscuits.

The researchers found that, in general, news coverage seemed to trigger a decline in trans-fat sales.  The influence was stronger after the FDA labeling rule went into effect, the study showed.

However, the impact of each media campaign waned after just one week.

“In the absence of broader changes in food policy and public education,” said Niederdeppe in the statement, “news coverage may be insufficient to produce lasting reductions in trans-fat purchases and consumption.”

The study was published in the May 2009 American Journal of Preventive Medicine.


Recession Takes Toll On Workers’ Waistlines

A new survey finds that the recession may be taking a toll on Americans’ waistlines, with one in 10 U.S. workers reporting more daily snacking amid concerns over the economic slowdown.

More than four in ten survey participants, or 43 percent, reported gaining weight in their present jobs, according to the online survey by jobs Web site CareerBuilder.com.

One-quarter reported gaining more than 10 pounds, while one in six reported gaining in excess of 20 pounds.

Demonstrating how eating habits can cause weight gain, 39 percent reported eating out for lunch two or more times a week, while 12 percent reported buying their lunch from a vending machine at least once a week.

Two-thirds reported snacking at least once daily, with 24 percent reporting snacking twice a day, the survey found.

“Weight gain in the office is common and is a result of a variety of issues including today’s economic stress and poor eating habits,”  Rosemary Haefner, vice president of human resources for CareerBuilder.com, told Reuters.

According to the survey, just 9 percent of employees exercise at midday, despite 25 percent of U.S. companies having an in-house fitness facility or providing gym passes to employees.

The survey found that 48 percent of women reported gaining weight, while 39 percent of men reported weight gain.

The online survey, which included responses from 4,435 U.S. full-time adult employees, was conducted by Harris Interactive from February 20 through March 11.


WHO Raises Pandemic Alert To Level 5

The World Health Organization (WHO) raised its pandemic alert level to 5 Wednesday afternoon, the second-highest level, indicating that the swine flu outbreak is moving closer toward becoming a pandemic.

Dr. Margaret Chan, the agency’s Director-General, said the decision to raise the alert was based on the latest scientific evidence on the outbreak.

Level 6 is the final stage, indicating a global pandemic of a new and deadly disease.

“For the first time in history we can track the pandemic in real time,” Chan said.

“The world is better prepared for an influenza pandemic than at any time in history.”

“No matter what the situation is, the international community should treat this as a window of opportunity to ramp up … response,” a Reuters report quoted Chan as saying.

“It is really all of humanity that is under threat during a pandemic.”

The new strain is a never-before-seen combination of swine, avian and human viruses.  

The fast-moving virus continues to spread throughout the globe – with cases now confirmed in at least 10 U.S. states.

Cases of H1N1 swine flu have now been confirmed in Mexico, the United States, Canada, Britain, Israel, New Zealand and Spain. Germany and Austria also reported cases of swine flu on Wednesday, while the number of reported cases increased in the United Kingdom and Spain.

As of early Wednesday, the U.S. Centers for Disease Control and Prevention (CDC) reported a total of 91 confirmed cases of swine flu in the United States. However, many states have reported additional suspected cases that are now being investigated.

“The more recent illnesses and the reported death suggest that a pattern of more severe illness associated with this virus may be emerging in the U.S. Most people will not have immunity to this new virus and, as it continues to spread, more cases, more hospitalizations and more deaths are expected in the coming days and weeks,” the CDC wrote in an update on its Web site on Wednesday.

A Mexico City boy who traveled to Texas with family members became the first confirmed death in the U.S. from swine flu.  The 23-month-old boy arrived in the border city of Brownsville, Texas with “underlying health issues” on April 4.  He developed flu symptoms four days later, according to the Texas Department of State Health Services.

He was taken to a Brownsville hospital April 13 and transferred the following day to a hospital in Houston, where he died Monday night.

President Barack Obama advised schools with confirmed or possible swine flu cases to “consider temporarily closing so that we can be as safe as possible.”

At least 74 elementary, junior high and high schools have closed throughout the country due to confirmed or probable swine flu cases, the Department of Education said Wednesday.

An additional 30 schools have closed as a precautionary measure, said department spokesman Massie Ritsch.

In California, the number of confirmed cases statewide grew to 14, including a sick Marine at a base in Southern California.  State officials are investigating another 17 probable cases in eight counties.


Best husbands are conscientious, neurotic

Women, but not men, get an added health benefit when paired with someone who is conscientious and neurotic, U.S. researchers said.

Study leader Brent Roberts of the University of Illinois and colleagues call the boost in health reported by those with conscientious spouses or romantic partners the “compensatory conscientiousness effect.”

Highly conscientious people are more organized and responsible and tend to follow through with their obligations, to be more impulse controlled and to follow rules, Roberts said in a statement.

Highly neurotic people tend to be more moody and anxious, and to worry, he said.

Roberts and colleagues looked at the association of personality and self-reported health among more than 2,000 couples age 50 and older.

The study asked participants to rate their own levels of neuroticism and conscientiousness and to answer questions about the quality of their health.

The researchers found a significant, self-reported health benefit that accompanied marriage to a conscientious person, even among those who described themselves as highly conscientious.

A more unusual finding involved an added health benefit reported by women who were paired with highly conscientious men who were also highly neurotic, Roberts said. The same benefit was not seen in men with highly conscientious and neurotic female partners.

However, Roberts says that given a choice between a man who is simply conscientious and one who is conscientious and neurotic, choose the conscientious mate.

Health Experts Say Diabetes On The Rise In Gulf States

Health experts say that more and more Arabs living in the wealthy Gulf states are suffering lifestyle diseases, particularly obesity and diabetes, the AFP reported.

Governments in the area are now launching nutritional awareness programs to counteract the high-calorie fast-food culture that has gripped their desert nations.

Abdelrazzaq al-Madani, head of Dubai hospital and chairman of the Emirates Diabetes Society, blames rapid economic growth driven by windfall oil revenues for the rising obesity levels.

Madani told AFP that people have more sedentary jobs now compared to a harsher but more active lifestyle in the past.

He also noted a boom in restaurants offering foods from around the world, and high-calorie fast food.

He said that adolescents as young as 15 and 16 are developing diabetes, adding that there have been surges in obesity among younger children and especially teenagers.

Some 70 percent of adults and 12 percent of children in the UAE are overweight, while a fifth of the overweight children are at risk of developing obesity, according to official figures published this week.

But Madani said the increase in the incidence of diabetes is particularly worrying as it is a killer disease.

Heart attacks are one of the major killers in the United Arab Emirates and 80 percent of patients with diabetes die of heart attacks.

The latest official figures show that 19.6 percent of the UAE population of 6.4 million had diabetes in 2005.

Experts say that is the second highest rate in the world after the small South Pacific island nation of Nauru, where over 30 percent of the population of just over 13,000 are diabetic.

Madani said he believes that if the same study were carried out today, there would be even higher numbers than in the previous results.

The UAE figure is expected to reach 28 percent in 2025, according to official data.

Figures from the UAE health ministry show that a third of patients in the country’s hospitals this month alone have diabetes, and that treatment of the disease costs the state some $200 million a year.

Doctors across the region told AFP that other Gulf states aren’t faring much better in their fight against the disease, with diabetes levels in Qatar at 15 percent, Bahrain at 14.3 percent and Oman at 13 percent.

In 2007, some 35 percent of children under 14 in the emirate were diabetic, compared to only seven percent 10 years earlier, according to Mariam al-Ali, a pediatrician at Hamad hospital in Qatar.

Health ministry official Mariam al-Jalahema said that a study conducted in 2006 on a sample of 1,769 people in Bahrain showed that 16.8 percent of women and 11.7 percent of men had diabetes.

The kingdom of Saudi Arabia’s official SPA news agency reported this month that 25 percent of the oil-rich kingdom’s over-30 population is diabetic.

Ahmad al-Shatti, the head of the diabetes awareness program in Kuwait, told AFP that one in every four Kuwaitis has diabetes.

This has led the UAE health ministry to launch a nationwide campaign to raise awareness about the health risks tied to obesity and diabetes.

The ministry has even joined forces with the United Nations Children’s Fund (UNICEF) to fight obesity among children in a project that includes programs to educate mothers on healthy nutrition for their children and, in collaboration with the education ministry, to encourage school pupils to engage in sports and follow a healthy diet.

In Abu Dhabi, the UAE’s capital, several schools have organized running races to promote the idea of a healthier lifestyle, and one school canteen reportedly stopped selling junk food.

Similar awareness campaigns have also been launched in other Gulf states, including a check-up booth in malls as well as TV programs and educational seminars that discuss healthier alternatives.

Meanwhile, a condition known as “diabetic foot” which leads to dozens of foot amputations each year is also raising alarm in the Gulf area.

The condition is a direct result of diabetes that causes a lack of feeling in the foot due to damaged nerve endings and inadequate blood flow.

Untreated wounds and ulcers can lead to infection in the foot and amputation if the infection further deteriorates.

In March, Abdulaziz al-Gannass, a foot and ankle surgeon in Riyadh, told AFP that some 90 people had a foot amputated each month in the Saudi capital alone.

“People as young as 30 now come in for diabetes-linked foot amputations,” Gannass said.

The Emirates Physiotherapy Society held a week-long campaign in March with the motto “foot first” after learning that 49 foot amputations were carried out in the UAE last year, according to physiotherapist Amal al-Shamlan, head of the rehabilitation section at Al-Wasl hospital in Dubai.


Postpartum Depression Relief Helps Sex Life

Having a baby can be both a thrill and the cause of depression for some women, but researchers found antidepressants could improve sexual problems.

Dr. Katherine Wisner, of the University of Pittsburgh Medical Center, Pennsylvania, and colleagues studied sexual problems in 70 women diagnosed with postpartum depression during an 8-week study.

They examined the antidepressants nortriptyline (for example, Sensoval, Aventyl) and sertraline (Zoloft, Lustral) for postpartum depression in the participants.

Seventy-three percent of women reported problems in at least three areas of sexual function at the start of the study, but by week 8 this number had fallen to 37 percent.

According to the researchers, women whose depression remitted were more likely to report fewer concerns about sex drive, sexual arousal, and reaching orgasm than those whose depression did not remit, regardless of the antidepressant they took.

The findings were published in the Journal of Clinical Psychiatry.

Investigators found that a decrease in sexual concerns was specifically linked with improvements in depression.

The authors wrote that since no specific association with either nortriptyline or sertraline changed the outcome, it does not matter how depression is addressed, only that it is relieved.


Vitamin Lift In Genetically Modified Corn

A genetic breakthrough has allowed multiple vitamins to be engineered into a single plant for the first time, in hopes of boosting nutrition in the diets of people in developing countries.

Though genetic engineering has previously been used to enhance vitamin content in a variety of crops such as rice, potatoes, lettuce and tomatoes, this will mark the first time that scientists have been able to intensify multiple vitamins in a single plant.

European scientists have genetically engineered a vitamin-fortified corn designed to increase intake of three key nutrients that millions of people in developing countries direly need.

These modifications cause the corn to produce high levels of beta-carotene as well as precursors of vitamin C and folic acid.

The research, published in the journal Proceedings of the National Academy of Sciences (PNAS), delineates how the South African white corn variety was created.

To embed the genes into the corn’s DNA, researchers attached them to microscopic gold particles and shot them into immature corn embryos. If taken up by an embryo, the genes alter its internal biochemical processes, causing it to produce the desired vitamins.

According to the study, the analysis of sample plants grown from the genetically modified seeds shows that the genes have successfully manipulated the corn into producing vitamins over four subsequent generations.

Dr. Christou and his colleagues, from universities in Spain and Germany, noted in their paper that the amount of vitamins produced “vastly exceeds” anything yielded by conventional plant breeding methods.

Plants have been produced that are fortified with one vitamin, but this only has the potential to curb one deficiency at a time.

Scientists believe that by producing a plant that carries three vitamins, poorer nations surviving on an unbalanced diet could be greatly helped.

Researchers estimate that consuming 100-200 g of the fortified corn provides a person with almost their entire recommended daily intake of vitamin A and folic acid, and 20 percent of the necessary ascorbate.

“Our research is humanitarian in nature and targets impoverished people in developing countries. This specific project is targeted towards sub-Saharan Africa,” Dr. Christou told BBC News.

He says the team receives exclusively public funding and has no commercial constraints or vested interests.

Dr. Christou said the recent success in the lab is leading to field trials to be held in the U.S. in 2010.

When the field trials conclude, the team hopes to have enough data to begin trials in Africa.

“We will soon embark on animal studies to generate efficacy and safety data, which will be required at some point,” said Christou. 

Prof. Jonathan Napier, research leader at the UK’s Rothamsted Research Institute, compared the work by Dr. Christou and his colleagues to similar earlier work on “golden rice,” noting that the corn produces much greater levels of vitamin A precursors.

He pointed to farmers and agriculturists who have been breeding crops to resist disease, and yield a greater and easier harvest.

He noted that the only difference is that the introduction of more advanced technology allows one to opt for more important traits such as nutrition.

However, he said, the process of getting the product from the lab to the field and ultimately for wide-scale use would be full of obstacles and would not be quickly accomplished. 

He said the approval process alone would be arduous and time consuming, “But it’s important to make sure that the technology works, is stable and is evaluated as well as possible.”

A Differing View

There are many who oppose genetic modification and present a far more skeptical view to the process.

Clare Oxborrow of the campaign group Friends of the Earth urges caution about genetically modifying crops.

She points out that it is “virtually impossible” to monitor and ensure that the people consuming the crop are receiving the proper amount of vitamins which the plants had been modified to produce.

She suggests that the real issue is that impoverished people lack access to any variety of food, therefore fortifying one plant could not solve the problem at hand.

Instead of investing in “expensive, untested and potentially risky GM technofixes,” she commends directing research efforts toward helping people grow and gain access to a broader range of foods, which would have a much greater impact on health in the long run.

She adds, “Supporting families to grow green leafy vegetables in their communities can ensure sufficient levels of vitamin A, as well as a host of other nutrients and vitamins that a narrow GM fix would not even begin to solve.”

Vitamin Benefits:

  • Beta carotene – becomes Vitamin A – good for skin, eyesight, embryonic development, fertility and the immune system
  • Folate – folic acid – helps with red blood cell formation and many genetic processes, aids development of fetus during pregnancy
  • Ascorbate – becomes Vitamin C – essential for skin proteins and wound-healing and stimulates the immune system


Diminuendo – New Mouse Model For Progressive Hearing Loss

Scientists at Helmholtz Zentrum München have developed a new mouse model of deafness. With this model they succeeded for the first time in showing that microRNA, a new class of genes, influences hearing loss. The respective microRNA seed region influences the production of sensory hair cells in the inner ear, both in the mouse and in humans. The findings have been published ahead of print in the current online issue of Nature Genetics. This study represents a major step forward in elucidating the common phenomenon of progressive hearing loss, opening up new avenues for treatment.

Scientists at Helmholtz Zentrum München, led by Professor Martin Hrabě de Angelis, director of the Institute of Experimental Genetics, have developed a new mouse model with a genetic mutant in which a single base of a specific microRNA seed region has been altered. Mice carrying this miR-96 mutation suffer progressive hearing loss as they get older. Moreover, if they carry two of these mutants, their sensory hair cells are impaired from birth on.

A number of genes associated with hearing loss were already known. “However, we were very surprised when with our new mouse model we discovered this new class of genes – microRNA – as a genetic cause for this clinical picture,” explained Dr. Helmut Fuchs, who conceived the idea of this mouse model and who is scientific-technical head of the German Mouse Clinic at Helmholtz Zentrum München.

The new mouse model is called diminuendo, named after the term in music theory meaning “becoming gradually softer”. The mice were bred using the ENU method, in which male mice are administered N-ethyl-N-nitrosourea (ENU), thus influencing the DNA of their sperm. Subsequent generations develop dominant or recessive mutations. Using methods like these, Martin Hrabě de Angelis and his colleagues in the German Mouse Clinic can identify mutants that develop diseases similar to human diseases. They made the diminuendo mouse model available to colleagues of the Wellcome Trust Sanger Institute in Cambridge, UK, who – based on specific characterizations – ultimately found the association with the miR-96 mutation.

In Germany alone, around 13 million people have impaired hearing, according to estimates of the German Deaf Association (Deutscher Schwerhörigenbund). There are diverse causes for this, including deafness simply due to old age, hearing loss caused by infections and damage due to chronic noise. However, progressive hearing loss can also have genetic causes.

“We assume that our mouse model will be of far-reaching significance for the development of treatment strategies against genetically caused progressive hearing loss in humans,” Dr. Fuchs explained. Colleagues from Spain confirm his assumption. They have already performed first examinations on patients diagnosed with progressive hearing loss. In them the microRNA cluster Mirn96 was mutated in the same seed region as in the mouse model. Now, with the aid of this mouse model, the international research consortium hopes to identify factors which are necessary for long-term survival of hair cells and thus to find new approaches for treatment of progressive hearing loss.

Original Publication: Lewis M. et al.: An ENU-induced mutation of miR-96 associated with progressive hearing loss in mice. Nature Genetics online April 12, 2009 http://dx.doi.org/10.1038/ng.369

The German Mouse Clinic is a diagnosis clinic established under the auspices of the National Genome Research Network (NGFN). Here mutant (genetically altered) mice are characterized under standardized conditions, in order to find animal models for genetically caused human diseases and thus to better understand these diseases.

The Institute of Experimental Genetics of Helmholtz Zentrum München is concerned with the functional analysis of the genome of mammals. Mouse models are developed to elucidate gene functions, and new investigation methods of genomics, proteomics and bioinformatics are applied. The objective is to elucidate the causes and pathogenic patterns of human diseases.


Why See A Doctor For Allergies

Children with allergies and asthma may benefit from an allergy evaluation, a U.S. allergist advises.

Dr. Maya Jerath of the University of North Carolina at Chapel Hill School of Medicine says controlling allergy symptoms can prevent some of the common complications of untreated allergies like sinusitis and ear infections.

Some sufferers seek relief with supplements like vitamin C or zinc, Jerath said.

These supplements may help with colds, but there are no studies showing they work for allergic rhinitis, Jerath says in a statement. However, nasal saline rinses can be helpful in mitigating symptoms because they minimize exposure by clearing out any allergens that might be present. In addition, a few small studies show that regular use of these rinses can change the cells lining your nose, making it less prone to the inflammation that creates that stuffy feeling.

The best way to reduce the impact of seasonal allergies is to avoid exposure, but Jerath says it is unrealistic to tell people not to go outside.

There are many prescription medications that work well for allergies, she says. And if your symptoms last for more than a season, you can see a doctor to find out what you might be allergic to, and to see if you’re a candidate for immunotherapy — a treatment that aims to cure allergies.

Shaken Baby Syndrome Renamed

The American Academy of Pediatrics is encouraging doctors to adopt a more scientifically descriptive term for shaken baby syndrome.

In a policy statement to be published in the May issue of its journal, Pediatrics, the group suggests a term – “abusive head trauma” – that covers the diagnosis of the brain, skull and spinal injuries associated with shaking, as well as other head injuries.

The academy says that use of the new diagnostic term in medical records could provide more clarity in the courtroom, where some defense attorneys and doctors claim that shaken baby syndrome doesn’t exist on the grounds that, unless the neck is broken, it is not plausible for a baby to be shaken hard enough to cause brain damage.

Dr. Robert Block, former chairman of an academy committee on child abuse, explains that this argument is based on faulty evidence and that most physicians who specialize in treating child abuse would disagree.

Shaking can cause bruising, swelling, and bleeding, “which can lead to permanent, severe brain damage or death,” according to The National Institutes of Health.

Block says that avoiding a vague term such as “shaken baby syndrome” could curb the legal rhetorical maneuvering that distracts from the question of whether the abuse actually occurred.

Block, a pediatrics professor at the University of Oklahoma’s community medicine school in Tulsa, stresses that the change of terms in no way alters the academy’s position that shaking an infant can be fatal.

Dr. Cindy Christian, a co-author of the policy statement and a child abuse researcher at Children’s Hospital of Philadelphia, said evidence indicates that while it is possible for babies to be injured by severe shaking alone, many times they also have head injuries caused by other abuse.

The National Center on Shaken Baby Syndrome estimates that shaking injures or kills 1,200 to 1,400 U.S. children each year, but with many cases unreported, the true number is likely much higher.

While the new term, “abusive head trauma”, is broader, the advocacy group states that shaking is in fact the leading cause of death in most of these cases.

Marilyn Barr, executive director of the center on shaken baby syndrome, commends the academy for “trying to clear murky waters.”

The policy encourages doctors to look for signs of head trauma that could be the result of shaking and to educate parents about more effective ways to calm a baby and avoid the dangers of shaking.


Morphine Can Be Given More Effectively Without Increased Dosages

Researchers at the Hebrew University of Jerusalem have found a way to maintain the pain-killing qualities of morphine over an extended period of time, thus providing a solution for the problem of having to administer increasing dosages of the drug in order to retain its effectiveness.

One of the limitations in long-term use of morphine for pain relief is the rapid development of tolerance. The effectiveness of morphine declines quickly, and one must increase the dosage in order to preserve effective pain relief. However, the increased dosage also increases negative side effects.

The Hebrew University researchers, Prof. Yehuda Shavit and his graduate student Gilly Wolf of the Psychology Department, found that administration of morphine causes a substance called interleukin-1 to be released.

Under normal circumstances, interleukin-1 plays an important role in survival. In case of tissue damage, nerve injury, or inflammatory reaction, interleukin-1 is released and sets off a process which increases the sensitivity to pain in the injured area. This pain serves as a warning signal, telling the body that there is a problem that should be attended to. In case of chronic pain, morphine is still the drug of choice for pain relief.

However, since prolonged administration of morphine raises the level of interleukin-1, thereby enhancing pain sensitivity, the effectiveness of morphine as a pain killer is steadily reduced, requiring greater dosages with accompanying negative side effects.

The Hebrew University researchers were able to show in animal experiments that administering morphine together with another drug that blocks the activity of interleukin-1 provides more effective pain relief over the long term without having to increase the dosage.

Shavit, who is the Leon and Clara Sznajderman Professor of Psychology at the Hebrew University and whose specialty is psychoneuroimmunology, expressed hope that this research will make it possible for clinicians to make use of morphine, together with substances that block interleukin-1, in order to bring about better pain relief with lower dosages and with minimized side effects. The research will be presented at a conference on pain research on May 3 on the Mount Scopus campus of the university. The conference is open to journalists and to people in the field.


Many Still Find Gadget Lingo Perplexing

In a survey by Gadget Helpline, more than 5,000 users came up with a list of the 10 most confusing technological terms.

Terms such as WAP, dongle, and cookie are among the many words leaving users scratching their heads.

The firm is asking for simpler words that reflect the actual meaning, rather than the overwhelming amount of jargon currently in place.

The Plain English Campaign supports the move, insisting that it would help bring down the “walls of techno-babble”.

The campaign secretary for the Plain English Campaign, Peter Griffiths, explained to BBC that it is possible to relieve exasperated users by making the technology world easier to understand and therefore navigate.

He said, “We need to pull our head out of the digital clouds and use plain English.”

He added pragmatically, “If changing the name isn’t an option then a glossary of terms would work. Not only does it explain the language, but it’s a nice way of learning for people who don’t have such a good grasp of the language.”

Seemingly simple terms such as “digital TV” have joined the English language, but few people actually understand their practical meaning.

This becomes exponentially more perplexing when you take into account that many companies have varying names for identical products.

“One way of linking peripherals to a Mac was via an interface called FireWire. On a Sony it is called i.LINK and it’s also called Lynx by Texas Instruments, even though all three are exactly the same thing. That hardly makes things easy for the consumer,” said Alex Watson, editor of Custom PC magazine.

“Even when the industry tries to appeal to regular people, it doesn’t always work. Take Wi-Fi – it was named solely because of HiFi. Wireless fidelity doesn’t actually mean anything, but the alternative was 802.11B which hardly trips off the tongue.”

He told the BBC that part of the problem is that companies are under immense pressure to coin catchy, marketing-friendly terms that also connect to the actual meaning, some of which may become part of the public vernacular.

Mr. Watson says that as users become more familiar with the words, the language surrounding them will continue to evolve.

“It may be called Wi-Fi but most people would call it a wireless network, which is exactly what it is,” he said.

The Top 10 list:

  • Dongle
  • Cookie
  • WAP
  • Phone jack
  • (Nokia) Navi Key
  • Time shifting
  • Digital TV
  • Ethernet
  • PC Suite
  • Desktop


New Cache Of Mummies Discovered In Egypt

Vividly painted wooden coffins housing a cache of pharaonic-era mummies were uncovered near the Lahun pyramid in Egypt, the director of the excavation said on Sunday.

The mud-brick Lahun pyramid is believed to have been constructed by the 12th-dynasty pharaoh Senusret II, who reigned 4,000 years ago. According to Reuters, the mummies are the first to be discovered beneath the sand surrounding the pyramid in the depths of the desert rock, but the excavation team anticipates more findings very soon.

This is not the first time the site has been excavated; archaeologists searched for artifacts there more than a century ago.

“The tombs were cut on the rock itself, and they vary in architectural designs,” said archaeologist Abdul Rahman Al-Ayedi, head of excavations at the site. “Most of the mummies we discovered were with these bright and beautiful colors.”

Coffins decorated with bright hues of green, red and white bearing images of their occupants were unearthed at the site.

Ayedi believes the exploration of the dozens of tombs encompassing the site near Fayoum, 35 miles south of Cairo, could yield a new understanding of Egyptian funerary architecture and customs from the Middle Kingdom all the way to the Roman era.

The archaeologists unearthed dozens of mummies, thirty of which were very well preserved and inscribed with prayers intended to help the deceased in the afterlife. Some of the tombs were erected on top of gravesites from earlier eras.

Egypt may soon announce another important finding at Lahun, Ayedi said. The site, once enveloped in slabs of white limestone, has yielded evidence that it could be thousands of years older than previously thought.

Ayedi told reporters, “The prevailing idea was that this site has been established by Senusret II, the fourth king of the 12th dynasty. But in light of our discovery, I think we are going to change this theory, and soon we will announce another discovery.”

He said teams had made a discovery of an artifact that was dated earlier than the 12th dynasty, but did not include any specifics on the item and promised an official statement would be made within days. 

Ayedi’s ambition in excavating at Lahun, Egypt’s southernmost pyramid, is to uncover more than what was discovered at the first excavation in the 19th century. He believes the results of that expedition do not match the significance of the site.

“The size of the site is huge. So I thought that we could unearth a lot of elements in this site. At the beginning of the excavation, I thought that we may rewrite the history of the area, and I was right,” he announced.

Down a 16-meter well, the archaeologists discovered the main entrance to the pyramid just last year.  Later, they found storage jars and other various objects, and in recent months, they uncovered the mummies in the surrounding desert rock.  

This year has brought many significant discoveries in Egypt, whose economy relies heavily on tourism. In February, an uncommon find of an intact mummy sealed in a sarcophagus was unearthed at the world’s oldest standing step pyramid at Saqqara, near Cairo.


Blood Vessels Created From Dialysis Patients’ Own Cells

A new study shows how dialysis patients were successfully fitted with blood vessels engineered solely from their own living tissue, making it easier and safer for them to use dialysis machines, the AFP reported.

The results suggest that doctors might one day be able to custom-produce blood vessels for patients with circulatory problems in their hearts or legs.

Lab-grown blood vessels were implanted into 10 patients with advanced kidney disease in Argentina and Poland from 2004 to 2007, wrote Todd McAllister of Cytograft Tissue Engineering in California and colleagues in the New England Journal of Medicine.

The researchers published early results for two of these patients in 2005 and preliminary findings for another four patients two years later.

Dialysis patients compensate for kidney failure by having their blood filtered three times a week on the machines to remove wastes and excess fluid from the body.

During the procedure a short bypass, or shunt, must be inserted between an artery and a vein for the process to work. This additional bit of arterial plumbing can be taken from an existing vein in some cases.

But many patients have the bridge made from plastic, often resulting in a significantly higher failure rate.

The patients in the study were fitted with new shunts grown in a laboratory from their own adult cells in a technique called cell-sheet biofabrication.

The process starts with a skin biopsy. Then fibroblast cells are grown in a single-layer cell culture, rolled up and chemically washed to remove living cells.

After being encased with another sheet of living cell tissue, the resulting tube-like membrane is then implanted in the patient’s arm.

However, the researchers acknowledged that three of the grafts failed during an initial safety phase where the new bypasses were assessed for mechanical stability while the patients continued to receive traditional dialysis.

Two other patients left the study for reasons unrelated to the trials, while the five remaining patients had grafts functioning for dialysis for between six and 20 months after implantation.

The study concluded that doctors and scientists could provide the requisite mechanical strength in a tissue-engineered construct without relying on synthetic biomaterials.

“The fact that our long-term intervention rate was so low — even with this high risk group — offers hope to patients suffering from end-stage renal disease,” it said.

Epidemiological studies show that there are between 1.5 and 2.0 million people receiving dialysis treatment around the world, and many millions more in need.

The cost of the innovative technique would need to come down before it was commercially viable, according to experts in an accompanying commentary.

McAllister said he and his team plan to test similar devices in patients with heart and leg problems.

“It’s basically a piece of plumbing to bypass blockages,” he said.


How Cigarettes Calm You Down

The calming neurological effects of nicotine have been demonstrated in a group of non-smokers during anger provocation. Researchers writing in BioMed Central’s open access journal Behavioral and Brain Functions suggest that nicotine may alter the activity of brain areas that are involved in the inhibition of negative emotions such as anger.

Jean Gehricke led a team of researchers from the University of California who studied the effect of nicotine patches on the subjects’ tendency to retaliate in response to anger provocation. The subjects played a computer game and could see a video screen of another player who they believed to be their opponent, although, in fact, they were playing alone. After each round, the victor could give his opponent a burst of unpleasant noise – at a duration and volume set by the winner. In some of the subjects, nicotine was associated with a reduced tendency to retaliate, even after provocation by the ‘opponent’.

According to Gehricke, “Participants who showed nicotine-induced changes in anger task performance also showed changes in brain metabolism. Nicotine-induced reductions in length of retaliation were associated with changes in brain metabolism in response to nicotine in brain areas responsible for orienting, planning and processing of emotional stimuli”.

The authors say that their findings support the idea that people of an angry disposition are more susceptible to nicotine’s effects, and are therefore more likely to become addicted to cigarettes. They conclude, “Novel behavioral treatments that affect the cortical and limbic brain areas, like anger management training, may aid smoking cessation efforts in anger provoking situations that increase withdrawal and tobacco cravings”.

* Nicotine-induced brain metabolism associated with anger provocation. Jean G Gehricke, Steven G Potkin, Frances M Leslie, Sandra E Loughlin, Carol K Whalen, Larry D Jamner, James Mbogori and James H Fallon. Behavioral and Brain Functions (in press)


NSF Awards Millions For Cloud Computing Research

CLuE awards promote academic use of cluster computing resources on the IBM/Google cloud

Today, the National Science Foundation (NSF) announced it has awarded nearly $5 million in grants to 14 universities through its Cluster Exploratory (CLuE) program to participate in the IBM/Google Cloud Computing University Initiative. The initiative will provide the computing infrastructure for leading-edge research projects that could help us better understand our planet, our bodies, and pursue the limits of the World Wide Web.

In 2007, IBM and Google announced a joint university initiative to help computer science students gain the skills they need to build cloud applications. Now, NSF is using the same infrastructure and open source methods to award CLuE grants to universities around the United States. Through this program, universities will use software and services running on an IBM/Google cloud to explore innovative research ideas in data-intensive computing. These projects cover a range of activities that could lead not only to advances in computing research, but also to significant contributions in science and engineering more broadly.

NSF awarded Cluster Exploratory (CLuE) program grants to Carnegie-Mellon University, Florida International University, the Massachusetts Institute of Technology, Purdue University, University of California-Irvine, University of California-San Diego, University of California-Santa Barbara, University of Maryland, University of Massachusetts, University of Virginia, University of Washington, University of Wisconsin, University of Utah and Yale University.

“Academic researchers have expressed a need for access to massively scaled computing infrastructure to explore radically new approaches to solving data-intensive problems. These approaches would be unthinkable using ordinary computing resources available on campuses today,” said Jeannette Wing, NSF’s assistant director for computer and information science and engineering. “We are pleased to provide the awards to these fourteen universities, enabling researchers to engage with this emerging and novel model of computing.”

“IBM is intensely focused on applying technology and science to make the world work better,” said Willy Chiu, vice president, IBM Cloud Labs. “IBM is thrilled to power the groundbreaking studies taking place at these prestigious universities, and to help enable researchers and students around the world tackle some of the biggest problems of our time.”

“We’re pleased and excited that the CluE program will support a wide range of original research,” said Alfred Spector, Google’s vice president for research and special initiatives. “We’re looking forward to seeing the grantees solve challenging problems across various fields through creative applications of distributed computing.”

The universities will run a wide range of advanced projects and explore innovative research ideas in data-intensive computing, including advancements in image processing, comparative studies of large-scale data analysis, studies and improvements to the Internet, and human genome sequencing, among others, using software and services on the IBM/Google cloud infrastructure.

Carnegie-Mellon University

Researchers at Carnegie-Mellon University are using cloud computing to characterize the topicality of web content to more effectively process web searches. Routing searches topically requires less effort than traditional searches, enabling significant computational and financial savings. The project is using the Google/IBM cluster to “crawl” the web and perform the data cleansing and pre-processing necessary to develop a web dataset of 1 billion documents to support the research. The web dataset is also being made available to the larger information retrieval community to multiply the impact of the project on that discipline.

The second research project is focused on developing the Integrated Cluster Computing Architecture (INCA) for machine translation (using computers to translate from one language to another). Open-source toolkits make it easier for new research groups to tackle the problem at lower costs, broadening participation. Unfortunately, existing toolkits have not kept up with the computing infrastructure required for modern “big data” approaches to machine translations; INCA will fill this void.

Florida International University

Florida International University (FIU) researchers are leveraging cloud computing to analyze aerial images and objects to help support disaster mitigation and environmental protection. Specifically, the CluE effort at FIU relates to its TerraFly project, which is a web-service of 40 terabytes of aerial imagery, geospatial queries and local data. Students and researchers will now be able to precisely code these images in real-time.

Massachusetts Institute of Technology, University of Wisconsin-Madison and Yale University

These three universities are using the National Science Foundation CLuE grants for a comparative study of approaches to cluster-based, large-scale data analysis. Both MapReduce and parallel database systems provide scalable data processing over hundreds to thousands of nodes, yet it’s important for researchers to know the differences in performance and scalability of these two approaches to know which is more suitable when designing new data-intensive computing applications.
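The MapReduce side of this comparison can be illustrated with the canonical word-count example. The sketch below simulates the map, shuffle and reduce phases in plain Python, in-process rather than on a Hadoop cluster; the function names are illustrative, not taken from any of the projects described here:

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group intermediate values by key, as the framework would
    # before handing each key's values to a single reducer.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data needs big clusters", "data analysis at scale"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"], counts["data"])  # 2 2
```

A parallel database would express the same computation declaratively (roughly `SELECT word, COUNT(*) ... GROUP BY word`), which is exactly the kind of trade-off in programming model and scalability the comparative study examines.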

Purdue University

This project is investigating linguistic extensions to MapReduce abstractions for programming modern, large-scale systems, with special focus on applications that manipulate large, unstructured graphs. This will impact a broad class of scientific applications. Graphs have important utility in the social sciences (social networks), recommender systems, and business and finance (networks of transactions), among others. The specific case study targeted by the research is a comparative analysis of graph-structured biochemical networks and pathways which underlie many important problems in biology.

University of California-Irvine

In many applications, data-quality issues resulting from a variety of errors create inconsistencies in structures, representations or semantics. Simple spelling variations such as “Schwarzenegger” vs. “Schwarseneger,” “Brittany Spears” vs. “Britney Spears,” or “PO Box” vs. “P.O. Box” are an example of this. Dealing with these issues is becoming increasingly important as the volume of data being processed increases. This project is providing support for efficient fuzzy queries on large text repositories. Supporting fuzzy queries can ultimately help applications mitigate their data quality issues because entities with different representations can be matched.
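One standard way to support such fuzzy matching (a generic illustration, not necessarily the method this project uses) is to compare strings by edit distance, the minimum number of single-character insertions, deletions and substitutions needed to turn one string into another:

```python
def edit_distance(a: str, b: str) -> int:
    # Classic Levenshtein distance via dynamic programming,
    # keeping only the previous row of the DP table.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def fuzzy_match(query, entries, max_dist=3):
    # Return the entries within max_dist edits of the query.
    return [e for e in entries
            if edit_distance(query.lower(), e.lower()) <= max_dist]

names = ["Schwarzenegger", "Britney Spears", "P.O. Box"]
print(fuzzy_match("Schwarseneger", names))  # ['Schwarzenegger']
```

At web scale a naive pairwise scan like this is far too slow, which is why efficient fuzzy-query support on large repositories is a research problem in its own right.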

University of California-San Diego / San Diego Supercomputer Center

Researchers at the University of California, San Diego are studying how to manage and process massive spatial data sets on large-scale compute clusters. The specific test case is analysis of high-resolution topographic data sets from airborne LiDAR surveys, which are of great interest to many Earth scientists. Providing efficient access and analytic capabilities will have broad impact beyond the geosciences because the techniques are likely to be applicable to other types of large data sets.

University of California-Santa Barbara

Many of today’s data-intensive application domains, including searches on social networks like Facebook and protein matching in bioinformatics, require us to answer complex queries on highly-connected data. The UCSB Massive Graphs in Clusters (MAGIC) project is focused on developing software infrastructure that can efficiently answer queries on extremely large graph datasets. The MAGIC software will provide an easy to use interface for searching and analyzing data, and manage the processing of queries to efficiently take advantage of computing resources like large datacenters.

University of Maryland-College Park

The CluE initiative is funding another machine translation project that promises to bridge the language divide in today’s multi-cultural and multi-faceted society. Systems capable of converting text from one language into another have the potential to transform how diverse individuals and organizations communicate. By coupling network analysis with cross-language information retrieval techniques, the result is a richer, multilingual contextual model that will guide a machine translation system in translating different types of text. The potential broader impact of this project is no less than knowledge dissemination across language boundaries, which will serve to enrich the lives of all the world’s citizens.

A second project focuses on developing parallel algorithms for analyzing the next generation of sequencing data. Scientists can now generate the rough equivalent of an entire human genome in just a few days with one single sequencing instrument. The analysis of these data is complicated by their size – a single run of a sequencing instrument yields terabytes of information, often requiring a significant scale-up of the existing computational infrastructure needed for  analysis.

University of Massachusetts-Amherst

This project focuses on how researchers at the Center for Intelligent Information Retrieval (CIIR) are using the CluE infrastructure to learn more about word relationships. These relationships are not labeled explicitly in text and are quite varied; by exploiting these relationships, this project will help lead to a more effective ranking of web-retrieval results.

University of Virginia

Imagine continuously zooming into an image from your personal photo collection.  Unlike with modern image processing software, however, this zoom operation would reveal details missing from the original image. For example, zooming into someone’s shirt would eventually show a high-resolution image of the threads that compose it. A research team in the Department of Computer Science at the University of Virginia plans to develop techniques for intelligently enlarging a digital image that use a database of millions of on-line images to find examples of what its components look like at a higher spatial resolution.

University of Washington

Astrophysics is addressing many fundamental questions about the nature of the universe through a series of ambitious wide-field optical and infrared imaging surveys. New methodologies for analyzing and understanding petascale data sets are required to answer these questions. This research project is focused on developing new algorithms for indexing, accessing and analyzing astronomical images. This work is expected to have a broad range of applications to other data intensive fields.

University of Washington and University of Utah

This project is building a new infrastructure for computational oceanography that uses the CluE platform to allow ad hoc, longitudinal query and visualization of massive ocean simulation results at interactive speeds. This infrastructure leverages and extends two existing systems: GridFields, a library for general and efficient manipulation of simulation results; and VisTrails, a comprehensive platform for scientific workflow, collaboration, visualization, and provenance.

IBM/Google Cloud Computing University Initiative

IBM and Google are making the following resources available to these universities for their respective projects:

* A cluster of processors running an open source implementation of Google’s published computing infrastructure (MapReduce and GFS from Apache’s Hadoop project)

* A Creative Commons licensed university curriculum developed by Google and the University of Washington focusing on massively parallel computing techniques

* Open source software designed by IBM to help students develop programs for clusters running Hadoop. The software works with Eclipse, an open source development platform.

* Management, monitoring and dynamic resource provisioning by IBM using IBM Tivoli systems management software
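The resources above center on the MapReduce programming model from Apache’s Hadoop project. As a rough, hypothetical illustration of that style (plain Python standing in for Hadoop’s actual Java API), a word count can be split into a map phase that emits key-value pairs, a shuffle that groups values by key, and a reduce phase that aggregates each group:

```python
from collections import defaultdict

# Toy word count in the MapReduce style: map -> shuffle -> reduce.
# In Hadoop these phases run in parallel across a cluster; here they
# run sequentially over an in-memory list of "documents".

def map_phase(document):
    """Emit a (word, 1) pair for every word in one document."""
    for word in document.lower().split():
        yield (word, 1)

def shuffle(pairs):
    """Group all emitted values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Sum the counts collected for one word."""
    return (key, sum(values))

def word_count(documents):
    pairs = (pair for doc in documents for pair in map_phase(doc))
    groups = shuffle(pairs)
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(word_count(["the cat sat", "the cat ran"]))
```

In a real Hadoop job the map and reduce functions run in parallel across the cluster and the framework handles the shuffle; the sequential version above only sketches the data flow.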

Image Caption: A computer visualization of a river bed created using VisTrails, a system developed by University of Utah computer scientists to help scientists create high-quality visualizations. Under the CluE initiative, the University of Utah team will work with other computer scientists from the University of Washington to expand the capabilities of VisTrails and make it easier to visualize very large data sets. Credit: Juliana Freire and Claudio Silva, University of Utah

On the Net:

Scientists Create Elusive Molecule In Lab Experiments

Scientists say the Rydberg molecule, a molecule that until now existed only in theory, has finally been created, BBC News reported.

The molecule is formed through an elusive and extremely weak chemical bond between two atoms.

Researchers reported in the journal Nature that the new type of bonding occurs when one of the two atoms in the molecule has an electron very far from its nucleus or center.

Nobel prize-winning physicist Enrico Fermi developed fundamental quantum theories about how electrons behave and interact. Experts say the new molecule discovery reinforces Fermi’s ideas.

The scientists formed the Rydberg molecules in question from two atoms of rubidium – one a Rydberg atom, and one a “normal” atom. The movement and position of electrons within an atom can be described as orbiting around a central nucleus, with each shell of orbiting electrons further from the center.

But what makes a Rydberg atom unique is that it has one electron alone in an outermost orbit, which is, atomically, very far from its nucleus.

Fermi predicted back in 1934 that if another atom were to “find” that lone, wandering electron, it might interact with it.

However, Chris Greene, a theoretical physicist from the University of Colorado who first predicted that Rydberg molecules could exist, said Fermi never imagined that molecules could be formed.

“We recognized, in our work in the 1970s and 80s, the potential for a sort of forcefield between a Rydberg atom and a groundstate [or normal] atom. It’s only now that you can get systems so cold, that you can actually make them,” Greene said.

Vera Bendkowsky from the University of Stuttgart, who led the research, explained that unimaginably cold temperatures are needed to create the molecules and the nuclei of the atoms have to be at the correct distance from each other for the electron fields to find each other and interact.

“We use an ultracold cloud of rubidium – as you cool it, the atoms in the gas move closer together,” she said.

She said that with temperatures very close to absolute zero, a “critical distance” of about 100nm (nanometers – 1nm = one millionth of a millimeter) between the atoms is reached. When one is a Rydberg atom, the two atoms form a Rydberg molecule. This 100nm gap is vast compared to ordinary molecules.

Greene likened the Rydberg electron to a sheepdog that keeps its flock together by roaming speedily to the outermost periphery of the flock, nudging back towards the center any member that might begin to drift away.

But it requires energy to push the electron out to its lonely periphery in order to make a Rydberg atom.

Bendkowsky said the team excites atoms to the Rydberg state with a laser; if the gas is at the critical density, with two atoms at the correct distance from each other, exciting one of them is enough to form the molecule.

The longest-lived Rydberg molecule survives for just 18 microseconds, the researchers said, but the fact that the molecules can be made and observed at all confirms long-held fundamental atomic theories.

Helen Fielding, a physical chemist from University College London, called it a “very exciting set of experiments,” adding that it shows that this approach is feasible.

“It will be interesting to see what other fundamental physics we’ll be able to test with it,” she said.

A Nobel prize-winning piece of physics research by Indian physicist Satyendra Nath Bose was the catalyst for Professor Greene’s prediction that Rydberg molecules could exist.

Bose sent some theoretical calculations about particles to Albert Einstein in 1924 and Einstein made a prediction that if a gas was cooled to a very low temperature, the atoms would all suddenly collapse into their “lowest possible energy state”, so they would be almost frozen and behave in an identical and predictable way.

Experts say this is analogous to when a gas suddenly condenses into drops of liquid.

Greene realized that when scientists reached the goal of Bose-Einstein condensation, by cooling and trapping alkali atoms, ultracold physics could be used to form molecules that simply would not exist in any other conditions.

On the Net:

Replicated Brain Closer To Thought

A replication of a small portion of the brain, based on careful construction of each molecule, has been developed and has successfully reproduced experimental results from actual brains, BBC News reported.

By placing the so-called “Blue Brain” in a virtual body, signs of molecular and neural origins of thought and memory can be observed.

The director leading the study stated that scaling the recreation of the human brain is simply a matter of money.  “It’s not a question of years, it’s one of dollars. The psychology is there today and the technology is there today. It’s a matter of if society wants this. If they want it in 10 years, they’ll have it in 10 years. If they want it in 1000 years, we can wait.”

The Blue Brain venture was unveiled at the European Future Technologies meeting in Prague in 2005. It is considered the most ambitious brain-recreation effort yet undertaken.

The Blue Brain project is unique in that it was designed to reverse-engineer mammalian brains from actual laboratory data and to establish a computer model down to the level of the molecules that make them up. Many other computer simulation efforts have tried to code in “brain-like” computation or to imitate portions of the nervous systems and brains of various animals, but none has replicated the distinctive features of the Blue Brain project.

In the first part of the venture, researchers completed a model of the neocortical column, which supports higher brain function and thought. 

Director of the Blue Brain project, Henry Markram, commented, “The thing about the neocortical column is that you can think of it as an isolated processor. It is very much the same from mouse to man – it gets a bit larger, a bit wider in humans, but the circuit diagram is very similar.” Markram also founded the Brain Mind Institute in Switzerland.

When evolution discovered this “mammalian secret,” it duplicated the circuit numerous times and then “used it as it needed more and more functionality,” he commented further.

At the Science Beyond Fiction conference, Professor Markram informed attendees that the column is being developed into a virtual reality agent, or in other words, a simulated animal in a simulated environment for the purpose of enabling researchers to monitor the specific activities in the column as the animal moves about the space.

“It starts to learn things and starts to remember things. We can actually see when it retrieves a memory, and where they retrieved it from because we can trace back every activity of every molecule, every cell, every connection and see how the memory was formed.”

In the second phase of the project, an improved version of the IBM Blue Gene supercomputer is being used; an earlier version of the machine supported the research to date. “The next phase is beginning with a ‘molecularization’ process: we add in all the molecules and biochemical pathways to move toward gene expression and gene networks. We couldn’t do that on our first supercomputer.”

Professor Markram expects that within 10 to 20 years the project will incorporate many components of medicine, including the genomic profile, eventually establishing a wide-ranging database for “personalized medicine.”

This approach would enable researchers to imitate how an individual might react to a specific drug or treatment.

The goal of the conference is to encourage high-risk, multidisciplinary research in information and communication technologies (ICT), so researchers of wide-ranging backgrounds, from computer scientists to biologists, flock to the meeting. Naturally, a gathering of so many brilliant minds invites disagreement, and some researchers believe the loftier goals of the Blue Brain project are unachievable.

Wolfgang Wahlster of the German Research Center for Artificial Intelligence, and a chief German government scientific adviser on ICT, believes that the reductionist approach of the endeavor is flawed.

“Imagine you could follow in one of the most advanced Pentium chips today what each and every transistor is doing right now,” he said in a statement.

“Then I ask, ‘What is happening? Is Word running? Are you doing a Google search?’ You couldn’t answer. Looking at this level you cannot figure it out.

“This is very interesting research and I’m not criticizing it, but it doesn’t help us in computer science in having the intelligent behavior of humans replicated.”

By building up from one neocortical column to the entire neocortex, Professor Markram asserts, the elusive “emergent properties” that characterize human thought will gradually reveal themselves.

“They are not things that are easily predicted by just knowing elements – by definition – but by putting them together you can explore the principles, where they came from. Basically that’s what we’re after: understanding the principles of emergent properties.”

Everything that is core to being human, from the spatial awareness of lower mammals to political opinion and artistic expression in humans, derives from these emergent properties.

On the Net:

Chandra Eyes Impact Of Galaxy Jet

A survey by the Chandra X-ray observatory has revealed in detail, for the first time, the effects of a shock wave blasted through a galaxy by powerful jets of plasma emanating from a supermassive black hole at the galactic core. The observations of Centaurus A, the nearest galaxy that contains these jets, have enabled astronomers to revise dramatically their picture of how jets affect the galaxies in which they live. The results will be presented on Wednesday 22nd April at the European Week of Astronomy and Space Science in Hatfield by Dr Judith Croston of the University of Hertfordshire.

A team led by Dr Croston and Dr Ralph Kraft, of the Harvard-Smithsonian Center for Astrophysics in the USA, used very deep X-ray observations from Chandra to get a new view of the jets in Centaurus A. The jets inflate large bubbles filled with energetic particles, driving a shock wave through the stars and gas of the surrounding galaxy. By analyzing in detail the X-ray emission produced where the supersonically expanding bubble collides with the surrounding galaxy, the team were able to show for the first time that particles are being accelerated to very high energies at the shock front, causing them to produce intense X-ray and gamma-ray radiation. Very high-energy gamma-ray radiation was recently detected from Centaurus A for the first time by another team of researchers using the High Energy Stereoscopic System (HESS) telescope in Namibia.

“Although we expect that galaxies with these shock waves are common in the Universe, Centaurus A is the only one close enough to study in such detail,” said Dr Croston. “By understanding the impact that the jet has on the galaxy, its gas and stars, we can hope to understand how important the shock waves are for the life cycles of other, more distant galaxies.”

The powerful jets are found in only a small fraction of galaxies but are most common in the largest galaxies, which are thought to have the biggest black holes. The jets are believed to be produced near to a central supermassive black hole, and travel close to the speed of light for distances of up to hundreds of thousands of light years. Recent progress in understanding how galaxies evolve suggests that these jet-driven bubbles, called radio lobes, may play an important part in the life cycle of the largest galaxies in the Universe.

Energetic particles from radio galaxies may also reach us directly as cosmic rays hitting the Earth’s atmosphere. Centaurus A is thought to produce many of the highest energy cosmic rays that arrive at the Earth. The team believe that their results are important for understanding how such high-energy particles are produced in galaxies as well as for understanding how massive galaxies evolve.

The results of this research will be published in a forthcoming issue of the Monthly Notices of the Royal Astronomical Society.

Image Caption: The image shows in red the X-ray emission produced by high-energy particles accelerated at the shock front where Centaurus A’s expanding radio lobe (shown in blue) collides with the surrounding galaxy. (In the top-left corner, X-ray emission from close to the central black hole, and from the X-ray jet extending in the opposite direction, can also be seen.)

On the Net:

Distant Signs Of Water In The Universe Detected

Astronomers have found the most distant signs of water in the Universe to date. Dr John McKean of the Netherlands Institute for Radio Astronomy (ASTRON) will be presenting the discovery at the European Week of Astronomy and Space Science in Hatfield on Wednesday 22nd April.

The water vapor is thought to be contained in a jet ejected from a supermassive black hole at the centre of a galaxy, named MG J0414+0534. The water emission is seen as a maser, where molecules in the gas amplify and emit beams of microwave radiation in much the same way as a laser emits beams of light. The faint signal is only detectable by using a technique called gravitational lensing, where the gravity of a massive galaxy in the foreground acts as a cosmic telescope, bending and magnifying light from the distant galaxy to make a clover-leaf pattern of four images of MG J0414+0534. The water maser was only detectable in the brightest two of these images.

Dr McKean said, “We have been observing the water maser every month since the detection and seen a steady signal with no apparent change in the velocity of the water vapor in the data we’ve obtained so far. This backs up our prediction that the water is found in the jet from the supermassive black hole, rather than the rotating disc of gas that surrounds it.”

The radiation from the water maser was emitted when the Universe was only about 2.5 billion years old, a fifth of its current age.

“The radiation that we detected has taken 11.1 billion years to reach the Earth. However, because the Universe has expanded like an inflating balloon in that time, stretching out the distances between points, the galaxy in which the water was detected is about 19.8 billion light years away,” explained Dr McKean.
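Dr McKean’s figures can be roughly reproduced with a standard cosmological calculation. The sketch below is illustrative only: it assumes a flat ΛCDM cosmology with H0 ≈ 70 km/s/Mpc, Ωm ≈ 0.3 and ΩΛ ≈ 0.7, and a redshift of z ≈ 2.64 (chosen to match the quoted 11.1-billion-year light travel time); none of these values appear in the article.

```python
import math

# Assumed flat LambdaCDM parameters (not stated in the article).
H0 = 70.0                 # Hubble constant, km/s/Mpc
OMEGA_M, OMEGA_L = 0.3, 0.7
MPC_KM = 3.0857e19        # kilometers in a megaparsec
GYR_S = 3.156e16          # seconds in a gigayear

HUBBLE_TIME_GYR = (MPC_KM / H0) / GYR_S  # 1/H0 in Gyr (~13.97)
HUBBLE_DIST_GLY = HUBBLE_TIME_GYR        # c/H0 in Gly equals 1/H0 in Gyr

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat LambdaCDM."""
    return math.sqrt(OMEGA_M * (1.0 + z) ** 3 + OMEGA_L)

def integrate(f, a, b, n=100000):
    """Composite trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return total * h

def lookback_time_gyr(z):
    """Light travel time to redshift z, in billions of years."""
    return HUBBLE_TIME_GYR * integrate(lambda x: 1.0 / ((1 + x) * E(x)), 0.0, z)

def comoving_distance_gly(z):
    """Present-day (comoving) distance to redshift z, in billions of light years."""
    return HUBBLE_DIST_GLY * integrate(lambda x: 1.0 / E(x), 0.0, z)

z = 2.64
print(f"light travel time: {lookback_time_gyr(z):.1f} Gyr")    # ~11.0, near the quoted 11.1
print(f"comoving distance: {comoving_distance_gly(z):.1f} Gly")  # ~19.5, near the quoted 19.8
```

The small gap between these outputs and the quoted 11.1 billion years and 19.8 billion light years comes from the assumed parameters; the team’s exact cosmology and redshift would close it.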

Although since the initial discovery the team has looked at five more systems that have not had water masers, they believe that it is likely that there are many more similar systems in the early Universe. Surveys of nearby galaxies have found that only about 5% have powerful water masers associated with active galactic nuclei. In addition, studies show that very powerful water masers are extremely rare compared to their less luminous counterparts. The water maser in MG J0414+0534 is about 10 000 times the luminosity of the Sun, which means that if water masers were equally rare in the early Universe, the chances of making this discovery would be improbably slight.

“We found a signal from a really powerful water maser in the first system that we looked at using the gravitational lensing technique. From what we know about the abundance of water masers locally, we could calculate the probability of finding a water maser as powerful as the one in MG J0414+0534 to be one in a million from a single observation. This means that the abundance of powerful water masers must be much higher in the distant Universe than found locally because I’m sure we are just not that lucky!” said Dr McKean.

The discovery of the water maser was made by a team led by Dr Violette Impellizzeri using the 100-metre Effelsberg radio telescope in Germany during July to September 2007. The discovery was confirmed by observations with the Expanded Very Large Array in the USA in September and October 2007. The team included Alan Roy, Christian Henkel and Andreas Brunthaler, from the Max Planck Institute for Radio Astronomy, Paola Castangia from Cagliari Observatory and Olaf Wucknitz from the Argelander Institute for Astronomy at Bonn University. The findings were published in Nature in December 2008.

The team is now analyzing high-resolution data to find out how close the water maser lies to the supermassive black hole, which will give them new insights into the structure at the centre of active galaxies in the early Universe.

“This detection of water in the early Universe may mean that there is a higher abundance of dust and gas around the super-massive black hole at these epochs, or it may be because the black holes are more active, leading to the emission of more powerful jets that can stimulate the emission of water masers. We certainly know that the water vapor must be very hot and dense for us to observe a maser, so right now we are trying to establish what mechanism caused the gas to be so dense,” said Dr McKean.

Image Caption: The image is made from HST data and shows the four lensed images of the dusty red quasar, connected by a gravitational arc of the quasar host galaxy. The lensing galaxy is seen in the centre, between the four lensed images. Credit: John McKean/HST Archive data

On the Net:

Hubble Reveals Formation Of First Massive Galaxies

First results from the GOODS NICMOS survey, the largest Hubble Space Telescope program ever led from outside of the United States, reveal how the most massive galaxies in the early Universe assembled to form the most massive objects in the Universe today. Dr Chris Conselice from the University of Nottingham will present the results at the European Week of Astronomy and Space Science at the University of Hertfordshire on Wednesday 22nd April.

The observations are part of the Great Observatories Origins Deep Survey (GOODS), a campaign that is using NASA’s Spitzer, Hubble and Chandra space telescopes together with ESA’s XMM Newton X-ray observatory to study the most distant Universe. A team of scientists from six countries used the NICMOS near infrared camera on the Hubble Space Telescope to carry out the deepest ever survey of its type at near infrared wavelengths. Early results show that the most massive galaxies, which have masses roughly 10 times larger than the Milky Way, were involved in significant levels of galaxy mergers and interactions when the Universe was just 2-3 billion years old.

“As almost all of these massive galaxies are invisible in the optical wavelengths, this is the first time that most of them have been observed,” said Dr Conselice, who is the Principal Investigator for the survey. “To assess the level of interaction and mergers between the massive galaxies, we searched for galaxies in pairs, close enough to each other to merge within a given time-scale. While the galaxies are very massive and at first sight may appear fully formed, the results show that they have experienced an average of two significant merging events during their life-times.”

The GOODS NICMOS results show that these galaxies did not form in a simple collapse in the early universe, but that their formation is more gradual over the course of the Universe’s evolution, taking about 5 billion years.

Dr Conselice said, “The findings support a basic prediction of the dominant model of the Universe, known as Cold Dark Matter, so they reveal not only how the most massive galaxies are forming, but also that the model that’s been developed to describe the Universe, based on the distribution of galaxies that we’ve observed overall, applies in its basic form to galaxy formation.”

The preliminary results are based on a paper led by PhD student Asa Bluck at the University of Nottingham.

Image Caption: NICMOS Image of the GOODS North field. Credit: C Conselice, A Bluck, GOODS NICMOS Team.

On the Net:

Charged Dust From Inside Enceladus

A team of planetary scientists working on the NASA/ESA/ASI Cassini-Huygens mission has discovered tiny, charged icy particles in the plume from Saturn’s moon Enceladus that offer a tantalizing glimpse of the interior of this enigmatic world. Dr Geraint Jones and Dr Chris Arridge, both from University College London’s Mullard Space Science Laboratory, will present the results on behalf of the Cassini Plasma Spectrometer (CAPS) instrument team on Wednesday 22nd April at the European Week of Astronomy and Space Science conference at the University of Hertfordshire.

Cassini has been exploring Saturn and its moons since 2004. Five hundred kilometer (300 mile)-wide Enceladus, discovered from Slough in 1789 by William Herschel, has been found by Cassini’s suite of instruments to possess active jets near its southern pole that spew gas and water out into space over thousands of kilometers. During two particularly close flybys of the moon in 2008, skimming only 52 and 25 km from the surface at around 15 km per second (54000 km per hour), the CAPS instrument on the spacecraft was pointed to scoop up gas as it zoomed through the plume.

The CAPS instrument is designed to detect charged gas (plasma), but its measurements in the plume revealed a surprise: the instrument also detected tiny ice grains whose signatures could only be present if they were electrically charged. These grains, probably only measuring a few nanometers across (billionths of a meter – 50 000 times thinner than a human hair), fall into a size range between gas atoms and much larger ice grains, both of which were sampled directly during previous Enceladus flybys. The particles have both positive and negative electrical charges, and the mix of the charges varied as the Cassini spacecraft crossed the plume.

Jones and Arridge suggest that the grains may be charged through so-called triboelectric processes, through bumping together in the vent below Enceladus’s surface before they emerge into the plume. This provides important hints to the conditions in the vents, and in turn may help with understanding conditions in the interior.

Drs Jones and Arridge are intrigued by what their discovery reveals about Enceladus: “What are particularly fascinating are the bursts of dust that CAPS detects when Cassini passes through the individual jets in the plume” says Jones. “Each jet is split according to charge though”, adds Arridge, “Negative grains are on one side, and positive ones on the other”.

As Arridge will explain in his presentation, as these charged grains travel away from Enceladus, their paths are bent by electric and magnetic fields in Saturn’s giant magnetosphere. In this way Saturn’s magnetosphere acts as an enormous mass spectrometer for the plume particles, allowing scientists to constrain their masses. Arridge has begun modeling the paths of these newly-discovered particles.

Ionized gas (plasma) in Saturn’s magnetosphere flows past Enceladus at over 80000 km per hour. Arridge’s results show that for this enormous mass spectrometer to work and for these dust particles to reach Cassini, this river of plasma must be significantly slowed down, in and near the plume, to speeds of less than 3200 km per hour. This slowing of the plasma is a result of the plume injecting particles into the plasma stream – making the whole flow slow down in a similar effect to when cars join a busy motorway. These new results provide further evidence that the material in the Enceladus plume has a huge influence on the moon’s surroundings.

Future Cassini flybys will help further understand the processes that occur at Enceladus and in its vicinity. William Herschel could not have suspected that the tiny point of light that he found in 1789 would turn out to be such an exotic place.

Image Caption: Observations from the Cassini Plasma Spectrometer (CAPS) made during the Cassini flyby of Enceladus on 12th March 2008, superimposed on Cassini’s path. As the spacecraft passed the moon, CAPS detected streams of charged particles in individual jets within the plume; negative particles are shown in this view. Each ribbon in the image gives an indication of the measured particle energy per charge: high energy particle fluxes are shown nearest Enceladus, and lower energy particles are farthest. The red points marked on Enceladus show the locations of known jet sources found by other Cassini instruments. Credit: MSSL-UCL

On the Net:

Astronomers Find ‘Garden Hose’ Jet Trail Nebula

Using the NASA Rossi X-ray Timing Explorer (RXTE) satellite, a team of astronomers have discovered an object predicted, but never seen before – a ‘jet trail’ nebula. Team leader Dr Klaas Wiersema of the University of Leicester will present the discovery on Wednesday 22nd April at the European Week of Astronomy and Space Science conference at the University of Hertfordshire.

The RXTE satellite has been scanning the centre of our galaxy every few days for the last few years, searching for variable X-ray sources. Through these scans it has found a multitude of varying X-ray sources, most of which are thought to be X-ray binaries. These systems consist of a compact star (a neutron star or black hole) that pulls material away from a “normal” companion star. This material forms hot disks, which emit X-rays. X-ray binaries are also known to spout jets of gas at velocities very close to the speed of light.

While most of them are highly variable in intensity, there is also a subclass found by RXTE which is nearly constant in brightness and rather faint. It is this class of sources that Dr Wiersema and his team set out to study. They obtained accurate positions of the X-ray sources using the NASA Chandra X-ray space telescope and used the European Southern Observatory’s 3.6-m telescope at La Silla in Chile to search for the corresponding optical signals. The sources were then confirmed as X-ray binaries.

But one of these sources surprised the team. In addition to a faint optical source, a large, bright nebula (a cloud of gas and dust) was visible on the optical images. This nebula consists of two stripes, and is like no other nebula seen before – it is a completely new class of object.

Careful measurements of the shape of the nebula helped the team to understand the origin of the nebula: it appears to be made by the powerful jets of the X-ray binary. The jets of the binary slam into the interstellar medium (ISM – the tenuous gas between the stars), where they make the gas radiate. As the binary moves rapidly through the galaxy, the jet-ISM interaction points move with it, creating the so-called “jet trails” we see in the image.

These trails had been predicted by theorists in the past, but despite searches were not seen before in other sources, as they require a rare set of circumstances to form: the X-ray binary has to move very rapidly (in this case about 100 km per second across the line of sight), and the interstellar medium has to be denser than normal.

Dr Wiersema compares the nebula pattern to garden hoses on soil. “Imagine holding two powerful hoses, pointing to the ground. Where the water hits the ground, mud splashes up. If you stand still, a large circular patch of mud would form and slowly spread out. But if you walk quickly across the garden, you make two parallel stripes of mud. The jets from the X-ray binary make the nebula in the same way.”

The accidental discovery of this nebula gives astronomers a powerful new tool to help them understand how X-ray binaries live their life. The power of the jet now and in the past can be derived from the shape and brightness of the nebula and shapes a new view of the way X-ray binaries produce these jets.

Image Caption: The fan-like nebulosity is clearly visible in this image from the ESO 3.6-m telescope. The powerful jets emitted by the X-ray binary (which itself is too faint to see in this image) crash into the interstellar medium. Because this X-ray binary is moving quickly through space, it has a fast proper motion and drags these “impact points” along with it. This leaves two long “trails” behind: the two stripes of emission seen running diagonally across the image. Credit: K. Wiersema / ESO / University of Leicester

On the Net:

A (Less)er Challenge To Galaxy Formation

An international team of astronomers using a new submillimeter camera has discovered more than a hundred dusty galaxies in the early Universe, each of which is in the throes of an intense burst of star formation. One of these galaxies is an example of a rare class of starburst, seen just 1 billion years after the Big Bang. In her presentation on Wednesday 22nd April at the European Week of Astronomy and Space Science conference, team leader Dr. Kristen Coppin of Durham University will discuss the new results and how they may present a direct challenge to our current ideas of how galaxies formed.

The team (known as the LESS collaboration) used the new Large Apex Bolometer Camera (LABOCA) on the Atacama Pathfinder Experiment (APEX) telescope sited in the Atacama Desert in Chile to make a map of the distant galaxies in a region of the sky called the Extended Chandra Deep Field South. These galaxies are so far away that we see them as they appeared billions of years ago. LABOCA is sensitive to light at wavelengths just below 1mm (submillimeter radiation), and is able to find very dusty and very luminous galaxies at very early times in the history of the Universe. These submillimeter galaxies represent massive bursts of star formation associated with the early formation of some of the most massive galaxies in the present-day Universe: giant elliptical galaxies.

For many years it has been thought that these giant elliptical galaxies formed most of their stars at very early times in the Universe, within the first billion years after the Big Bang. However, very few examples of these very distant and very bright dusty sources have been found in submillimeter surveys, until the LESS collaboration completed their survey of a Full Moon-sized patch of sky in the southern hemisphere constellation of Fornax.  Their survey is the largest and deepest of its kind in submillimeter radiation and reveals over a hundred galaxies that are forming stars at a prodigious rate.

Working with their new map, the team identified one of the submillimeter sources as being associated with a star forming galaxy which is seen just 1 billion years after the Big Bang. This remarkable galaxy shows the signatures of both intense star formation and obscured black hole growth when the Universe was only 10 percent of its current age. Dr. Coppin and the LESS team suggest that there could be far more submillimeter galaxies lurking at these early times than had previously been thought.  Dr Coppin comments, “The discovery of a larger number of such active galaxies at such an early time would be at odds with current galaxy formation models”.

Image Caption: The image shows the most distant submillimeter galaxy discovered by the LESS collaboration. The main image shows a wide 3-color optical image from the NASA/ESA Hubble Space Telescope (HST), overlaid by the contour map from the LESS survey made using the LABOCA submillimeter camera. Higher resolution radio and NASA Spitzer Space Telescope mid-infrared data have pinpointed the source of the submillimeter emission to the optical galaxy indicated by a box. The observed energy output of the galaxy measured as a function of wavelength is plotted in the inset, showing that most of the energy is being emitted in the far-infrared and submillimeter from dust-reprocessed starlight. Using spectroscopic data from the Keck telescope on Hawaii and the ESO VLT in Chile, the light travel time to the object is 12 billion years, meaning we are seeing it as it was just over 1 billion years after the Big Bang. Credit: K. Coppin / the LESS collaboration
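The caption's conversion from a 12-billion-year light travel time to "just over 1 billion years after the Big Bang" follows from the lookback-time integral of a standard flat Lambda-CDM cosmology. A minimal sketch of that calculation (the cosmological parameters and the redshift of z ≈ 4.76 are illustrative assumptions, not figures quoted in the article):

```python
import math

# Assumed flat Lambda-CDM parameters (illustrative, not from the article)
H0 = 70.0                # Hubble constant, km/s/Mpc
OMEGA_M = 0.3            # matter density parameter
OMEGA_L = 0.7            # dark-energy density parameter
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
S_PER_GYR = 3.156e16     # seconds in one gigayear

def lookback_time_gyr(z, steps=20000):
    """Midpoint-rule integration of t_lb = integral_0^z dz' / ((1+z') H(z'))."""
    h0 = H0 / KM_PER_MPC          # H0 in 1/s
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz
        e_z = math.sqrt(OMEGA_M * (1.0 + zp) ** 3 + OMEGA_L)
        total += dz / ((1.0 + zp) * h0 * e_z)
    return total / S_PER_GYR

# At an assumed z ~ 4.76, the light travel time comes out near 12 billion
# years, leaving the galaxy a bit over 1 billion years after the Big Bang.
t_lb = lookback_time_gyr(4.76)
age = lookback_time_gyr(1000.0)   # essentially the age of the Universe
```

Subtracting the lookback time from the total age of the Universe (about 13.5 billion years in this cosmology) recovers the "just over 1 billion years" figure.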

On the Net:

Satellite Galaxies Challenge Newtonian Model

The high speed of stars and the apparent presence of ‘dark matter’ in the satellite galaxies that orbit our Milky Way Galaxy present a direct challenge to Newton’s theory of gravitation, according to physicists from Germany, Austria and Australia. Professor Pavel Kroupa of the University of Bonn’s Argelander-Institut fuer Astronomie (AlfA) will discuss the results of the team’s two studies in a presentation on Wednesday 22nd April at the European Week of Astronomy and Space Science conference at the University of Hertfordshire.

Together with scientists at the University of Vienna and the Australian National University in Canberra, the AlfA team looked at the small dwarf galaxies that orbit the Milky Way. Some of these contain only a few thousand stars and so are relatively faint and difficult to find. Standard cosmological models predict the presence of hundreds of these companions around most of the larger galaxies, but up to now only 30 have been observed around the Milky Way.

The team of scientists looked at the distribution of these satellite dwarf galaxies and discovered they were not where they should be. “There is something odd about their distribution”, explains Professor Kroupa. “They should be uniformly arranged around the Milky Way, but this is not what we found.” The astronomers discovered that the eleven brightest of the dwarf galaxies lie more or less in the same plane – in a kind of disk shape – and that they revolve in the same direction around the Milky Way (in the same way as planets in the Solar System revolve around the Sun).

Professor Kroupa and the other physicists believe that this can only be explained if today’s satellite galaxies were created by ancient collisions between young galaxies. Team member and former colleague Dr Manuel Metz, now at the Deutsches Zentrum fuer Luft- und Raumfahrt, also worked on the study. “Fragments from early collisions can form the revolving dwarf galaxies we see today,” comments Dr Metz. But he adds that this introduces a paradox. “Calculations suggest that the dwarf satellites cannot contain any dark matter if they were created in this way. But this directly contradicts other evidence. Unless dark matter is present, the stars in the galaxies are moving around much faster than predicted by Newton’s standard theory of gravitation.”

Dr Metz continues, “The only solution is to reject Newton’s theory. If we live in a Universe where a modified law of gravitation applies, then our observations would be explainable without dark matter.”

With this evidence, the team share the convictions of a number of groups around the world who believe that some of the fundamental principles of physics have been incorrectly understood. If their ideas are correct, it will not be the first time that Newton’s theory of gravitation has been modified. In the 20th century it happened when Einstein introduced his Special and General Theories of Relativity and again when quantum mechanics was developed to explain physics on sub-atomic scales. The anomalies detected by Dr. Metz and Professor Kroupa and their collaborators imply that where weak accelerations predominate, a ‘modified Newtonian dynamics’ may have to be used. If the scientists are right then this has far-reaching consequences for our understanding of the Universe we live in.
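In the modified Newtonian dynamics (MOND) proposal the article alludes to, Newtonian gravity is altered below Milgrom's acceleration scale a0 ≈ 1.2 × 10⁻¹⁰ m/s²: in the deep weak-acceleration regime the circular speed tends to a flat value given by v⁴ = G·M·a0, which exceeds the Newtonian prediction from visible mass alone. A rough sketch of the comparison (the galaxy mass and orbital radius are illustrative assumptions, not numbers from the two studies):

```python
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10          # Milgrom's acceleration scale, m/s^2
M_SUN = 1.989e30      # solar mass, kg
M_PER_KPC = 3.086e19  # meters in one kiloparsec

def newtonian_speed(mass_kg, radius_m):
    """Newtonian circular speed from visible mass alone: v^2 = G M / r."""
    return math.sqrt(G * mass_kg / radius_m)

def mond_flat_speed(mass_kg):
    """Deep-MOND asymptotic rotation speed: v^4 = G M a0 (radius-independent)."""
    return (G * mass_kg * A0) ** 0.25

# Illustrative (assumed) numbers: ~1e11 solar masses of visible matter,
# an orbit at 30 kpc, well inside the weak-acceleration regime.
M = 1e11 * M_SUN
R = 30.0 * M_PER_KPC

v_newton = newtonian_speed(M, R) / 1000.0  # km/s
v_mond = mond_flat_speed(M) / 1000.0       # km/s
```

For these assumed values the MOND speed comes out well above the Newtonian one, illustrating why observed stellar speeds that outrun Newtonian predictions can be read either as evidence for dark matter or for a modified law of gravitation.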

The two studies will appear in papers in Monthly Notices of the Royal Astronomical Society and the Astrophysical Journal.

Image Caption: An image of the Draco satellite dwarf galaxy. Credit: Mischa Schirmer, University of Bonn

On the Net:

A New Hope For Bio-Fuels?

Scientists at the University of California, San Francisco have discovered a potentially revolutionary new way of creating gasoline without the use of food crops, and without the pesky necessity of spending millions of years buried beneath the earth.

Using a peculiar microbe discovered in a French garbage dump combined with the wonders of modern synthetic biology, a team of researchers has developed a chemical process capable of converting carbon-based biomass into a gas that can be processed to produce gasoline.

The chemical conversion process is able to use a variety of inexpensive substrates, such as the agricultural waste left over from corn and sugar cane harvesting, including leaves and stalks.

The end product is gasoline that the researchers claim is chemically identical to that derived from fossil-fuel sources in petroleum refineries around the world.

“You could fill your car up with it right now, so there’s no difference in engine technology or anything like that,” said Chris Voigt, chief researcher for the project.

Voigt believes that the U.S. could potentially turn to this new breed of bio-fuels as a security net in the event of unstable oil prices on the world market.

“Then, if the sugar price goes high and the oil price goes down, you could flip it and the consumer would not know any difference,” he added, a flexibility not available with ethanol.

In recent years, opponents of the ethanol movement, the other bio-fuel, have leveled the double-edged criticism that production of the corn-based fuel drives up global food prices and is not environmentally friendly, despite its green reputation.

Both are criticisms that would be circumvented by the new fuel-production technology, since it utilizes essentially useless plant waste products that are typically discarded by farmers.

Voigt estimates that with advancements in the efficiency of the conversion process and use of genetically modified plants, gasoline could eventually be produced for as little as $1.65 a gallon from sugar cane bagasse.

He also believes that fuel derived from cellulosic materials such as poplar trees could be even cheaper to produce, at roughly $1.10 to $1.30 a gallon. The problem, however, lies in creating a sustainable and profitable model for growing the necessary number of trees, which require a substantially longer growth period before they can be harvested.

For years, scientists have tried and “failed miserably” to find an enzyme that could efficiently break down cellulose, the main component of plants’ rigid cell walls, in hopes of creating cheap bio-fuels, said Voigt.

“So we started looking at organisms that can do that naturally,” he explained.  “We then found this one that we realized was unique.”

Voigt’s team utilized a novel species of bacterium first discovered in a French garbage dump in the 1980s. They then teamed the bacterium up with common yeast cells. When the two are placed together on a plant-based substrate like switchgrass, the chemistry is amazing.

The bacterium first consumes the vegetation and produces the chemical acetate as a metabolic side-product.  The yeast then feeds on the acetate, which it in turn converts into methyl halides, a family of molecules traditionally used as agricultural fumigants.

The methyl halides are then released as a gas that scientists can easily collect and convert to gasoline.

According to Voigt, by simply substituting a different catalyst the methyl halides can also be converted into other useful chemicals, such as the ethylene used in plastic bags.

The group’s results are sure to reinvigorate hopes for the future of truly green bio-fuels.

Voigt estimates that the first large-scale pilot production facility for methyl halide conversion could be ready for production in as little as three years.

On the Net: