Invasive Frogs In Florida Floated From Cuba

Amphibian experts have long wrangled over the origins of the Greenhouse frog (Eleutherodactylus planirostris) and the Cuban treefrog (Osteopilus septentrionalis). The two invasive frog species hopping their way through Florida probably got to the state by hitching a ride on floating debris from Cuba, according to a study published on Wednesday, AFP reports.

The two species are widespread across the Caribbean, but were first spotted in the Florida Keys — the island chain that starts at Florida’s southeastern tip — in the mid-1800s. Years later, both became firmly established on the mainland and embarked on a relentless advance.

Today, the Greenhouse frog has established colonies as far north as Alabama, while the Cuban treefrog can be found all around the southern Florida coastline.

Scientists led by Blair Hedges at Pennsylvania State University analyzed the frogs’ DNA to identify the amphibians’ closest native relatives, teasing out clues about this unusual migration.

The Greenhouse frog has been shown to originate from a small area of western Cuba, while the Cuban treefrog came from at least two sources in Cuba, of which the best bet is a remote peninsula in the western part of the island.

The team believes that the two species came to Florida thousands of years ago, quite possibly by climbing on board vegetation that then floated like a raft across the narrow strait.

The frogs adapted over the years to the colder winters of Florida, and this enabled them to spread northward when transport and commerce links developed in the mid-20th century. “Both of them could have come across (to Florida) naturally, not by swimming, because these frogs would die pretty quickly in salt water, but by floating across on vegetation,” Hedges said in a phone interview with AFP.

“There are plenty of examples of flotsam crossings, on short distances as well as long distances, even across oceans. These frogs, especially the treefrog, are on many small islands in the Caribbean which have no humans, so clearly they get around. There’s no other way they could have got to those islands other than by floating.”

Hedges added: “What we’re speculating in this paper is that if they were there on the Keys on their own for thousands of years, they could have adapted to a more continental climate, making them better invasive species and when they made their way up into Florida, that may explain why they have done so well.”

Image Caption: A Cuban treefrog Osteopilus septentrionalis. Credit: Wikipedia   

On the Net:

Pennsylvania State University

C. Elegans Gets Viruses

By Michael C. Purdy, Washington University in St. Louis

Finding means C. elegans may aid studies of human infections

A workhorse of modern biology is sick, and scientists couldn’t be happier.

Researchers at Washington University School of Medicine in St. Louis, the Jacques Monod Institute in France and Cambridge University have found that the nematode C. elegans, a millimeter-long worm used extensively for decades to study many aspects of biology, gets naturally occurring viral infections.

The discovery means C. elegans is likely to help scientists study the way viruses and their hosts interact.

“We can easily disable any of C. elegans’ genes, confront the worm with a virus and watch to see if this makes the infection worse, better or has no effect,” says David Wang, PhD. “If it changes the worm’s response to infection, we will look to see if similar genes are present in humans and other mammals.”

Wang, associate professor of pathology and immunology and of molecular microbiology at Washington University School of Medicine, says that several fundamental aspects of human biology, including the ability of cells to self-destruct to prevent cancer, and RNA interference, an important process for regulating how genes are used to make proteins, were first identified in C. elegans and later affirmed to be present in humans.

The findings appear online in PLoS Biology.

Marie-Anne Felix, PhD, a researcher who studies nematodes at the Monod Institute, began the study by gathering C. elegans from rotting fruit in French orchards. Felix noted that some of her sample worms appeared to be sick. Treatment with antibiotics failed to cure them.

Felix then repeated a classic biology experiment that led to the discovery of viruses.

“She ground up the sick worms, passed them through a filter fine enough to remove any bacterial or parasitic infectious agents and exposed a new batch of worms to the ground-up remains of the first batch,” Wang says. “When the new batch got sick, she knew that a viral infection was likely to be present.”

Wang specializes in the identification of novel viruses. He found the worms had been suffering infections from two viruses related to nodaviruses, a class of viruses previously found to infect insects and fish. Nodaviruses are not currently known to infect humans. Tests showed one of the new viruses can infect the strain of C. elegans most commonly used in research.

“Model organisms are essential to important steps forward in biology, and we’re eager to see what C. elegans can teach us about the way hosts and viruses interact,” Wang says.

Felix M-A, Ashe A, Piffaretti J, Wu G, Nuez I, Belicard T, Jiang Y, Zhao G, Franz CJ, Goldstein LD, Sanroman M, Miska EA, Wang D. Natural and experimental infection of Caenorhabditis nematodes by novel viruses related to nodaviruses. PLoS Biology, Jan. 25, 2011.

Funding from the French National Center for Scientific Research, the National Institutes of Health, the Midwest Regional Center for Excellence for Biodefense and Emerging Infectious Diseases Research, the Cancer Research UK Programme, the Burroughs-Wellcome Fund and the Herchel-Smith Foundation supported this research.

Image Caption: Scientists have discovered that C. elegans, a microscopic worm biologists have used in the lab to identify important biological phenomena, suffers from natural viral infections. This may mean that C. elegans can help scientists learn more about how hosts and viruses interact. Credit: Marie-Anne Felix, the Monod Institute


Researchers Eliminate Roadblock In Regenerative Medicine

By Wileen Wong Kromhout, UCLA

New ‘cocktails’ support long-term maintenance of human embryonic stem cells

In regenerative medicine, large supplies of safe and reliable human embryonic stem (hES) cells are needed for implantation into patients, but the field has faced challenges in developing cultures that can consistently grow and maintain clinical-grade stem cells.
 
Standard culture systems use mouse “feeder” cells and media containing bovine sera to cultivate and maintain hES cells, but such animal product-based media can contaminate the cells. And because of difficulties in precise quality control, each batch of the medium can introduce new and unwanted variations.
 
Now, a team of stem cell biologists and engineers from UCLA has identified an optimal combination and concentration of small-molecule inhibitors to support the long-term quality and maintenance of hES cells in feeder-free and serum-free conditions. The researchers used a feedback system control (FSC) scheme to innovatively and efficiently select the small-molecule inhibitors from a very large pool of possibilities.
 
The research findings, published Jan. 25 in the journal Nature Communications, represent a major advance in the quest to broadly transition regenerative medicine from the benchtop to the clinic.
 
“What is significant about this work is that we’ve been able to very rapidly develop a chemically defined culture medium to replace serum and feeders for cultivating clinical-grade hES cells, thereby removing a major roadblock in the area of regenerative medicine,” said Chih-Ming Ho, the Ben Rich-Lockheed Martin Professor at the UCLA Henry Samueli School of Engineering and Applied Science and a member of the National Academy of Engineering.
 
Unlike current animal product-based media, the new medium is a “defined” culture medium, one in which every component is known and traceable. This is important for clinical applications and as drugs or cells enter the world of regulatory affairs, including good manufacturing practice compliance and Food and Drug Administration supervision.
 
“It is also the first defined medium to allow for long-term single-cell passage,” said the paper’s senior author, Hong Wu, the David Geffen Professor of Molecular and Medical Pharmacology at the David Geffen School of Medicine at UCLA and a researcher with UCLA’s Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research.
 
Single-cell passaging, a process in which hES cells are dissociated into single cells and subcultured through single-cell-derived colonies, is important in overcoming the massive cell death associated with hES cell dissociation during routine passage, and it allows for genetic manipulation at the clonal level.
 
“Although other studies have demonstrated growth of hES cells under defined media formulations and/or on defined surfaces, to the best of our knowledge, this is the first study that combines defined cultures with routine single-cell passaging, which plays an important role in supplying a large mass of clinically applicable cells,” said Hideaki Tsutsui, a UCLA postdoctoral scholar and lead author of the study. “Thus, our hES cell culture system, guided by the FSC technique, will bring hES cells one step closer to clinical therapies.”
 
Initially, the very large number of small molecules in the culture medium and their unknown synergistic effects made it difficult for researchers to assess the proper concentration of each for achieving long-term expansion of hES cells. The major challenge was to find the best way to sort out those molecules and rapidly determine the best combinatorial concentrations.
 
The breakthrough, ultimately, was the product of a close interdisciplinary collaboration.
 
Tsutsui, then a UCLA Engineering graduate student, and Bahram Valamehr, then a graduate student at the Geffen School of Medicine, started working on the project two years ago. Armed with biological readouts and analyses of stem cells mastered in Hong Wu’s laboratory through the lab’s extensive accomplishments in stem cell research, Tsutsui and Valamehr used the FSC scheme (developed previously by Ho’s group to search for optimal drug combinations for viral infection inhibition and cancer eradication) to facilitate the rapid screening of a very large number of possibilities.
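
The release does not spell out the FSC algorithm itself, but the closed-loop idea can be sketched generically: propose a combination of doses, measure a biological readout, and let that readout steer the next proposal. Everything in the sketch below (the candidate doses, the stand-in scoring function and the accept-if-better rule) is a hypothetical placeholder rather than the published scheme.

```python
# Generic closed-loop search over inhibitor concentrations, in the spirit of
# a feedback system control (FSC) scheme. The assay() readout and dose grid
# are invented placeholders, not the UCLA team's actual protocol.
import random

DOSES = [0.0, 0.1, 0.3, 1.0, 3.0]  # candidate concentrations (arbitrary units)

def assay(combo):
    """Stand-in for the biological readout (e.g., colony quality); hypothetical."""
    target = (0.3, 1.0, 0.1)
    return -sum((c - t) ** 2 for c, t in zip(combo, target))

def fsc_search(n_inhibitors=3, iterations=200, seed=0):
    rng = random.Random(seed)
    best = tuple(rng.choice(DOSES) for _ in range(n_inhibitors))
    best_score = assay(best)
    for _ in range(iterations):
        candidate = list(best)
        i = rng.randrange(n_inhibitors)  # perturb one inhibitor's dose
        candidate[i] = rng.choice(DOSES)
        score = assay(tuple(candidate))
        if score > best_score:  # feedback step: keep only improvements
            best, best_score = tuple(candidate), score
    return best, best_score

print(fsc_search())
```

The point of such a loop is exactly what the article describes: it converges on a good combination after a few hundred assays instead of exhaustively testing every possible mixture.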
 
Working together, the team was able to discover a unique combination of three small-molecule inhibitors that supports long-term maintenance of hES cell cultures through routine single-cell passaging.
 
“There are certain research projects biologists can dream about, and we know we can eventually get there, but we don’t have the capacity to achieve them in a timely manner, especially in a study like this,” Wu said. “It would have taken 10 graduate students another 10 years to test all the possible combinations of molecules. Having an opportunity to collaborate with the engineering school has been invaluable in making this dream a reality.”
 
“This is the best example of demonstrating the strength and potential of interdisciplinary collaborations,” said Ho, who is also director of the Center for Cell Control at UCLA Engineering and a senior author of the paper. “Engineers and biologists working side by side can accomplish a mission impossible.”
 
Other authors of the study included Antreas Hindoyan, Rong Qiao, Xianting Ding, Shuling Guo, Owen N. Witte and Xin Liu.
 
The project received major funding from the National Institutes of Health Roadmap for Medical Research through the UCLA Center for Cell Control and a seed grant from the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research.

Image Caption: Human embryonic stem cells in culture created by UCLA researchers


The Practical Full-Spectrum Solar Cell Comes Closer

Paul Preuss, Berkeley Lab

Solar cells are made from semiconductors whose ability to respond to light is determined by their band gaps (energy gaps). Different colors have different energies, and no single semiconductor has a band gap that can respond to sunlight’s full range, from low-energy infrared through visible light to high-energy ultraviolet.

Although full-spectrum solar cells have been made, none has yet been suitable for manufacture at a consumer-friendly price. Now Wladek Walukiewicz, who leads the Solar Energy Materials Research Group in the Materials Sciences Division (MSD) at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), and his colleagues have demonstrated a solar cell that not only responds to virtually the entire solar spectrum but can also readily be made using one of the semiconductor industry’s most common manufacturing techniques.

The new design promises highly efficient solar cells that are practical to produce. The results are reported in a recent issue of Physical Review Letters.

How to make a full-spectrum solar cell

“Since no one material is sensitive to all wavelengths, the underlying principle of a successful full-spectrum solar cell is to combine different semiconductors with different energy gaps,” says Walukiewicz.

One way to combine different band gaps is to stack layers of different semiconductors and wire them in series. This is the principle of current high-efficiency solar cell technology that uses three different semiconductor alloys with different energy gaps. In 2002, Walukiewicz and Kin Man Yu of Berkeley Lab’s MSD found that by adjusting the amounts of indium and gallium in the same alloy, indium gallium nitride, each different mixture in effect became a different kind of semiconductor that responded to different wavelengths. By stacking several of the crystalline layers, all closely matched but with different indium content, they made a photovoltaic device that was sensitive to the full solar spectrum.

However, says Walukiewicz, “Even when the different layers are well matched, these structures are still complex, and so is the process of manufacturing them. Another way to make a full-spectrum cell is to make a single alloy with more than one band gap.”

In 2004 Walukiewicz and Yu made an alloy of highly mismatched semiconductors based on a common alloy of zinc (plus manganese) and tellurium. By doping this alloy with oxygen, they added a third distinct energy band between the existing two, thus creating three different band gaps that spanned the solar spectrum. Unfortunately, says Walukiewicz, “to manufacture this alloy is complex and time-consuming, and these solar cells are also expensive to produce in quantity.”

The new solar cell material from Walukiewicz and Yu and their colleagues in Berkeley Lab’s MSD and RoseStreet Labs Energy, working with Sumika Electronics Materials in Phoenix, Arizona, is another multiband semiconductor made from a highly mismatched alloy. In this case the alloy is gallium arsenide nitride, similar in composition to one of the most familiar semiconductors, gallium arsenide. By replacing some of the arsenic atoms with nitrogen, a third, intermediate energy band is created. The good news is that the alloy can be made by metalorganic chemical vapor deposition (MOCVD), one of the most common methods of fabricating compound semiconductors.

How band gaps work

Band gaps arise because semiconductors are insulators at a temperature of absolute zero but inch closer to conductivity as they warm up. To conduct electricity, some of the electrons normally bound to atoms (those in the valence band) must gain enough energy to flow freely, that is, move into the conduction band. The band gap is the energy needed to do this.

When an electron moves into the conduction band it leaves behind a “hole” in the valence band, which also carries charge, just as the electrons in the conduction band do; holes are positive instead of negative.

A large band gap means high energy, and thus a wide-band-gap material responds only to the more energetic segments of the solar spectrum, such as ultraviolet light. By introducing a third band, intermediate between the valence band and the conduction band, the same basic semiconductor can respond to lower and middle-energy wavelengths as well.

This is because, in a multiband semiconductor, there is a narrow band gap between the valence band and the intermediate band that responds to low energies. Between the intermediate band and the conduction band is another relatively narrow band gap, one that responds to intermediate energies. And finally, the original wide band gap is still there to take care of high energies.
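
As a back-of-the-envelope illustration of how these gaps divide up the spectrum, photon energy and wavelength are related by E (eV) ≈ 1240 / λ (nm). The short sketch below applies that relation to the 1.1 eV and 3.2 eV figures reported later in this article; the roughly 2.1 eV intermediate step is simply their arithmetic difference, included here as an assumption for illustration.

```python
# Hedged illustration: photon energy E (eV) = 1239.84 / wavelength (nm).
# The 1.1 eV and 3.2 eV gaps are from the article; 2.1 eV is inferred.

HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def cutoff_wavelength_nm(band_gap_ev: float) -> float:
    """Longest wavelength a gap of this energy can absorb."""
    return HC_EV_NM / band_gap_ev

gaps_ev = {
    "valence -> intermediate (narrow)": 1.1,
    "intermediate -> conduction (inferred)": 2.1,
    "valence -> conduction (wide)": 3.2,
}

for transition, gap in gaps_ev.items():
    print(f"{transition}: {gap} eV, absorbs light below ~{cutoff_wavelength_nm(gap):.0f} nm")
```

Running it shows the narrow gap reaching out to roughly 1130 nm in the infrared and the wide gap cutting off near 390 nm in the ultraviolet, matching the response range reported in the tests described below.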

“The major issue in creating a full-spectrum solar cell is finding the right material,” says Kin Man Yu. “The challenge is to balance the proper composition with the proper doping.”

In solar cells made of some highly mismatched alloys, a third band of electronic states can be created inside the band gap of the host material by replacing atoms of one component with a small amount of oxygen or nitrogen. In so-called II-VI semiconductors (which combine elements from these two groups of Mendeleev’s original periodic table), replacing some group VI atoms with oxygen produces an intermediate band whose width and location can be controlled by varying the amount of oxygen. Walukiewicz and Yu’s original multiband solar cell was a II-VI compound that replaced group VI tellurium atoms with oxygen atoms. Their current solar cell material is a III-V alloy. The intermediate third band is made by replacing some of the group V component’s atoms, arsenic in this case, with nitrogen atoms.

Finding the right combination of alloys, and determining the right doping levels to put an intermediate band right where it’s needed, is mostly based on theory, using the band anticrossing model developed at Berkeley Lab over the past 10 years.

“We knew that two-percent nitrogen ought to do the job,” says Yu. “We knew where the intermediate band ought to be and what to expect. The challenge was designing the actual device.”

Passing the test

Using their new multiband material as the core of a test cell, the researchers illuminated it with the full spectrum of sunlight to measure how much current was produced by different colors of light. The key to making a multiband cell work is to make sure the intermediate band is isolated from the contacts where current is collected.

“The intermediate band must absorb light, but it acts only as a stepping stone and must not be allowed to conduct charge, or else it basically shorts out the device,” Walukiewicz explains.

The test device had negatively doped semiconductor contacts on the substrate to collect electrons from the conduction band, and positively doped semiconductor contacts on the surface to collect holes from the valence band. Current from the intermediate band was blocked by additional layers on top and bottom.

For comparison purposes, the researchers built a cell that was almost identical but not blocked at the bottom, allowing current to flow directly from the intermediate band to the substrate.

The results of the test showed that light penetrating the blocked device efficiently yielded current from all three energy bands (valence to intermediate, intermediate to conduction, and valence to conduction) and responded strongly to all parts of the spectrum, from infrared with an energy of about 1.1 electron volts (1.1 eV) to over 3.2 eV, well into the ultraviolet.

By comparison, the unblocked device responded well only in the near infrared, declining sharply in the visible part of the spectrum and missing the highest-energy sunlight. Because it was unblocked, the intermediate band had essentially usurped the conduction band, intercepting low-energy electrons from the valence band and shuttling them directly to the contact layer.

Further support for the success of the multiband device and its method of operation came from tests “in reverse”: operating the device as a light-emitting diode (LED). At low voltage, the device emitted four peaks in the infrared and visible regions of the spectrum. Although the material is intended primarily for solar cells, this performance as an LED may suggest additional possibilities for gallium arsenide nitride, since it is a dilute nitride very similar to indium gallium arsenide nitride, which is used in commercial vertical-cavity surface-emitting lasers (VCSELs), devices that have found wide use because of their many advantages over other semiconductor lasers.

With the new, multiband photovoltaic device based on gallium arsenide nitride, the research team has demonstrated a simple solar cell that responds to virtually the entire solar spectrum and can readily be made using one of the semiconductor industry’s most common manufacturing techniques. The results promise highly efficient solar cells that are practical to produce.

“Engineering the Electronic Band Structure for Multiband Solar Cells,” by Nair Lopez, Lothar Reichertz, Kin Man Yu, Ken Campman, and Wladyslaw Walukiewicz, appears in the January 10, 2011, issue of Physical Review Letters.

Image 1: A solar cell’s ability to convert sunlight to electric current is limited by the band gaps of the semiconductors from which it is made. For example, semiconductors with wide band gaps respond to shorter wavelengths with higher energies (lower left). A semiconductor with an intermediate band has multiple band gaps and can respond to a range of energies (lower right).

Image 2: Kin Man Yu and Wladek Walukiewicz have long been leaders in multiband solar cell technology.

Image 3: At top, a test device of the new multiband solar cell was arranged to block current from the intermediate band; this allowed a wide range of wavelengths found in the solar spectrum to stimulate current that flowed from both conduction and valence bands (electrons and holes, respectively). In a comparison device, at bottom, the current from the intermediate band was not blocked, and it interfered with current from the conduction band, limiting the device’s response.


GRIN Plasmonics: A Practical Path To Superfast Computing, Ultrapowerful Optical Microscopy and Invisibility Carpet-Cloaking Devices

They said it could be done and now they’ve done it. What’s more, they did it with a GRIN. A team of researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California, Berkeley, has carried out the first experimental demonstration of GRIN (for gradient index) plasmonics, a hybrid technology that opens the door to a wide range of exotic optics, including superfast computers based on light rather than electronic signals, ultra-powerful optical microscopes able to resolve DNA molecules with visible light, and “invisibility” carpet-cloaking devices.

Working with composites featuring a dielectric (non-conducting) material on a metal substrate, and “grey-scale” electron beam lithography, a standard method in the computer chip industry for patterning 3-D surface topographies, the researchers have fabricated highly efficient plasmonic versions of Luneburg and Eaton lenses. A Luneburg lens focuses light from all directions equally well, and an Eaton lens bends light 90 degrees from all incoming directions.
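
For readers wondering what “gradient index” buys you, the classical Luneburg lens is defined by a refractive index that falls smoothly from the square root of 2 at its center to 1 at its rim, n(r) = sqrt(2 - (r/R)^2); the plasmonic version reported here grades an effective mode index the same way. The snippet below evaluates only the textbook profile and assumes nothing about the authors’ actual device dimensions.

```python
# Textbook Luneburg profile: n(r) = sqrt(2 - (r/R)^2). Parallel rays entering
# from any direction are focused to a point on the opposite rim. The GRIN
# plasmonic lens achieves an analogous gradient in the mode index of surface
# waves by varying dielectric thickness; values here are illustrative only.
import math

def luneburg_index(r: float, radius: float = 1.0) -> float:
    """Refractive index at distance r from the lens center (0 <= r <= radius)."""
    if not 0.0 <= r <= radius:
        raise ValueError("point lies outside the lens")
    return math.sqrt(2.0 - (r / radius) ** 2)

for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"r/R = {frac:.2f}  ->  n = {luneburg_index(frac):.3f}")
```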

“This past year, we used computer simulations to demonstrate that with only moderate modifications of an isotropic dielectric material in a dielectric-metal composite, it would be possible to achieve practical transformation optics results,” says Xiang Zhang, who led this research. “Our GRIN plasmonics technique provides a practical way for routing light at very small scales and producing efficient functional plasmonic devices.”

Zhang, a principal investigator with Berkeley Lab’s Materials Sciences Division and director of UC Berkeley’s Nano-scale Science and Engineering Center (SINAM), is the corresponding author of a paper describing this work, titled “Plasmonic Luneburg and Eaton Lenses,” in the journal Nature Nanotechnology. Co-authoring the paper were Thomas Zentgraf, Yongmin Liu, Maiken Mikkelsen and Jason Valentine.

GRIN plasmonics combines methodologies from transformation optics and plasmonics, two rising new fields of science that could revolutionize what we are able to do with light. In transformation optics, the physical space through which light travels is warped to control the light’s trajectory, similar to the way in which outer space is warped by a massive object under Einstein’s relativity theory. In plasmonics, light is confined in dimensions smaller than the wavelength of photons in free space, making it possible to match the different length-scales associated with photonics and electronics in a single nanoscale device.

“Applying transformation optics to plasmonics allows for precise control of strongly confined light waves in the context of two-dimensional optics,” Zhang says. “Our technique is analogous to the well-known GRIN optics technique, whereas previous plasmonic techniques were realized by discrete structuring of the metal surface in a metal-dielectric composite.”

Like all plasmonic technologies, GRIN plasmonics starts with an electronic surface wave that rolls through the conduction electrons on a metal. Just as the energy in a wave of light is carried in a quantized particle-like unit called a photon, so, too, is plasmonic energy carried in a quasi-particle called a plasmon. Plasmons will interact with photons at the interface of a metal and dielectric to form yet another quasi-particle, a surface plasmon polariton (SPP).

The Luneburg and Eaton lenses fabricated by Zhang and his co-authors interacted with SPPs rather than photons. To make these lenses, the researchers worked with a thin dielectric film (a thermoplastic called PMMA) on top of a gold surface. When applying grey-scale electron beam lithography, the researchers exposed the dielectric film to an electron beam that was varied in dosage (charge per unit area) as it moved across the film’s surface. This resulted in highly controlled differences in film thickness across the length of the dielectric that altered the local propagation of SPPs. In turn, the “mode index,” which determines how fast the SPPs propagate, is altered so that the direction of the SPPs can be influenced.

“By adiabatically tailoring the topology of the dielectric layer adjacent to the metal surface, we’re able to continuously modify the mode index of SPPs,” says Zentgraf. “As a result, we can manipulate the flow of SPPs with a greater degree of freedom in the context of two-dimensional optics.”

Says Liu, “The practicality of working only with the purely dielectric material to transform SPPs is a big selling point for GRIN plasmonics. Controlling the physical properties of metals on the nanometer length-scale, which is the penetration depth of electromagnetic waves associated with SPPs extending below the metal surfaces, is beyond the reach of existing nanofabrication techniques.”

Adds Zentgraf, “Our approach has the potential to achieve low-loss functional plasmonic elements with a standard fabrication technology that is fully compatible with active plasmonics.”

In the Nature Nanotechnology paper, the researchers say that inefficiencies in plasmonic devices due to SPPs lost through scattering could be reduced even further by incorporating various SPP gain materials, such as fluorescent dye molecules, directly into the dielectric. This, they say, would lead to an increased propagation distance that is highly desired for optical and plasmonic devices. It should also enable the realization of two-dimensional plasmonic elements beyond the Luneburg and Eaton lenses.

Says Mikkelsen, “GRIN plasmonics can be immediately applied to the design and production of various plasmonic elements, such as waveguides and beam splitters, to improve the performance of integrated plasmonics. Currently we are working on more complex, transformational plasmonic devices, such as plasmonic collimators, single plasmonic elements with multiple functions, and plasmonic lenses with enhanced performance.”

This research was supported by the U.S. Army Research Office and the National Science Foundation’s Nano-scale Science and Engineering Center.

Lawrence Berkeley National Laboratory is a U.S. Department of Energy (DOE) national laboratory managed by the University of California for the DOE Office of Science. Berkeley Lab provides solutions to the world’s most urgent scientific challenges including sustainable energy, climate change, human health, and a better understanding of matter and force in the universe. It is a world leader in improving our lives through team science, advanced computing, and innovative technology. Visit our Website at www.lbl.gov


2010 Considered One Of The Worst Years For Disasters

A United Nations-backed research report released Monday showed that 2010 was among the worst years for natural disasters in the past two decades, leaving upwards of 300,000 people dead and countless more injured or displaced.

Nearly two-thirds of the total death toll came from the devastating Haiti earthquake a year ago, which claimed the lives of more than 222,500 people, according to the Belgium-based Center for Research on the Epidemiology of Disasters (CRED).

Last summer’s heatwave in Russia was the second deadliest disaster of the year, leaving 55,736 people dead, according to figures the center compiled from insurers and media reports of official sources.

Margareta Wahlstroem, UN special representative for disaster risk reduction, told AFP that 2010 was “one of the worst in decades in terms of the number of people killed and in terms of economic losses.”

“These figures are bad, but could be seen as benign in years to come,” she said, pointing to the impact of unplanned growth of urban areas, environmental degradation and climate change.

The economic cost of the 373 major disasters recorded in 2010 reached 109 billion dollars. Of that, an estimated 30 billion dollars in damage stemmed from the powerful earthquake that hit Chile in February. The temblor unleashed a tsunami that swept away villages and claimed most of the 521 lives lost in the quake.

China’s summer flooding and landslides caused around 18 billion dollars in damage, while Pakistan’s floods cost 9.5 billion dollars, according to CRED’s annual study.

While Haiti is still struggling to recover from the earthquake that devastated most of the capital city of Port-au-Prince, it ranked lower on the global economic scale with an estimated 8 billion dollars in damage.

Asians accounted for 89 percent of the 207 million people affected by disasters worldwide last year, according to CRED.


Cuba Entering High-Speed Broadband World

Cuba is on its way to joining the high-speed broadband era with an undersea fiber-optic cable laid from Venezuela, bringing the promise of speedy Internet to one of the world’s least connected countries.

A specialized ship sailed from Camuri beach this weekend, trailing the cable from buoys at the start of a 1,000-mile journey across the Caribbean Sea.

Venezuelan and Cuban officials hailed the project as a blow to the United States’ embargo on the island.

It will make Cuba’s connection speed 3,000 times faster and could help modernize its economy.

“This means a giant step for the independence and sovereignty of our people,” Rogelio Polanco, Cuba’s ambassador to Caracas, told The Guardian.

The ship will lay the cable at depths of up to 19,000 feet and is expected to reach eastern Cuba by February 8.  Cuba’s government said that the cable should be in use by June or July.

Cuba retains censorship restrictions, but the cable’s impact could still be profound. The country has just 14.2 Internet users per 100 people, the Western Hemisphere’s lowest ratio, with access largely restricted to government offices, universities, foreign companies and tourist hotels.

The 50-year-old U.S. embargo prevented Cuba from tapping into Caribbean fiber-optic cables, forcing it to rely on a slow, expensive satellite link of just 379 megabits per second.

Hugo Chavez, Venezuela’s president, funded the $70 million cable and named it Alba-1, after the region’s Caracas-led leftwing alliance. 

Improved communication is necessary to effect “historic, political and cultural change”, Ricardo Menéndez, Venezuela’s science, technology and industry minister, told The Guardian.

The cable should boost President Raul Castro’s drive to modernize Cuba’s centrally planned economy and make state enterprises slimmer and more efficient.

Cuban officials said the priority would be improving communications for those who already had access to the island’s intranet, a government-controlled version of the Internet.  Communist daily newspaper Granma reported that broadband would mean higher quality communication but not necessarily “broader” communication.

Antonio Gonzalez-Rodiles, a scientist in Havana, told The Guardian that most Cubans would still have to rely on state media for news, meaning a diet of propaganda about government successes and distorted reporting of the outside world.

“I think it’s pretty unlikely they are going to let Cubans access this immense information source, given there’s no clear [state] desire to democratize our society and reduce censorship. A lot of things are going to have to change before Cubans will be able to navigate this sea of information.”

Showing Empathy To Patients Can Improve Care

Showing clinical empathy to patients can improve their satisfaction with care, motivate them to stick to their treatment plans and lower malpractice complaints, according to a study published in CMAJ (Canadian Medical Association Journal).

“Empathy is the ability to understand another’s experience, to communicate and confirm that understanding with the other person and to then act in a helpful manner,” writes Dr. Robert Buckman, Princess Margaret Hospital and the Faculty of Medicine, University of Toronto. “Despite some overlap with other compassionate responses, particularly sympathy, empathy is distinct.”

In clinical practice, physicians do not express empathic responses frequently. In a recent study in which oncologists were video-recorded speaking with their patients, the oncologists responded to only 22% of moments judged to be empathic opportunities. A more recent study involving oncologists and lung cancer patients showed the physicians responding to only 11% of empathic opportunities.

There is new evidence indicating that empathy is an important medical tool and that it can be acquired and taught in medical school: “Clinical empathy is an essential medical skill that can be taught and improved, thereby producing changes in physician behaviour and patient outcomes.”

“Our profession now needs to incorporate the teaching of clinical empathy more widely into clinical practice at all levels, beginning with the selection of candidates for medical school,” write the authors. “The behavioral aspects of empathy, the empathic response, can be assessed and integrated into medical schools’ core communication skills training.”

The authors conclude that physicians must also model an empathetic approach to patient care in the teaching environment.


Research Study Identifies Who Uploads The Majority Of Content To P2P Piracy Networks

A study done at Carlos III University of Madrid (UC3M) identifies and characterizes the users who upload content to the main P2P piracy networks on the Internet, and points out the incentives they find for carrying out this activity.

The research study examines the behavior of the users responsible for publishing over 55,000 files on the two main portals (Mininova and The Pirate Bay) of BitTorrent, the most popular P2P file-sharing application on the Internet, where some users publish content that is then exchanged with up to tens of thousands of internet users. The portals’ growing popularity is mostly due to the availability of content of great interest, such as recently released films or episodes of a variety of television series.

Users who publish content on BitTorrent dedicate a large part of their own resources (bandwidth, storage capacity) and assume the risks involved in publishing content protected by copyright laws. So, is this altruistic behavior or is there some type of economic incentive at work? “The success of BitTorrent is due to the fact that a few users make a large amount of content available in exchange for receiving economic benefits,” explain the authors of the study, carried out in the Telematic Engineering Department of the UC3M by Professors Rubén Cuevas, Carmen Guerrero and Ángel Cuevas. Their analysis demonstrates that a small group of users of these applications (around one hundred) is responsible for 66 percent of the content that is published and 75 percent of the downloads. In other words: the great success of a massively used application like BitTorrent depends on a few users.

The study by the researchers at this public university in Madrid, in collaboration with scientists at the IMDEA Networks Institute, the University of Oregon (USA) and the Technical University of Darmstadt (Germany), identifies who these users are and what their incentives for massively publishing content are. Basically, there are two different profiles. One group consists of the so-called “fake publishers”: organizations fighting illegal downloading and malicious users, who publish a large quantity of false files in order to protect copyrights and spread infected software, respectively. The other group includes a small number of users (known as “top publishers”) who massively publish content on BitTorrent and profit from this activity, basically from online advertising and, to a lesser degree, from VIP subscriptions held by users who wish to speed up their downloads. “If these users lose interest in this activity or are eliminated from the system, BitTorrent’s traffic will be drastically reduced,” the authors of the study predict.

To carry out this research, the scientists developed a tool that facilitates the gathering of relevant information about thousands of files shared through the BitTorrent application. By means of this system, they were able to access the name of the user who published the content, his or her IP address (which also reveals the user’s city, country and Internet service provider) and the IP addresses of those users who later used BitTorrent to download the content. “In order to remain anonymous,” explains Professor Rubén Cuevas, “many of them rent servers from companies that perform this service and then publish content from those servers.”

The future of P2P networks

If the great success of an application such as BitTorrent depends on a few users who publish a lot of content, what will happen if those users lose interest in this activity or are eliminated from the system (through legal actions against piracy, for example)? If other users moved in on their space, would the application survive or not? The article ends with this question, which asks us to reflect on the future and the fragility of this type of file-exchange network. “In our opinion,” the authors of the study comment, “the success of BitTorrent lies in the availability of popular content, which is typically protected by copyright law, and the people who take the risk of publishing that content do so because they receive an economic benefit in exchange.” Therefore, if these users were to lose their incentive in the future, either because of a decrease in advertising income or due to having to pay very expensive fines, BitTorrent would very likely cease to offer this content, which would make people stop using the application on a massive scale. “At the present time, when the creators of content, internet user associations and political parties are arguing about Internet piracy with regard to the controversial Sinde Law (note: Ángeles González-Sinde is the Spanish Minister of Culture), studies like this one are important for a real understanding of the true nature of P2P content distribution networks and the economic model behind them,” the researchers conclude.

The study, titled “Is Content Publishing in BitTorrent Altruistic or Profit Driven?”, was recently presented at the ACM International Conference on emerging Networking Experiments and Technologies (CoNEXT), one of the most prestigious congresses in the area of communication networks, which held its sixth edition at the end of 2010 at Drexel University (Philadelphia, USA). The authors are Rubén Cuevas, Carmen Guerrero and Ángel Cuevas of UC3M, Michal Kryzcka of the IMDEA Networks Institute, Sebastian Kaune of the Technical University of Darmstadt (Germany) and Reza Rejaie of the University of Oregon (USA).


‘Rogue Gene’ Discovery Could Help Stop Cancer Spread

British scientists from the University of East Anglia (UEA) have discovered a “rogue gene” that helps cancer spread around the body; blocking it with the right kind of drugs could stop many types of the disease in their tracks.

The researchers said their findings could lead to the development of new medicines to halt a critical late stage of the disease known as metastasis.

The researchers explained in their study that the culprit gene is an enzymic bonding agent found inside cancer cells.

It attacks and breaks down a naturally occurring protein in the body, which normally prevents cancer cells from spreading.

In laboratory tests, the UEA team found that by blocking the culprit gene, levels of the natural inhibitor proteins were boosted and the cancer cells remained dormant.

Surinder Soond, who worked on the study, told Reuters that it was a “novel and exciting approach to treating cancer and the spread of tumors which holds great potential.”

“The challenge now is to identify a potent drug that will get inside cancer cells and destroy the activity of the rogue gene,” Andrew Chantry of UEA’s school of biological sciences, who led the research, told Reuters.

He said this was “a difficult but not impossible task” and one that would be made easier by the better understanding of the biological processes gained in this early research.

Chantry said in a telephone interview that the findings mean drugs could be developed in the next 10 years that could be used to halt the aggressive spread of many forms of cancer.

He said that if a drug was developed that deactivated the gene, conventional therapies like chemotherapy and radiotherapy could be used on primary tumors with no risk of the disease taking hold elsewhere.

Chantry said that his team is now working with other scientists to try and design a drug that could interrupt the gene’s activity.

The study results were published in today’s issue of Oncogene.


Naps Help Memories Transfer To Brain’s Hard Drive

Scientists, surprised by their own findings, report that the best way to not forget a newly learned poem, card trick or algebra equation may be to take a quick nap.

Researchers in Germany showed in experiments that the brain is better during sleep than during wakefulness at resisting attempts to scramble or corrupt a recent memory.

Their study provides new insights into the hugely complex process by which people store and retrieve deliberately acquired information.

Earlier research showed that fresh memories, which are stored temporarily in a region of the brain known as the hippocampus, do not gel immediately.

Reactivating these memories soon after learning plays a crucial role in their transfer to more permanent storage in the brain’s “hard drive,” also known as the neocortex.

However, during wakefulness this period of reactivation renders the memories more fragile.

For example, learning a second poem at this juncture will likely make it harder to commit the first one to deep memory.

Bjorn Rasch of the University of Lubeck in Germany and three colleagues assumed that the same thing happens when we sleep, and designed an experiment to find out if they were right.

Volunteers were asked to memorize 15 pairs of cards showing pictures of animals and everyday objects. As they learned, they were exposed to a slightly unpleasant odor.

Half of the 24 volunteers stayed awake and, 40 minutes later, were asked to learn a second, slightly different pattern of cards.

They were again made to smell the same odor just before starting, which was designed to trigger their memory of the first exercise.

The 12 other subjects did the second exercise after a brief snooze, during which they were exposed to the odor while in a state called slow-wave sleep.

Both groups were then tested on the original task.

The sleep group performed significantly better, retaining on average 85 percent of the patterns, compared with 60 percent for those who had remained awake.

“Reactivation of memories had completely different effects on the state of wakefulness and sleep,” lead author Susanne Diekelmann, also from the University of Lubeck, told AFP.

“Based on brain imaging data, we suggest the reason for this unexpected result is that already during the first few minutes of sleep, the transfer from hippocampus to neocortex has been initiated,” she said in an email exchange.

She said that after 40 minutes of shuteye, significant chunks of memory were already “downloaded” and stored where they “could no longer be disrupted by new information that is encoded in the hippocampus.”

Diekelmann said the positive impact of short periods of sleep on memory consolidation could have implications for memory-intensive activities like language training.

She said that the findings also point to a strategy for helping victims of post-traumatic stress syndrome, a debilitating condition caused by extreme experiences.

The reactivation techniques “might prove useful in re-processing and un-learning unwanted memories,” she said. “And reactivation of newly learned memories during ensuing sleep could then help consolidate the desired therapeutic effects for the long-term.”

Diekelmann cautioned that computers are an imperfect metaphor for the way memories are stored in the brain.

“Human memory is absolutely dynamic. Memories are not statically ‘archived’ in the neocortex but are subject to constant changes by various influences,” she said.

She said that the act of remembering does not simply entail “reading” the stored data. 

“Recall is a reconstructive process in which memories can be changed and distorted.”

The study results were published this week in Nature Neuroscience.


Climate Change Threatens Many Tree Species

Global warming is already affecting the earth in a variety of ways that demand our attention. Now, research carried out at the Hebrew University of Jerusalem indicates that many tree species might become extinct due to climate change if no action is taken in time.

According to the research, trees which disperse their seeds by wind, such as pines and maples, will be unable to spread at a pace that can cope with expected climate changes.

The research, which focused on the ecological consequences of expected changes in the climate and the environment on tree spread, was conducted by Prof. Ran Nathan, head of the Alexander Silberman Institute of Life Science at the Hebrew University; his student, Nir Horvitz; and researchers from abroad.

Climate changes, which can already be sensed today and which are expected to continue over the next 50 years, include an increase in the concentration of carbon dioxide in the air and a reduction of surface wind speed in many areas. On the basis of earlier work, an elevated concentration of carbon dioxide is expected to cause trees to produce many more seeds and to reach maturity earlier than under current conditions, hence speeding up their spread. On the other hand, the weakening of wind speed in certain areas should reduce the spread rate of these trees. The balance between these opposing forces remained unknown.

Furthermore, it was unclear whether even the projected increase in wind speed in certain areas, together with higher seed production and earlier maturation, would result in a spread of trees fast enough to match the climate changes.

These questions were examined in this study for the first time. Surprisingly, the results show that changes in wind speed, whether the projected increase or decrease, have negligible effects on the rate of wind-driven spread of these species. The effects of increased seed production and earlier maturation are what prevail, giving rise to faster spread in the future compared with current conditions. Still, the research showed that the faster spread predicted for these trees will be much slower than the expected poleward shift of climate (temperature) ranges. Consequently, these tree species might not be able to withstand the climate change.

“Our research indicates that the natural wind-driven spread of many species of trees will increase, but will occur at a significantly slower pace than that which will be required to cope with the changes in surface temperature,” said Prof. Nathan. “This will raise extinction risk of many tree populations because they will not be able to track the shift in their natural habitats which currently supply them with favorable conditions for establishment and reproduction. As a result, the composition of different tree species in future forests is expected to change and their areas might be reduced, the goods and services that these forests provide for man might be harmed, and wide-ranging steps will have to be taken to ensure seed dispersal in a controlled, directed manner.”

The new research, published in the journal Ecology Letters, is based on a unique, fully mechanistic model developed to predict trends in plant spread. This model is the first to consider how projected changes in biological and environmental factors would impact tree spread in future environments. Predictions made until now were founded on past trends and did not take into consideration the expected future changes in the key biological and environmental factors that determine plant spread.
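
The release does not reproduce the model’s equations, but a standard result from the plant-spread literature illustrates why more seeds help less than one might hope: for an integrodifference model with net reproduction R0 per generation and a Gaussian dispersal kernel of width sigma, the asymptotic front speed is c = sigma * sqrt(2 ln R0), so seed production enters only through a logarithm. The figures below are hypothetical and are not taken from the paper.

```python
# Textbook invasion-speed formula (not the authors' model, which is richer
# and explicitly wind-driven): c = sigma * sqrt(2 * ln(R0)) per generation
# for a Gaussian dispersal kernel. Note how weakly c grows with R0.
import math

def gaussian_spread_rate(r0: float, sigma_m: float) -> float:
    """Spread rate in meters per generation for net reproduction r0 > 1."""
    if r0 <= 1.0:
        return 0.0  # a non-growing population does not advance
    return sigma_m * math.sqrt(2.0 * math.log(r0))

for r0 in (2, 10, 100):  # hypothetical seed-production scenarios
    print(f"R0 = {r0:>3}: spread ~ {gaussian_spread_rate(r0, 50.0):.0f} m per generation")
```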

In Israel, the research has bearing on various native tree species whose seeds are dispersed by the wind, such as Aleppo pine, Syrian maple and Syrian ash. The model that has been developed will be useful also in predicting the invasive spread of alien tree species, such as the tree of heaven, into Israeli natural habitats.

Trees with wind-dispersed seeds are mainly common in forests of North America and Eurasia. The current research points to the need for human action to ensure the dispersal of the seeds of these trees within the next half century, in view of the expected climate changes.

“It is important for those responsible for forest management in many parts of the world to understand that nature alone will not do the job,” said Prof. Nathan. “Human action will be required to ensure in a controlled manner the minimization of unexpected detrimental byproducts, and that those trees which are very important for global ecological processes will not become extinct,” he said. “These forests are important in many ways to man, including the supply of wood, the safeguarding of water quality, and the provision of recreation and tourism facilities.”

Image Caption: This is a typical Israeli pine tree. Credit: Ophir Altsein


Migration Helps Corals Survive Climate Change

The key to preserving the extraordinary richness and beauty of the world’s coral reefs through the coming period of fragmentation caused by climate change lies in a better understanding of how newborn coral larvae disperse across the oceans to settle and grow on new reefs.

Research by scientists at the ARC Centre of Excellence for Coral Reef Studies is throwing new light on the survival and settlement rates of larvae from different coral species, as a basis for predicting how fast coral species might change where they live in response to climate change.

“Coral larvae are weak swimmers, and they don’t feed. They are swept around the ocean by currents and have to find a place to settle and start to grow before they run out of energy,” says Professor Sean Connolly of CoECRS and James Cook University.

“Understanding these dispersal patterns, the rate at which the larvae perish, and the rate at which they lose the ability to settle when they do arrive at a reef, hold the key to predicting how quickly a coral species can move with its preferred environments as climate change renders unsuitable parts of its current habitat.”

This information will also help in the management of existing coral reefs, under siege from ocean warming, acidification and other human impacts, he adds. If managers know where the young corals come from that renew a damaged reef, they can take steps to limit human pressures such as fishing, coral damage or polluted runoff accordingly.

The world’s coral scientists and managers often rely on mathematical models to predict the dispersal and settlement of coral larvae, as a key part of understanding how interconnected and resilient a reef may be. However, existing models do not take good account of the rates at which coral larvae in the wild die or lose competence to settle.

In a series of groundbreaking experiments, Dr Andrew Baird has measured both the survival rates and ability to settle of larvae for five different coral species. “I noticed that the ability of the larvae to settle increased sharply after 3 to 5 days, peaked at about 1-2 weeks and then tailed off very gradually,” Dr Baird says. “Some larvae lost the ability to settle very quickly, but others from the same batch could still settle even 100 days after hatching. This suggested that the current models for predicting coral dispersal were likely to be inaccurate.”

Armed with these new findings, Baird then approached Connolly to develop a new generation of models that could better capture this high variation in settlement ability.

Modeling these wide variations has helped to resolve an important scientific puzzle, Professor Connolly says: how coral species can show substantial genetic differences between nearby reefs, but still disperse far enough to be very geographically widespread.

“Our new models indicate that more larvae than we thought should settle very close to home, perhaps on the same reef or one next door. At the same time, a small proportion of stellar performers can survive for longer, and travel further, than previously thought,” says Connolly.

This, say the researchers, provides some hope that coral species may be able to “migrate” with climate change – dispersing their larvae towards cooler areas when their current habitat gets too hot to survive.
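
A minimal sketch of the competence-times-survival idea behind such models is given below, with the competence curve shaped to echo Dr Baird’s description (onset after a few days, a peak within a week or two, then a slow decline). All rates are invented for illustration and are not the fitted values from the Ecology paper.

```python
# Fraction of a larval cohort that is both alive and competent to settle at
# time t: exponential survival multiplied by a lagged rise-then-slow-decay
# competence curve. Parameters are illustrative assumptions only.
import math

def survival(t_days: float, mortality_per_day: float = 0.05) -> float:
    """Fraction of the cohort still alive at time t."""
    return math.exp(-mortality_per_day * t_days)

def competence(t_days: float, onset: float = 4.0, loss_per_day: float = 0.01) -> float:
    """Fraction of survivors able to settle: zero before onset, then a rise
    followed by a very gradual loss, giving the long tail Baird observed."""
    if t_days < onset:
        return 0.0
    age = t_days - onset
    return (1.0 - math.exp(-age)) * math.exp(-loss_per_day * age)

for t in (2, 7, 14, 50, 100):
    print(f"day {t:>3}: {survival(t) * competence(t):.4f} of cohort can still settle")
```

With these made-up rates, most settlement potential is concentrated in the first couple of weeks (near home), yet a tiny fraction of “stellar performers” can still settle at day 100, the dual pattern the new models are built to capture.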

Their paper, “Estimating dispersal potential for marine larvae: dynamic models applied to scleractinian corals,” by Sean R Connolly and Andrew H Baird, appears in the journal Ecology, vol. 91, 2010.


Monk Seal, Hump-backed Dolphin Threatened Off Coast Of Mauritania

Catalan researchers have studied the marine trophic network off Mauritania, on the northwest coast of Africa, an extremely heavily exploited fishing area that is also home to two of the world’s most threatened species of marine mammal: the monk seal and the Atlantic hump-backed dolphin. The results of the study show that industrial and traditional fishing activities along the coast are putting these mammals and local marine ecosystems in great danger.

The researchers studied the local marine trophic network off the northwest coast of Africa and, by analyzing stable carbon and nitrogen isotopes, were able to verify the distribution and trophic position of 13 mammal species, as well as that of other species of macro seaweed, marine plants, fish, molluscs, turtles and phytoplankton, which had never been studied before.
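
Trophic position is commonly inferred from nitrogen isotopes with a simple baseline correction: δ15N rises by a roughly constant enrichment, often taken as about 3.4 per mil, with each step up the food web, so TP = TP_base + (δ15N_consumer - δ15N_base) / enrichment. The sketch below shows that standard calculation with invented numbers; it is not claimed to be the exact method or data of this study.

```python
# Standard delta-15N trophic position estimate. The enrichment factor of
# 3.4 per mil per trophic level is a widely used convention; the sample
# values below are hypothetical, not measurements from the paper.

def trophic_position(d15n_consumer: float, d15n_base: float,
                     tp_base: float = 2.0, enrichment: float = 3.4) -> float:
    """Trophic position relative to a baseline organism (e.g., a primary consumer)."""
    return tp_base + (d15n_consumer - d15n_base) / enrichment

# Hypothetical example: a predator tissue at 16.0 per mil against a
# primary-consumer baseline of 9.2 per mil -> trophic position 4.0.
print(f"estimated trophic position: {trophic_position(16.0, 9.2):.1f}")
```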

The monk seal (Monachus monachus) and the Atlantic hump-backed dolphin (Sousa teuszii) are “the most coastal species of the whole area studied, and are the only ones occupying this marine ecosystem”, Ana M. Pinela, lead author of the study and a researcher at the University of Barcelona (UB), tells SINC.

The Portuguese scientist says this area, which is so “extremely” over-exploited by both industrial and traditional fishing, “should be a conservation priority for these species, which are important for biodiversity. If they disappear, it would be hard for others to take their place”. This would cause a “serious” imbalance “at all levels” in local coastal ecosystems, which would be left without two super-predators that are “essential” for them to function properly.

The study, which has been published in Marine Ecology Progress Series, shows the importance of the predators Monachus monachus and Sousa teuszii for the proper functioning of coastal ecosystems in Mauritania.

The killer whale (Orcinus orca), which is also present in Mauritania, feeds at the same trophic level as the monk seal, meaning it feeds on fish, and not on marine mammals as it does off some other coasts. However, “its range is much more pelagic (of open oceans)”, says Pinela, who has described the offshore ecotype of the killer whales in this region for the first time.

“Unprecedented” fishing exploitation

“Mauritania contains some of the most heavily-exploited fisheries habitats in the whole world, with one of the world’s largest fisheries stocks, which is subject to very little regulation, inspection or control”, explains the researcher. Two of the world’s most threatened species live off this coast: the monk seal, which is on the verge of extinction, and the hump-backed dolphin, which has a very limited geographic range and is little known.

The scientists say conservation of these coastal areas should be “a priority”. “The Mauritanian Government and international agencies should more strongly monitor both industrial and traditional fisheries exploitation both in deep sea and coastal areas”, says Pinela. Over-fishing and the over-exploitation of resources limits the availability of prey for these species and damages ecosystems.

The research team is calling for “more diligent” regulation of such fishing practices, monitoring and control of the fishing fleet, “be it Mauritanian or international”, and the implementation of sustainable fishing off the north west of Africa.

In addition, estimates should be made of fish abundance and catches in the entire study area, because it is “a biodiversity hotspot that is home to a great diversity of marine mammals”, concludes Pinela.

References: Pinela, A.M.; Borrell, A.; Cardona, L.; Aguilar, A. “Stable isotope analysis reveals habitat partitioning among marine mammals off the NW African coast and unique trophic niches for two globally threatened species”. Marine Ecology Progress Series 416: 295-306, 2010

Image Caption: The monk seal and hump-backed dolphin are threatened by fishing activities off the coast of Mauritania. Credit: Alex Aguilar

On the Net:

Fossil Find Solves Dinosaur Sex Riddle

The discovery of an ancient fossil, nicknamed ‘Mrs. T’, found with a fossilized egg in a Jurassic rock bed dating back 160 million years, has allowed scientists for the first time to sex pterodactyls.

The fossil was found in the Jurassic sedimentary rocks of China’s northeastern Liaoning Province and shows a nearly complete skeleton of a heavy-hipped female Darwinopterus and her egg. The find provides scientists with the first direct evidence for gender in these extinct reptilian fliers, showing that females were crestless and solving a long-debated issue of what some pterosaurs did with their spectacular head crests. It suggests that the males used the crest as a showy display.

The discovery was made by an international team of British and Chinese researchers from the University of Leicester, the University of Lincoln and the Geological Institute in Beijing. Details of the extraordinary find are published in the journal Science.

Pterosaurs were warm-blooded, winged reptiles that flew during the age of the dinosaurs, between 220 and 65 million years ago, and resembled a cross between a stork and a bat.

The researchers said that the hawk-sized Mrs. T displays wider hips than males, and likely lacked the same distinct head crest that could be seen in males.

David Unwin, a paleobiologist in the Department of Museum Studies at the University of Leicester, said the discovery of a mother together with her egg is “incredibly rare,” and a first for pterosaurs.

“Certainly if somebody had said to me a few years ago, ‘What do you think about the chance of finding a pterosaur preserved together with an egg?’ I just would have said ‘You’re crazy,'” Unwin told AFP.

The discovery will help researchers better understand and identify previously found fossils as either male or female, especially in Darwinopterus, where both sexes were about the same size.

“She has relatively large hips, to accommodate the passage of eggs, but no head crest,” Unwin said in a statement. “Males, on the other hand, have relatively small hips and a well developed head crest. Presumably they used this crest to intimidate rivals, or to attract mates such as Mrs. T.”

Many of her other traits match those of 10 similar specimens found in the same rock formation — an elongated neck and tail, a protruding beak-like mouth lined with slender teeth, and an extended fifth toe.

The researchers who studied the fossil, which was first identified in 2009, gathered clues that showed how the young creature’s life came to a sudden violent end.

Judging by the visible bone break near her wing, some kind of sudden accident, such as a volcanic eruption or a severe storm, must have fractured the forearm, making it impossible for her to fly, they said.

Evidence shows that she plunged into a body of water and drowned. As her body grew waterlogged, it sank to the bottom. The egg, which looked like it was days from being laid, was pushed out of her decaying body.

“The association of the egg with the skeleton, at least in pterosaurs, is unique. We have no other similar sort of specimen,” Christopher Bennett, an expert at Fort Hays State University in Kansas who reviewed the research, told AFP.

Bennett said the creature might have had some sort of head markings, though it was likely not the same flashy display grown by males. “I am not entirely convinced that this individual didn’t have any crest. It is possible that it was a young adult and a bony crest had not yet grown,” he added.

Nonetheless, the fossil is a “nice specimen” and shows that “females had a pelvis that was deeper and had a larger opening,” said Bennett.

Unwin, writing in the journal Science, said: “Mrs. T’s egg is relatively small and had a soft shell. This is typical of reptiles, but completely different from birds which lay relatively large hard-shelled eggs. This discovery is not surprising though, because a small egg would require less investment in terms of materials and energy, a distinct evolutionary advantage for active energetic fliers such as pterosaurs and perhaps an important factor in the evolution of gigantic species such as the 10 meter wingspan Quetzalcoatlus.”

“Gender is one of the most fundamental of biological attributes, but extremely difficult to pinpoint with any certainty in the fossil record. Being able to sex pterosaurs is a major step forward. Finally, we have a good explanation for pterosaur head crests, a problem that has puzzled scientists for more than 100 years. Now, we can exploit our knowledge of pterosaur gender to research entirely new areas such as population structure and behavior. We can also play matchmaker for pterosaurs, bringing back together long-separated males and females in the single species to which they both belong,” he concluded.

Image 1: This graphic shows the sex-related features of Darwinopterus. The male (right) has a large head crest, but this is absent in the female (left). Credit: Mark Witton

Image 2: “Mrs. T” is a female Darwinopterus (wingspan 0.78 m) preserved together with her egg. Credit: L Junchang, Institute of Geology, Beijing

Image 3: This is a close up of the egg (20 by 28 mm) preserved together with Mrs. T, a female Darwinopterus. Credit: L Junchang, Institute of Geology, Beijing

On the Net:

New Alzheimer’s Insights

(Ivanhoe Newswire) — Two studies released in this week’s JAMA provide new insights into Alzheimer’s disease (AD).

The first insight came when researchers used molecular imaging to identify a biomarker commonly linked to Alzheimer’s. This biomarker, called beta-amyloid, has traditionally been detectable only during autopsies of AD patients, but may now be observed in living patients.

Doctors would be better able to diagnose and eventually treat AD if they could track such biomarkers noninvasively as the disease progresses. Between 10 and 20 percent of patients clinically diagnosed with AD turn out to lack AD pathology at autopsy, and 33 percent of patients with mild symptoms go undiagnosed.

To observe these biomarkers in living patients, the researchers experimented with different types of positron emission tomographic (PET) imaging tests. The most promising looked to be florbetapir F 18 PET, which uses a chemical that binds to beta-amyloid to help identify it. Until this study, however, the direct relationship between florbetapir-PET images and actual beta-amyloid pathology had not been established.

Avid Radiopharmaceuticals’ Christopher M. Clark, M.D., and his associates tested the efficacy of florbetapir F 18 PET in predicting the presence of beta-amyloid in the brain at autopsy. They collected PET images of 35 patients nearing the end of life at various health care centers, along with images from a control group of 74 younger individuals, aged 18 to 50, presumed to be free of beta-amyloid. “Florbetapir-PET images and postmortem results rated as positive or negative for beta-amyloid agreed in 96 percent of the 29 individuals in the primary analysis cohort,” the researchers wrote.

In the younger group, all florbetapir-PET images were rated amyloid negative.

The researchers believe that impaired functionality may partially result from the patient’s ability or inability to tolerate accumulated amyloid in their brain. Genetics, lifestyle habits, environmental risks and neuropathological comorbidities may determine this.

The second study also involved beta-amyloid: older adults with lower blood levels of the beta-amyloid 42/40 protein fragments were found to be more likely to suffer cognitive decline over a nine-year period. Lower education and literacy compounded this trend, putting such individuals at a higher risk of developing dementia.

University of California, San Francisco’s Kristine Yaffe, M.D., who also works with the San Francisco Veterans Affairs Medical Center, collaborated with a team of colleagues to look at the relationship between plasma beta-amyloid 42/40 accumulation and cognitive decline in older adults who did not suffer from dementia. They also tried to discern whether or not “cognitive reserve,” or levels of education and literacy, contributed to prevention.

Dr. Yaffe and her associates observed beta-amyloid 42/40 levels in 997 elderly adults who were registered in the Health ABC study, a prospective observational study that initially began in 1997-1998 and included a 2006-2007 follow-up. They found a strong connection between low beta-amyloid 42/40 levels and cognitive decline over nine years. Plasma beta-amyloid 42 was also strongly associated.

As for cognitive reserve, the association between beta-amyloid 42/40 level and cognitive decline was significant in individuals with a “low reserve” (an education level below high school graduation and a literacy level of sixth grade or below), but not in those with a “high reserve.”

“These results are important, as the prevalence of cognitive impairment is increasing exponentially and prevention will be crucial. To identify those at risk of dementia, biomarkers like plasma beta-amyloid levels that are relatively easy to obtain and minimally invasive could be useful. In addition, our finding of an interaction of cognitive reserve with the association of plasma beta-amyloid level and cognitive decline could have public health importance because it may suggest pathways for modifying beta-amyloid effects on cognition,” the authors of the study wrote.

Source: JAMA, January 2011

Internet Linked To Teen Depression

A new study finds that too much, or too little, Internet use may be linked to depression among teens. 

The research showed that teens who spent no time online, as well as those who were heavy Internet users, were at increased risk of depression symptoms.

Dr. Pierre-Andre Michaud and colleagues at the University of Lausanne in Switzerland surveyed 7,200 participants aged 16 to 20 about the frequency of their Internet use.

Those who said they were online more than two hours per day were classified as “heavy” Internet users, while those who were online from several times per week to two hours daily were classified as “regular” users.

The participants also answered several health-related questions, including some about “depressive tendencies” that assess how often a person feels hopeless or sad.

The results showed that teens who were either heavy Internet users or non-users were more likely to be depressed or very depressed than those who were “regular” Internet users.

Among male participants, heavy Internet users and non-users were both one-third more likely to have a high depression score, compared with “regular” users.  Among girls, heavy Internet users had an 86 percent greater chance of depression compared with regular users, while non-users had a 46 percent greater chance.
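
To unpack what an “86 percent greater chance” means in absolute terms, here is a small illustration in Python. The baseline rate below is a made-up figure, since the article reports only the relative comparisons.

    # Illustration only: converting "X percent greater chance" into rates.
    # The 10 percent baseline is a hypothetical figure, not from the study.
    p_regular = 0.10                # assumed depression rate, regular users
    print("heavy female users:", p_regular * 1.86)  # 86% greater -> 0.186
    print("female non-users:  ", p_regular * 1.46)  # 46% greater -> 0.146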

Nevertheless, average depression scores among non-users, regular users and heavy users were all near the lower end of the scale — between 1 and 2 on a scale of 1 to 4, with 1 being “not depressed at all.”

A separate study published last year, in which pediatricians in Sweden were asked to estimate rates of mental health disorders in young patients, found average teen depression rates of around 1.4 percent.

The researchers said it was not yet clear why both heavy Internet use and non-use were linked to higher depression risks in teens. 

Since many teens use the Internet to connect with friends, perhaps those who are never online may be more socially isolated, they speculated.

As for heavy Internet use, prior studies have found links to depression symptoms, although the underlying reasons remain unclear.

One study of Taiwanese teens found that depression symptoms typically preceded kids’ heavy Internet use, noted Michaud and his colleagues.

The current study found that certain other health concerns were also more common among heavy Internet users. For instance, 18 percent of males who were heavy Internet users were overweight, compared with just 12 percent of regular Internet users.  And in female teens, 59 percent of heavy Internet users were sleep-deprived, compared with 35 percent of regular users.

As with the other findings, the reason for these relationships is unclear.

It may be that some teenage girls exchange bedtime for online time, or that sedentary computer time may result in weight gain among boys, the researchers speculated.

Michaud and his team concluded that either excessive time online, or little to no time online, could be indicators that a teen is having problems.

However, regular Internet use — up to two hours per day in this study — appears to be “normal”.

The researchers noted that the study was conducted in 2002, before the ubiquity of social media sites such as Facebook and Twitter.  Many teenagers now spend much more time online, which could alter the definition of “normal” Internet use.

The study was published online January 17 in the journal Pediatrics.

On the Net:

Tutankhamen’s Tomb To Be Closed To Visitors

Time is running out to visit the tomb of Tutankhamen, as officials with Egypt’s Supreme Council of Antiquities have announced plans to close it to tourists by the end of the year.

The tomb, which was discovered some 89 years ago, has been damaged as a result of the many visitors it has received, particularly over the past three decades, according to a Monday report in the Australian newspaper The Sunday Times. Instead, visitors will be directed to a soon-to-be-created replica of the tomb in Luxor, while the original will be closed down for preservation purposes.

“There’s no alternative. Closing the tombs is the only way to preserve them,” Dr. Zahi Hawass, Egypt’s antiquities chief, told The Sunday Times. “People’s respiration, the humidity they bring into the tombs, their sweat, the fact that they use flashes when taking pictures–all this damages the tombs.”

“If I don’t build this ‘Valley of the Replicas’, the originals will be destroyed in less than 100 years,” Dr. Hawass added. “That would be a disaster for history.”

The replica of Tutankhamen’s tomb should be ready within a year’s time, Dr. Hawass said. Currently, the Supreme Council of Antiquities is attempting to raise funds from private donors for the project, which according to The Sunday Times will cost approximately $10 million.

According to Mike Pitts, Editor of British Archaeology, it was only a matter of time before this happened.

“There can be no disputing the problem,” Pitts said in an article written for the Guardian on Monday, noting that once Howard Carter discovered the tomb in 1922, “It was recognized immediately that the sterile environment had been compromised: Carter’s chemist found ‘air-infections’ the day after they broke in. Those were nothing, however, compared with the humidity, fungi and dust wafted through the tomb by a thousand or more visitors a day.”

“Cue staining, crumbling and erosion of the paint. Short of sand-blasting it, you would be hard pushed to devise a more efficient mechanism for destroying the 3,300-year-old art,” he said, adding that the problem “affects ancient sites around the world… Heritage tourism may be good for economies but, badly managed, it harms the heritage. It’s right that our access should be controlled.”

Pitts added that advances in technology should make the replica of Tutankhamen’s tomb “visually indistinguishable from the original; and you can see it with better lighting and access.”

“Indeed, the replication process is so precise, it brings new insights to the original, helping academics and tour guides alike. No, it’s not the real tomb. But it is a real facsimile, and when you visit you will become part of a cutting-edge research project,” he added.

On the Net:

Plasma Exchange Effective In Treating Severe MS Relapses, Neuropathies

A new guideline from the American Academy of Neurology recommends using plasma exchange to treat people with severe relapses in multiple sclerosis (MS) and related diseases, as well as those with certain kinds of nerve disorders known as neuropathies. The guideline is published in the January 18, 2011, print issue of Neurology®, the medical journal of the American Academy of Neurology.

Plasma exchange, formally known as plasmapheresis, is the process of taking blood out of the body, removing constituents in the blood’s plasma thought to be harmful, and then transfusing the rest of the blood (mainly red blood cells) mixed with replacement plasma back into the body.

The guideline recommends doctors consider using plasma exchange as a secondary treatment for severe flares in relapsing forms of MS and related diseases. The treatment was not found to be effective for secondary progressive and chronic progressive forms of MS.

According to the guideline, doctors should offer plasma exchange for treatment of severe forms of Guillain-Barré syndrome and for temporary treatment of chronic inflammatory demyelinating polyneuropathy. Plasma exchange may also be considered for treatment of some other kinds of inflammatory neuropathies.

“These types of neurologic disorders occur when the body’s immune system mistakenly causes damage to the nervous system. Plasma exchange helps because it removes factors in the plasma thought to play a role in these disorders,” said guideline lead author Irene Cortese, MD, a neurologist with the National Institutes of Health in Bethesda, Md., and a member of the American Academy of Neurology.

The guideline authors also looked at the use of plasma exchange for other neurologic disorders, including myasthenia gravis and pediatric autoimmune neuropsychiatric disorders associated with streptococcal infections (PANDAS), but there was not enough evidence to determine whether it is an effective treatment.

Side effects of plasma exchange include infection and blood-clotting issues.

On the Net:

Israel, US May Be Responsible For Iranian Cyberattack

A computer worm was used by officials in the US and Israel in an attempt to sabotage Iran’s attempts to create an atomic bomb, the New York Times first reported on Sunday.

According to an AFP article that followed the original Times story, the New York-based newspaper cited intelligence and military experts as claiming that the worm, known as Stuxnet, successfully shut down one-fifth of Iran’s nuclear centrifuges back in November.

The testing reportedly took place at “the heavily guarded Dimona complex in the Negev desert housing the Middle East’s sole, albeit undeclared nuclear weapons program,” the French news agency said. “Experts and officials told the Times the effort to create Stuxnet was a US-Israeli project with the help, knowingly or not, of Britain and Germany.”

AFP reports that there had previously been speculation that Israel was behind a Stuxnet worm attack that targeted Iranian computers, and officials in the Middle Eastern nation have blamed the Israelis and the Americans for the deaths of two nuclear scientists in November and January.

Tehran officials claim that their uranium enrichment program is seeing strong progress, while US officials have been among those stating that they believed the program has suffered serious setbacks, reporters from the wire service said.

According to Ewen MacAskill of The Guardian, because of the cyberattack, “The chances of a military strike against Iran this year are receding.”

“Last year, rumors of military action began to be heard louder round Washington, with diplomats and officials warning that this year would be the year of decision on whether to launch a military strike. But the mood has changed,” MacAskill wrote on Sunday, adding that an official told his newspaper “that the military option is now less likely, citing not only the cyberattack, but also the synchronized assassination last year of two Iranian nuclear scientists, attributed to Israel.”

Meanwhile, some Russian scientists are expressing concern over the damage Stuxnet may have caused to Iran’s nuclear reactors. According to Con Coughlin of the Telegraph, Russian nuclear experts assisting with technical aspects of Iran’s program fear that the “extensive damage caused to the plant’s computer systems” because of the worm could result in “‘another Chernobyl’ if they were forced to comply with Iran’s tight deadline to activate the complex this summer.”

“Russian scientists working at the plant have become so concerned by Iran’s apparent disregard for nuclear safety issues that they have lobbied the Kremlin directly to postpone activation until at least the end of the year, so that a proper assessment can be made of the damage caused to its computer operations by Stuxnet,” Coughlin added.

On the Net:

‘Master Switch’ Found For Inflammatory Diseases

Scientists have discovered a protein that acts like a “master switch” determining whether white blood cells will boost or dampen inflammation, a discovery that could help researchers find new drugs, or possibly even a cure, for rheumatoid arthritis.

Many patients with rheumatoid arthritis are treated with a class of medications called tumor necrosis factor (TNF) inhibitors made by a number of drug companies including Abbott Laboratories, Merck, Pfizer and Amgen.

But as many as 30 percent of those patients do not respond well to anti-TNF drugs, so experts say it is imperative to develop more widely effective treatment options for the debilitating condition.

In the study, published in the journal Nature Immunology on Sunday, scientists from Imperial College London found that the protein IRF5 acts as a molecular switch that controls whether white blood cells — known as macrophages — will promote or inhibit inflammation.

The scientists said the results suggest that blocking the production of IRF5 in macrophages could prove to be a valuable way of treating a wide range of autoimmune diseases, such as rheumatoid arthritis, inflammatory bowel disease, lupus and multiple sclerosis.

They also suggest that boosting IRF5 levels could help treat people whose immune systems are compromised or damaged.

“Our results show that IRF5 is the master switch in a key set of immune cells, which determines the profile of genes that get turned on in those cells,” Dr. Irina Udalova from the Kennedy Institute of Rheumatology at Imperial College London, the senior researcher on the study, said in a statement.

“This is really exciting because it means that if we can design molecules that interfere with IRF5 function, it could give us new anti-inflammatory treatments for a wide variety of conditions,” Udalova said.

“Diseases can affect which genes are switched on and off in particular types of cells. Understanding how this switching is regulated is crucial for designing targeted strategies to suppress unwanted cell responses,” she said.

The researchers said IRF5 seems to work by switching on genes that stimulate inflammatory responses and dampening genes that inhibit them. It can do this either by interacting with DNA directly, or by interacting with other proteins that themselves control which genes are switched on, the researchers explained.

Udalova’s team is now studying how IRF5 works at the molecular level and which proteins it interacts with so they can design ways to block its effects.

Rheumatoid arthritis affects about 1 percent of the world’s population and arises when the immune system mistakenly attacks joints all over the body. It may also affect the skin, heart, lungs, kidneys and blood vessels. Many who suffer get deformed hands and feet, which affects movement and the ability to function normally.

On the Net:

Blueberries Can Help Reduce High Blood Pressure

Eating blueberries can guard against high blood pressure, according to new research by the University of East Anglia and Harvard University.

High blood pressure, or hypertension, is one of the major cardiovascular diseases worldwide. It leads to stroke and heart disease and costs more than $300 billion each year. Around a quarter of the adult population is affected globally, including 10 million people in the UK and one in three US adults.

Published next month in the American Journal of Clinical Nutrition, the new findings show that bioactive compounds in blueberries called anthocyanins offer protection against hypertension. Compared with those who do not eat blueberries, those eating at least one serving a week reduce their risk of developing the condition by 10 percent.

Anthocyanins belong to the bioactive family of compounds called flavonoids and are found in high amounts in blackcurrants, raspberries, aubergines, blood orange juice and blueberries. Other flavonoids are found in many fruits, vegetables, grains and herbs. The flavonoids present in tea, fruit juice, red wine and dark chocolate are already known to reduce the risk of cardiovascular disease.

This is the first large study to investigate the effect of different flavonoids on hypertension.

The team of UEA and Harvard scientists studied 134,000 women and 47,000 men from the established Harvard cohorts, the Nurses’ Health Study and the Health Professionals Follow-up Study, over a period of 14 years. None of the participants had hypertension at the start of the study. Subjects were asked to complete health questionnaires every two years and their dietary intake was assessed every four years. Incidence of newly diagnosed hypertension during the 14-year period was then related to consumption of various different flavonoids.

During the study, 35,000 participants developed hypertension. Dietary information identified tea as the main contributor of flavonoids, with apples, orange juice, blueberries, red wine, and strawberries also providing important amounts. When the researchers looked at the relation between individual subclasses of flavonoids and hypertension, they found that participants consuming the highest amounts of anthocyanins (found mainly in blueberries and strawberries in this US-based population) were eight percent less likely to be diagnosed with hypertension than those consuming the lowest amounts. The effect was even stronger in participants under 60.

The effect was strongest for blueberry rather than strawberry consumption. Compared to people who ate no blueberries, those eating at least one serving of blueberries per week were 10 percent less likely to become hypertensive.

“Our findings are exciting and suggest that an achievable dietary intake of anthocyanins may contribute to the prevention of hypertension,” said lead author Prof Aedin Cassidy of the Department of Nutrition at UEA’s Medical School.

“Anthocyanins are readily incorporated into the diet as they are present in many commonly consumed foods. Blueberries were the richest source in this particular study as they are frequently consumed in the US. Other rich sources of anthocyanins in the UK include blackcurrants, blood oranges, aubergines and raspberries.”

The next stage of the research will be to conduct randomized controlled trials with different dietary sources of anthocyanins to define the optimal dose and sources for hypertension prevention. This will enable the development of targeted public health recommendations on how to reduce blood pressure.

‘Habitual intake of flavonoid subclasses and incident hypertension in adults’ by A Cassidy (UEA), E O’Reilly (Harvard), C Kay (UEA), L Sampson (Harvard), M Franz (Harvard), J Forman (Harvard), G Curhan (Harvard), and E Rimm (Harvard) will be published in the February 2011 edition of the American Journal of Clinical Nutrition.

On the Net:

CDC Outlines Racial Inequalities In US Health

US health officials released a report this week detailing the “inequities across race, sex and income levels with respect to how likely a person is to be sick or healthy,” according to a recent AFP report.

Healthcare access, exposure to environmental hazards, and behavioral risk factors varied greatly according to ethnicity, gender and social class, according to the report from the Centers for Disease Control and Prevention (CDC).

“I don’t think we saw any surprises necessarily,” Leandris Liburd, director of the CDC office of minority health and health equity, told AFP.

“But what the report does provide that we haven’t seen before is a really detailed analysis of the 22 topic areas,” she said, adding that it highlighted the “disproportionate burden” faced by certain groups when it comes to a variety of health issues.

For example, African American men and women are much more likely to die from heart disease or stroke than Caucasians.

Hispanic teens have a pregnancy rate five times higher than that of Asian teens, and families living near the poverty level are more likely to include smokers than higher-income families.

Native Americans and Alaska Natives have the highest death rate in automobile accidents, at 29 deaths per 100,000 people.

And across the board, men are four times as likely as women to commit suicide.

The report found that high blood pressure is more frequent among blacks (42 percent) than whites (29 percent).

Drug use deaths are the highest among non-Hispanic whites and lowest among Asians and Pacific Islanders.

Also, the rate of preventable hospitalization goes up as incomes fall.

The CDC said $6.7 billion per year in healthcare costs could be avoided if these disparities were eliminated.

“Better information about the health status of different groups is essential to improve health,” said CDC chief Thomas Frieden. “This first of its kind analysis and reporting of recent trends is designed to spur action and accountability at the federal, tribal, state and local levels to achieve health equity in this country.”

On the Net:

Johannes Kepler ATV Readying For Launch

ATV-2 is almost ready for launch on 15 February from Europe’s Spaceport. It will be the heaviest load ever lofted into space by the Ariane 5 rocket, making the 200th flight of the European launcher even more spectacular.

ESA’s latest Automated Transfer Vehicle space ferry, named after the German astronomer and mathematician Johannes Kepler, is now fully fuelled, its oxygen tanks are filled and most of the cargo from ESA and NASA is placed inside.

Only last-minute cargo of up to 400 kg will be added two weeks before launch using a special access device. 

While the first ATV in 2008 performed a series of demonstrations on its way to the International Space Station (ISS), Johannes Kepler will head directly to its destination.

The planned journey includes some extra days to allow for possible delays, but the docking has to take place on 26 February to meet the busy ISS schedule.

Docking automatically, but controlled from Toulouse

ATV will navigate, fly and dock to the Station automatically, but it will be monitored and commanded from the ATV Control Centre (ATV-CC) in Toulouse, France. Despite its mass of about 20 tons, the ferry can maneuver itself to within a few centimeters.

During the docking, ESA astronaut Paolo Nespoli will stand by ready to interrupt the approach if necessary. ATV carries several separate systems to detect potential problems and to ensure the safety of the Station and its crew at all times.

The links between the ATV-CC, ATV, Ariane and control center at Europe’s Spaceport in Kourou, French Guiana, will be tested twice in realistic launch simulations, on 4 and 11 February.

The Ariane 5 ES vehicle is already assembled in Kourou and the ATV will be attached on top on 20 January, beginning 20 days of combined operations with the Ariane and ATV teams.

The launch window will open for four days from 15 February.

Express delivery service

ATV-2 will carry more to the Space Station than Jules Verne, the first ATV, delivered in 2008. Several upgrades permit Johannes Kepler to ferry a full propellant load of almost 5 tons. All the cargo (liquid, gas and dry goods) totals 7.5 tons.

At the Station, ATV will provide storage and help in adjusting the orbit, performing regular orbit reboosts and avoiding space debris.

After staying for three and a half months at the Station, it will undock before being commanded by ATV-CC to burn up in the atmosphere over an uninhabited area of the southern Pacific Ocean.

Follow the mission of Johannes Kepler on the ATV blog

ESA will follow the launch preparations, flight and docking of the ATV on a special blog opened today.

The blog will cover the mission’s milestones and include technical and operational details, updated with text and video entries from ESA establishments, the launch site in Kourou and the ATV control center in Toulouse.

Image 1: ATV-2 Johannes Kepler. Credits: 2010 ESA / CNES / Arianespace / Photo Optique Vidéo du CSG

Image 2: In anticipation of the vessel’s upcoming launch, scheduled for 15 February, ATV Johannes Kepler is now being ‘tanked up’ with fuel at Europe’s Spaceport in Kourou. Of all the vessels that can deliver cargo to the ISS, ATV can deliver the largest quantity of fuel, up to 5.5 tons maximum. With this fuel, ATV can regularly reboost the Station’s orbit, which suffers a natural decay of 50 to 100 m each day (what goes up must come down, unless reboosted by ATV – Ed.) due to drag caused by traces of atmosphere at the ISS orbital altitude (roughly 400 km). Credits: 2011 ESA / CNES / Arianespace / Photo Optique Vidéo du CSG / S. Martin

Image 3: The 200th Ariane for ATV Johannes Kepler was assembled in December 2010, inside the Spaceport’s high Launcher Integration Building, by positioning the core cryogenic stage over the mobile launch table and mating two large solid rocket boosters with the core stage. ATV-2, now already fuelled and almost ready to go, will be mated with the launcher later this month to start the combined operations between the Ariane and ATV teams, all targeting launch on 15 February 2011. Credits: 2010 ESA / CNES / Arianespace / Photo Optique Vidéo du CSG / P. Baudon

On the Net:

Laughter Can Be Good Medicine For IVF Procedures

An Israeli study suggests that laughter may increase the success rate of pregnancy through in-vitro fertilization (IVF).

The research team, led by Dr. Shevach Friedler, found that the odds of success were greater among women who were entertained by a professional “medical clown” just after the embryos were transferred to their wombs.

According to the findings, published in the journal Fertility and Sterility, 36 percent of the women who received a clown visit became pregnant, versus 20 percent of those who had a comedy-free recovery after embryo implantation.

Dr. Friedler, who led the work, said he got the idea for the study after reading about the potential physiological effects of laughter as a “natural anti-stress mechanism.” “Patients suffering from infertility undergoing IVF are exceptionally stressed,” he noted.

Friedler, who is based at Assaf Harofeh Medical Center in Zrifin, told Reuters Health in an e-mail, “So I thought that this intervention could be beneficial for them at the crucial moments after embryo transfer.”

Friedler’s team had a medical clown visit their fertility clinic periodically over one year. Of the 219 women studied, half underwent embryo implantation on a day the clown was at the clinic. During recovery from the procedure, each woman had a 15-minute visit from the clown, who performed a specific routine created by Friedler.

The researchers found that compared with women who came to the clinic on a “non-clown” day, those who’d had a laugh were more than twice as likely to become pregnant, when age, type of infertility and number of transferred embryos were taken into account.
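
Taking covariates “into account” in this way is typically done with a logistic regression, from which an adjusted odds ratio is read off. The Python sketch below runs on synthetic data; every variable, sample value and coefficient is a hypothetical stand-in, not data from Friedler’s study.

    # Sketch of an adjusted odds-ratio calculation on synthetic data.
    # Nothing here reproduces the study's dataset; it only shows the method.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 219
    clown = rng.integers(0, 2, n)            # 1 = clown visit, 0 = control
    age = rng.normal(33, 4, n)               # hypothetical covariates follow
    primary_infertility = rng.integers(0, 2, n)
    n_embryos = rng.integers(1, 4, n)

    # Simulate outcomes with an assumed true clown effect on the log-odds.
    logit = -1.4 + 0.8 * clown - 0.05 * (age - 33) + 0.3 * (n_embryos - 2)
    pregnant = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    X = sm.add_constant(
        np.column_stack([clown, age, primary_infertility, n_embryos]))
    result = sm.Logit(pregnant, X).fit(disp=False)
    print("adjusted odds ratio, clown visit:", np.exp(result.params[1]))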

Friedler did not have any information on whether other clinics would be sending in the clowns, but added that if studies at other centers back up his findings, other fertility clinics may take up the tactic. “After all,” he noted, “this is one of the least hazardous interventions in our field.”

Clown characters have long been used in medical settings in Israel, the U.S., Canada, Europe and Australia, usually in children’s hospitals. The practice is also gaining some academic backing: the University of Haifa in Israel, for example, recently launched a degree program in “medical clowning.”

On the Net:

Researchers Learn Why PSA Levels Reflect Prostate Cancer Progression

Researchers at the Duke Cancer Institute who have been studying prostate cancer cells for decades now think they know why PSA (prostate-specific antigen) levels reflect cancer progression.

“This is the first demonstration of a mechanism that explains why PSA is a bad thing for a tumor to produce,” said senior author Sal Pizzo, M.D., Ph.D., chair of the Duke Department of Pathology. “I am willing to bet there is also a connection in cancerous cell growth with this particular biological signaling mechanism happening in other types of cells.”

Using human prostate cancer cells in a laboratory culture, the team found that a blood protein reacts with a cell surface receptor called GRP78 on the cancer cells, prompting them to produce more PSA. The PSA arises inside of the cancer cell and then moves outside of the cell, where it can bind with that same protein, called alpha2-macroglobulin (α2M).

The PSA forms a complex with α2M that also binds to the GRP78 receptor, and that activates several key pathways which stimulate cancer cell growth and cell movement and block cell death.

The study bolsters the case for measuring PSA as a marker of tumor progression, as well as for monitoring levels of antibodies directed against GRP78.

“The use of PSA to make the initial diagnosis of prostate cancer has become controversial over the past decade,” Pizzo said. “I personally believe PSA is more useful as a progression marker, particularly with a baseline value on record at the time of the original therapy. A rapidly rising value and/or a very high value is reason for concern. I also believe that monitoring the serum for the appearance of antibodies directed against GRP78 is also a good marker of progression.”

Pizzo said that the findings could yield cancer therapies that block the ÃŽ±2M-PSA complex from stimulating the cell receptor signaling cascade, and that his laboratory is investigating possibilities. He said the findings also might yield new kinds of early-detection tests for prostate cancer.

The study will be published in the Jan. 14 edition of the Journal of Biological Chemistry.

Pizzo credits lead author and signaling pathway expert, biochemist Uma Misra, Ph.D., with deducing that PSA may be involved in a signaling feedback loop that promotes more aggressive behavior in the human prostate cancer cells.

“If you were a cancer cell, you would like to turn on cell growth, turn off the process of death by cell apoptosis and you’d like to be able to migrate, and when α2M binds with the protease PSA molecule, all of that happens,” Pizzo said.

Years ago, Misra discovered the GRP78 receptor on the prostate tumor cell surface, the receptor that binds α2M and the α2M-PSA complex.

“We were surprised to find that this complex binds with the protein GRP78, because we thought the GRP78 molecule only lived deep inside the cell, where it was busy taking improperly folded proteins and helping them to fold properly,” Pizzo said. “It was a surprise to find GRP78 on the cell surface, with other functions. Based on the dogma of the time, we didn’t think that GRP78 could function as a receptor. Even when we identified it, I doubted our findings.”

Pizzo said that since Misra first made the observation about GRP78 working as a receptor, “it has turned into a cottage industry. GRP78 receptors have been discovered on many other cancer cells, including breast, ovary, liver, colon, melanoma and lung cancer cells.”

“This is going to be a generic phenomenon to tumors,” predicted Pizzo, who is also working to learn more about this receptor in other types of cancer cells. “Not all tumors will express GRP78 on their cell surfaces, but when they do, it probably will be a harbinger of a bad outcome.”

“I think we will find that nature favors conservation and it makes sense that the body uses the same types of molecules for different purposes,” Pizzo said. “We are beginning to see more of this in other studies, and I predict we will see many more instances.”

On the Net:

It Takes 2 For Improved Control Of Blood Pressure

Beginning treatment with two medicines gives better and faster results

New British-led research shows that starting treatment of high blood pressure with two medicines rather than one produces better and faster results and fewer side effects, findings that could change clinical practice worldwide.

The study, published in the Lancet, challenges popular medical practice for the treatment of high blood pressure. The research was led by Cambridge in collaboration with the Universities of Dundee, Glasgow and the British Hypertension Society.

Doctors usually start treatment with one medicine and then add others over a period of months, if needed, to control blood pressure. This study shows that it is best to start treatment with two medicines together at the same time – resulting in much faster and better control of blood pressure and surprisingly fewer side effects than with one medicine alone.

The two medicines can be combined into a single pill, so patients still take only one tablet, but one that lowers blood pressure more effectively and with fewer side effects.

Professor Morris Brown, of the University of Cambridge and Addenbrooke’s Hospital, said, “The ACCELERATE study breaks the mould for treating hypertension. Most patients can now be prescribed a single combination pill and know that they are optimally protected from strokes and heart attacks.”

Prof Bryan Williams, of the British Hypertension Society, said, “This study is important and the findings could change the way we approach the treatment of high blood pressure.”

Currently there are almost 10 million people in the UK with high blood pressure and effective treatment is known to substantially reduce the risk of stroke and heart disease.

The investigators believe these important findings could change clinical practice and affect the future treatment of blood pressure for millions of people in the UK.

Professor Tom MacDonald, of the University of Dundee, said: “The research is a great result for patients with high blood pressure. Starting with two medicines is clearly better than starting with one and amazingly there were fewer side effects and not more.”

Gordon McInnes, Professor of Clinical Pharmacology at the University of Glasgow, said: “The results of this trial are of huge importance to doctors and people treated for high blood pressure. Future treatment will be more effective and, since fewer side effects will lead to better acceptance of therapy, many fewer heart attacks and strokes are likely.”

The ACCELERATE study of 1,250 patients with hypertension shows that a new accelerated treatment programme lowers blood pressure faster, more effectively, and with fewer side effects than conventional treatment.

ACCELERATE shows that patients who start treatment with a single tablet containing a combination of drugs will have a 25% better response during the first six months of treatment than patients receiving conventional treatment, and, remarkably, are less likely to stop treatment because of side effects. Still more remarkably, the blood pressure in the conventional treatment arm never caught up with the new treatment arm, even when all the patients in the study were being treated with the same combination of drugs.

The authors suspected that conventional treatment allows the body to partially neutralise each drug, and ACCELERATE was designed to show that the new treatment programme prevents this neutralisation from happening.

ACCELERATE was designed by the British Hypertension Society, which entered a unique partnership with Novartis so that the treatment programme could be tested simultaneously in ten countries on four continents.

Currently, patients with hypertension take many months to have their blood pressure lowered, following guidance to start with a low-dose of one tablet, and gradually increase the dose and number of drugs. This traditional ‘start low, go-slow’ policy is encouraged in order to avoid side effects, but has been shown to delay the protection from strokes which is the main reason for treating hypertension. In the longer-term, patients are also less likely to take their medication if multiple tablets are required.

Professor Graham MacGregor, Chairman of UK charity the Blood Pressure Association, said: “High blood pressure is a major cause of stroke and heart disease, so we welcome new research which could lead to improved control of the condition. Many people being treated for high blood pressure need to take more than one medicine, so a combination medicine like this offers the opportunity to take one tablet instead of two, and may be more cost effective for those paying prescription charges.”

Funded by the British Heart Foundation, the British Hypertension Society Research Network is now doing a similar study with different medicines to be sure these results are generalisable.

On the Net:

New Cholesterol Measurement

(Ivanhoe Newswire) — It has long been known that high levels of high-density lipoprotein (HDL) cholesterol, the “good” kind, are associated with a lower risk of heart disease. Recent studies have asked whether pharmacologic increases in HDL cholesterol levels actually benefit the patient. A new study shows that a different metric, a measure of HDL function called cholesterol efflux capacity, is more closely associated with protection against heart disease than HDL cholesterol levels themselves.

Atherosclerosis typically begins with a build-up of cholesterol along the artery wall. Cholesterol efflux capacity, an integrated measure of HDL function, is a direct measure of how efficiently a person’s HDL removes cholesterol from cholesterol-loaded macrophages (a type of white blood cell), the sort that accumulate in arterial plaque.
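
As a rough sketch of how such an efflux measurement is commonly quantified in the laboratory (the exact protocol in this study may differ): macrophages are loaded with labeled cholesterol, incubated with a patient’s HDL-containing serum, and efflux capacity is the percentage of the label recovered in the medium.

    # A common form of the efflux calculation; a sketch under the stated
    # assumptions, not necessarily the exact protocol of the Penn study.
    def efflux_capacity(label_in_medium: float, label_in_cells: float) -> float:
        """Percent of labeled cholesterol moved out of the macrophages."""
        return 100.0 * label_in_medium / (label_in_medium + label_in_cells)

    print(efflux_capacity(1500.0, 8500.0))  # hypothetical counts -> 15.0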

“Recent scientific findings have directed increasing interest toward the concept that measures of the function of HDL, rather than simply its level in the blood, might be more important to assessing cardiovascular risk and evaluating new HDL-targeted therapies,” Daniel J. Rader, M.D., director of Preventive Cardiology at Penn, was quoted as saying. “Our study is the first to relate a measure of HDL function — its ability to remove cholesterol from macrophages — to measures of cardiovascular disease in a large number of people.”

In this study, Rader and colleagues at Penn measured cholesterol efflux capacity in 203 healthy volunteers who underwent assessment of carotid artery intima-media thickness, a measure of atherosclerosis, as well as in 442 patients with confirmed coronary artery disease and 351 patients without such confirmed disease.

An inverse relationship was seen between cholesterol efflux capacity and carotid intima-media thickness both before and after adjustment for the HDL cholesterol level. After an age- and gender-adjusted analysis, increasing efflux capacity conferred decreased likelihood of having coronary artery disease. This relationship remained after the addition of traditional cardiovascular risk factors, including HDL cholesterol levels, as covariates. Additionally, men and current smokers had decreased efflux capacity.

“The findings from this study support the concept that measurement of HDL function provides information beyond that of HDL level and suggests the potential for wider use of this measure of HDL function in the assessment of new HDL therapies,” Rader said. “Future studies may prove fruitful in elucidating additional HDL components that determine cholesterol efflux capacity.”

SOURCE: New England Journal of Medicine, published online January 11, 2011

Wikipedia Looks To India For Growth

Wikipedia said on Wednesday that it would target India, and possibly Brazil, in seeking to reach its goal of 1 billion users.

Sue Gardner, Executive Director of the San Francisco, CA-based online encyclopedia, said Wikipedia seeks to reach its target within the next five years while maintaining its status as a non-profit organization.

“We don’t move in the world of IPOs and valuation and investment,” said Gardner, who runs the non-profit foundation behind Wikipedia, during an interview with Reuters.

“We never talk about it, we never think about it.”

Wikipedia currently boasts some 410 million unique visitors each month, making it the fifth most-visited Web site in the world.   The organization functions on an operating budget of about $20 million annually, raised primarily through donations.

Its main goal is to attract additional users.  However, with the Chinese market essentially cut off and Western markets maturing, India and Brazil present attractive growth opportunities.

This year, the organization will open its first overseas office, in India, where it will work to increase readership and articles in English and many Indian languages, Gardner said.

Among the 316 events in 104 nations marking Wikipedia’s 10th anniversary, 60 are scheduled in India.

“Our main strategic focus right now is on India and other countries in the developing world. Massive numbers of people are starting to get connected to the Internet, mostly through mobile phones but also through traditional PCs,” Gardner said.

“Brazil is provisionally next,” she said.

Wikipedia has not moved any servers into China because doing so would require agreeing to government limitations on publishing.   Adding any operations in China would be contingent upon guarantees that content would not be censored, Gardner said.

“We made a decision that we weren’t going to collaborate in our own filtering.”

Wikipedia relies on roughly 100,000 regular contributors who work for free, along with members of the general public who write and edit articles in some 270 languages.

“Isn’t it amazing?” Gardner said.

The site currently adds 1,100 articles per day, and now includes some 17 million articles across all its language editions. Wikipedia is also working with 12 universities to add to the quality of articles about public policy, Gardner said.

The organization says its quality control measures effectively prevent malicious or inaccurate edits from surviving for long.

“Over time, people are trusting us more,” said Gardner.

On the Net:

Sounds, Nerve Stimulation Ease Tinnitus Symptoms

Researchers have identified a way to ease tinnitus, or ringing in the ears, by stimulating a nerve in the neck while simultaneously playing certain sounds over an extended period of time.  These measures work together to essentially “reboot” the brain, the scientists said.

Tinnitus affects as many as 23 million U.S. adults, including one in 10 seniors and 40 percent of military veterans.  The hallmark of the condition, which is currently incurable, is often a persistent ringing in the ears that ranges in severity from annoying to debilitating.

Similar to pressing a reset button in the brain, the new therapy was found to help retrain the part of the brain in rats that interprets sound, so that errant neurons reverted back to their original state and the ringing disappeared.

“Current treatments for tinnitus generally involve masking the sound or learning to ignore it,” said Dr. James Battey, director of the National Institute on Deafness and Other Communication Disorders (NIDCD), which funded part of the research.

“If we can find a way to turn off the noise, we’ll be able to improve life substantially for the nearly 23 million American adults who suffer from this disorder.”

Tinnitus is a symptom some people experience as a result of hearing loss. When sensory cells in the inner ear are damaged, such as from loud noise, the resulting hearing loss changes some of the signals sent from the ear to the brain. For reasons that are not fully understood, some people will develop tinnitus as a result.

“We believe the part of the brain that processes sounds, the auditory cortex, delegates too many neurons to some frequencies, and things begin to go awry,” said Dr. Michael Kilgard, associate professor of behavior and brain sciences at UT-Dallas, and a co-principal investigator on the study.

“Because there are too many neurons processing the same frequencies, they are firing much stronger than they should be.”

The neurons also fire in sync with one another, and fire more frequently when it is quiet.

Dr. Kilgard said it is these changing brain patterns that produce tinnitus, which is typically a high-pitched tone in one or both ears, although it may also be perceived as clicking, roaring, or a whooshing sound.

Dr. Kilgard, along with co-principal investigator Dr. Navzer Engineer of MicroTransponder, Inc. and others on the research team, first sought to induce changes in the auditory cortex of a group of rats by pairing stimulation of the vagus nerve, a large nerve that runs from the head and neck to the abdomen, with the playing of a single tone.

When stimulated, the vagus nerve releases acetylcholine, norepinephrine, and other chemicals that help promote changes in the brain. The researchers wanted to see if they could induce more brain cells to become responsive to that tone over a period of time.

For 20 days, 300 times a day, researchers played a high-pitched tone, at 9 kilohertz (kHz), to eight rats.  An electrode simultaneously delivered a very small electrical pulse to the vagus nerve.

They found that the number of neurons tuned to the 9 kHz frequency had increased 79 percent in comparison to the control rats.

In a second group of rats, they randomly played two different tones, one at 4 kHz and the other at 19 kHz, but stimulated the vagus nerve only for the higher tone. The results showed that neurons tuned to the higher frequency increased by 70 percent, while neurons tuned to the 4 kHz tone actually decreased in number. This indicated that the tone alone was not enough to initiate the change, and that it had to be accompanied by vagus nerve stimulation (VNS).

The researchers then tested whether tinnitus could be reversed in noise-exposed rats by increasing the numbers of neurons tuned to frequencies other than the tinnitus frequency.

One group of noise-exposed rats with tinnitus received VNS that was paired with different tones surrounding the tinnitus frequency 300 times a day for about three weeks.   Rats in the control group received VNS with no tones, tones with no VNS, or no therapy at all.

For both groups, measurements were taken four weeks after noise exposure, then 10 days after therapy began, and one day, one week, and three weeks after therapy ended.

The researchers found that rats that received the VNS paired with tones showed promising results for each time point after therapy began, including midway through therapy, indicating that the ringing had stopped for the treated rats.

Conversely, the results from the control rats indicated their tinnitus had continued throughout the testing period. 

The researchers also followed two treated and two control rats for an additional two months, and found that the treated rats maintained this benefit for more than three months after noise exposure, while the controls continued to be impaired.

An analysis of the neural responses in the auditory cortex in these same rats revealed that neurons in the treated rats had returned to and remained at their normal levels, indicating that the tinnitus had disappeared. However, the control group levels continued to be distorted, indicating that the tinnitus had persisted.

The researchers concluded that the VNS treatment combined with tones had not only reorganized the neurons to respond to their original frequencies, but also made the brain responses sharper, decreased excitability, and decreased synchronization of auditory cortex neurons.

“The key is that, unlike previous treatments, we’re not masking the tinnitus, we’re not hiding the tinnitus. We are retuning the brain from a state where it generates tinnitus to a state that does not generate tinnitus. We are eliminating the source of the tinnitus,” said Dr. Kilgard.

VNS is currently being used to treat some 50,000 people with epilepsy or depression, and MicroTransponder hopes to conduct clinical studies using VNS with paired tones in tinnitus patients.

“The clinical protocol is being finalized now and a pilot study in tinnitus patients will be conducted in Europe in the near future,” said Dr. Engineer, vice president of preclinical affairs at MicroTransponder.

“The support of the NIDCD has been essential to allow our research team to continue our work in this important area of tinnitus research.”

In the meantime, the researchers are working to refine the procedure to better understand details such as the most effective number of paired frequencies to use for treatment, how long the treatment should last, and whether the treatment would work equally well for new tinnitus cases in comparison to long-term cases.

The study was published Wednesday in the advance online publication of the journal Nature.  

On the Net:

New Baylor Study Explores How Partners Perceive Each Other’s Emotion During A Relationship Fight

Some of the most intense emotions people feel occur during a conflict in a romantic relationship. Now, new research from Baylor University psychologists shows that how each person perceives the other partner’s emotion during a conflict greatly influences different types of thoughts, feelings and reactions in themselves.

Dr. Keith Sanford, a clinical psychologist and an associate professor in Baylor’s department of psychology and neuroscience, College of Arts and Sciences, and his research team studied 105 college students in romantic relationships as they communicated through different arguments over an eight-week period. Sanford focused on how emotion changed within each person across episodes of relationship conflict. The study demonstrated links between different types of emotion, different types of underlying concern, and different types of perceived partner emotion.

Sanford distinguished two types of negative emotion: “hard” and “soft.” “Hard” emotion is associated with asserting power, whereas “soft” emotion is associated with expressing vulnerability. Sanford’s research also identified one type of underlying concern, “perceived threat,” which involves a perception that one’s partner is being hostile, critical, blaming or controlling. Another type of concern, “perceived neglect,” involves a perception that one’s partner is failing to make a desired contribution or failing to demonstrate an ideal level of commitment or investment in the relationship.

Sanford said the results show that people perceive a threat to their control, power and status in the relationship when they observe an increase in partner hard emotion and they perceive partner neglect when they observe an increase in partner flat emotion or a decrease in partner soft emotion. Both perceived threat and perceived neglect, in turn, are associated with increases in one’s own hard and soft emotions, with the effects for perceived neglect being stronger than the effects for perceived threat.

“In other words, what you perceive your partner to be feeling influences different types of thoughts, feelings and reactions in yourself, whether what you perceive is actually correct,” Sanford said. “In a lot of ways, this study confirms scientifically what we would have expected. Previously, we did not actually know that these specific linkages existed, but they are clearly theoretically expected. If a person perceives the other as angry, they will perceive a threat so they will respond with a hard emotion like anger or blame. Likewise, if a person is perceived to be sad or vulnerable, they will perceive a neglect and will respond either flat or soft.”

The study appeared in the journal Personal Relationships.

Sanford said some of the most interesting results in the study pertain to a complex pattern of associations observed for soft emotion. As expected, partner soft emotion was associated with decreased concerns over neglect, whereas self soft emotion was associated with increased concerns over neglect. Sanford said this is consistent with the idea that soft emotion is a socially focused emotion, often triggered by attachment-related concerns, and that expressions of soft emotion signal one’s own desire and willingness to invest in a relationship.

The study was supported in part by funds from the Faculty Research Investment Program and the Vice Provost for Research at Baylor.

On the Net:

Astronomers Find Close-knit Pairs Of Massive Black Holes

Astronomers at the California Institute of Technology (Caltech), University of Illinois at Urbana-Champaign (UIUC), and University of Hawaii (UH) have discovered 16 close-knit pairs of supermassive black holes in merging galaxies.

The discovery, based on observations done at the W. M. Keck Observatory on Hawaii’s Mauna Kea, is being presented in Seattle on January 12 at the meeting of the American Astronomical Society, and has been submitted for publication in the Astrophysical Journal.

These black-hole pairs, also called binaries, are about a hundred to a thousand times closer together than most that have been observed before, providing astronomers a glimpse into how these behemoths and their host galaxies merge, a crucial part of understanding the evolution of the universe. Although a few similarly close pairs have been seen previously, this is the largest population of such objects observed as the result of a systematic search.

“This is a very nice confirmation of theoretical predictions,” says S. George Djorgovski, professor of astronomy, who will present the results at the conference. “These close pairs are a missing link between the wide binary systems seen previously and the merging black-hole pairs at even smaller separations that we believe must be there.”

As the universe has evolved, galaxies have collided and merged to form larger ones. Nearly every one of these large galaxies, or perhaps all of them, contains a giant black hole at its center, with a mass millions or even billions of times higher than the sun’s. Material such as interstellar gas falls into the black hole, producing enough energy to outshine galaxies composed of a hundred billion stars. The hot gas and black hole form an active galactic nucleus, the brightest and most distant of which are called quasars. The prodigious energy output of active galactic nuclei can affect the evolution of galaxies themselves.

As galaxies merge, so should their central black holes, producing an even more massive black hole in the nucleus of the resulting galaxy. Such collisions are expected to generate bursts of gravitational waves, which have yet to be detected. Some merging galaxies should contain pairs of active nuclei, indicating the presence of supermassive black holes on their way to coalescing. Until now, astronomers have generally observed only widely separated pairs, known as binary quasars, which are typically hundreds of thousands of light-years apart.

“If our understanding of structure formation in the universe is correct, closer pairs of active nuclei must exist,” adds Adam Myers, a research scientist at UIUC and one of the coauthors. “However, they would be hard to discern in typical images blurred by Earth’s atmosphere.”

The solution was to use Laser Guide Star Adaptive Optics, a technique that enables astronomers to remove the atmospheric blur and capture images as sharp as those taken from space. One such system is deployed on the W. M. Keck Observatory’s 10-meter telescopes on Mauna Kea.

The astronomers selected their targets using spectra of known galaxies from the Sloan Digital Sky Survey (SDSS). In the SDSS images, the galaxies are unresolved, appearing as single objects instead of binaries. To find potential pairs, the astronomers identified targets with double sets of emission lines, a key feature that suggests the existence of two active nuclei.

By using adaptive optics on Keck, the astronomers were able to resolve close pairs of galactic nuclei, discovering 16 such binaries out of 50 targets. “The pairs we see are separated only by a few thousands of light-years, and there are probably many more to be found,” says Hai Fu, a Caltech postdoctoral scholar and the lead author of the paper.

“Our results add to the growing understanding of how galaxies and their central black holes evolve,” adds Lin Yan, a staff scientist at Caltech and one of the coauthors of the study.

“These results illustrate the discovery power of adaptive optics on large telescopes,” Djorgovski says. “With the upcoming Thirty Meter Telescope, we’ll be able to push our observational capabilities to see pairs with separations that are three times closer.”

In addition to Djorgovski, Fu, Myers, and Yan, the team includes Alan Stockton from the University of Hawaii at Manoa. The work done at Caltech was supported by the National Science Foundation and the Ajax Foundation.

Image Caption: Three of the newly discovered black-hole pairs. On the left are images from the Sloan Digital Sky Survey. The images on the right show the same galaxies taken with the Keck telescope and the aid of adaptive optics, revealing pairs of active galactic nuclei, which are powered by massive black holes. Credit: S. George Djorgovski

On the Net:

Energy Drinks Don’t Blunt Effects Of Alcohol

By Lisa Chedekel, Boston University

Marketing efforts that encourage mixing caffeinated “energy” drinks with alcohol often try to sway young people to believe that caffeine will offset the sedating effects of alcohol and increase alertness and stamina.

But a new study led by researchers from the Boston University School of Public Health and the Center for Alcohol and Addiction Studies at Brown University has found that the addition of caffeine to alcohol — mixing Red Bull with vodka, for example — has no effect on enhancing performance on a driving test or improving sustained attention or reaction times.

“There appears to be little or no protective benefit from the addition of caffeine to alcohol, with respect to the safe execution of activities that require sustained attention with rapid, accurate decisions,” says the study, published in the February edition of the journal Addiction.

“The results of this study suggest that public education, via media and warning labels, should be considered regarding the safety of CABs [caffeinated alcoholic beverages], and that regulators should scrutinize energy drink and CAB advertising as it relates to promoting safety-related expectancies.”

The study, headed by Jonathan Howland, professor of community health sciences at BUSPH, comes amid increased government scrutiny of energy drinks, particularly when mixed with alcohol. Denmark has banned the sale of energy drinks, and the governments of Canada and Sweden have issued warnings about mixing energy drinks with alcohol.

In 2009, the US Food and Drug Administration issued a statement expressing concern about a lack of safety data on CABs, after survey results showed that the consumption of such beverages correlated with risky behavior among college students.

Howland and his co-authors note that while energy drink companies do not explicitly advertise that their products should be mixed with alcohol, “non-traditional youth-oriented marketing strategies” include claims that such drinks will “enhance attention, endurance, performance, weight loss, and fun, while reducing performance decrements from fatigue from alcohol.”

In the new study, the research team randomized 129 participants, ages 21 to 30, into four groups: one group that consumed caffeinated beer; a second that consumed non-caffeinated beer; a third that consumed caffeinated non-alcoholic beer; and a fourth that consumed non-caffeinated, non-alcoholic beer. Those receiving alcohol attained an average blood alcohol concentration of 0.12 grams percent, somewhat higher than 0.08 grams percent, the legal per se level for driving under the influence.
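
For readers who want the design at a glance, here is a minimal sketch of the study’s two-by-two layout and of how the reported blood alcohol level compares with the legal limit; the group labels are ours, not the authors’ terminology.

    from itertools import product

    # The four beverage arms: alcohol (yes/no) crossed with caffeine (yes/no).
    # Labels are illustrative, not the authors' terminology.
    for alcohol, caffeine in product(("alcoholic", "non-alcoholic"),
                                     ("caffeinated", "non-caffeinated")):
        print(f"group: {caffeine} {alcohol} beer")

    # Mean blood alcohol concentration reached in the alcohol arms, compared
    # with the 0.08 g% per se limit for driving under the influence.
    mean_bac, legal_limit = 0.12, 0.08
    print(f"mean BAC {mean_bac:.2f} g% is {mean_bac / legal_limit:.1f}x the legal limit")

Crossing the two factors is what lets the study separate the effect of caffeine from the effect of alcohol on the same tests.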

Thirty minutes after drinking, the participants were tested on a driving simulator and on a sustained attention/reaction time test.

The results indicate that caffeine does not mitigate the impairment effects of alcohol. On the driving test, the effect of alcohol on performance was significant — but the addition of caffeine did not make a noticeable difference. On the test for sustained attention and reaction times, the addition of caffeine made only a slight difference that the study deemed “borderline significant.”

Howland summed up the study results: “It is important that drinkers understand that adding caffeine to alcohol does not enhance safety.”

Consumption of energy drinks mixed with alcohol has mushroomed since 2001, with some surveys showing that one in four college students report mixing the two. Some studies have found that caffeine reverses alcohol-related performance impairment on tests of reaction time, attention and psychomotor speed, but not on error rates. Other studies have found that caffeine does not significantly impact alcohol-induced impairment of motor coordination.

Howland said the new study was one of the first to provide “a controlled evaluation of the acute effects of caffeine on driving impairment” after drinking to intoxication levels. The institutional review boards of Boston Medical Center, Brown University and the University of Michigan approved the study.

In addition to Howland, researchers on the study include: Damaris J. Rohsenow of the Center for Alcohol and Addiction Studies at Brown University; J. Todd Arnedt of the Sleep and Chronophysiology Laboratory of the University of Michigan Medical School; Daniel J. Gottlieb of the Boston University School of Medicine; and Caleb A. Bliss, Sarah K. Hunt, Tamara Vehige Calise, Timothy Heeren, Michael Winter and Caroline Littlefield, all of BUSPH.

On the Net:

Using Arm Artery for Bypass No Better

(Ivanhoe Newswire) — Using the radial artery (within the forearm, wrist, and hand) doesn’t appear to be superior to the saphenous vein (from the leg) when performing coronary artery bypass grafting (CABG). Using the arm didn’t result in improved angiographic patency (the graft being open, unobstructed) one year after the procedure, according to a new study.

Coronary artery bypass grafting (CABG) is one of the most common operations performed, with a database indicating that in the United States, 163,048 patients had CABG surgery in 2008. The success of CABG depends on the long-term patency of the arterial and venous grafts. Arterial grafts are thought to be more effective than saphenous vein grafts for CABG based on experience with using the left internal mammary (breast) artery to bypass the left anterior descending coronary artery, according to background information in the article. The efficacy of the radial artery graft, which is easier to harvest than other arteries, is less clear. A database shows that more than 10,000 patients in the United States received radial artery grafts in 2008, suggesting that about 6 percent of patients undergoing CABG have radial artery grafts.

Steven Goldman, M.D., of the Southern Arizona VA Health Care System and the University of Arizona Sarver Heart Center in Tucson, and colleagues, compared one-year angiographic patency of radial artery grafts vs. saphenous vein grafts in 757 participants (99 percent men) undergoing elective, first-time CABG. The primary outcome measured was angiographic graft patency at one year after CABG. Secondary outcomes included angiographic graft patency at one week after CABG, heart attack, stroke, repeat revascularization and death.

The analysis included 733 patients (366 in the radial artery group and 367 in the saphenous vein group). The researchers found that there was no significant difference in one-year graft patency between radial artery (238/266; 89 percent) and saphenous vein grafts (239/269; 89 percent).

Also, there was no significant difference in one-week patency between patients who received radial artery grafts (285/288; 99 percent) vs. saphenous vein grafts (260/267; 97 percent) or in the other secondary outcomes. There was no difference in the number and types of adverse events, including serious adverse events.

“Although most clinicians assume that compared with vein grafts, arterial grafts have an improved patency rate, there are little multi-institutional prospective data on radial artery graft vs. saphenous vein graft patency,” the authors write.

SOURCE: JAMA, published online January 12, 2011

Effective Use Of Power In The Bronze Age Societies Of Central Europe

During the first part of the Bronze Age in the Carpathian Basin in Central Europe, a large proportion of the population lived in what are known as tell-building societies. A thesis in archaeology from the University of Gothenburg (Sweden) shows that the leaders of these societies had the ability to combine several sources of power in an effective way in order to dominate the rest of the population, which contributed towards creating a notably stable social system.

Tell-building societies are named after a distinct form of settlements with a high density of population and construction, which over the course of time accumulated such thick cultural layers that they took on the shape of low mounds.

On the basis of a discussion and analysis of previously published material from the Carpathian Basin and new findings from the tell settlement Százhalombatta-Földvár in Hungary, the author of the thesis, Claes Uhnér, describes the ways in which leaders could exercise power. Tell-building societies had relatively advanced economies. The subsistence economy, which was based on agricultural production and animal husbandry, produced a good return, and the societies were involved in regional and long-distance exchange of bronzes and other valuable craft products.

“By exercising a degree of control over these parts of the economy, it was possible for leaders to finance political activities and power-exerting organizations,” says Uhnér. He shows in his thesis that, through military power, leaders were able to control surrounding settlements from fortified tells. As the majority of these settlements were situated next to rivers and other natural transport routes, they could demand tribute from passing trade expeditions and act as intermediaries in the exchange of goods that took place in the region. In addition, a large tell was a manifestation of a successful society with a long history. This situation made it possible for leaders to use the cultural traditions of the society in ideological power strategies.

“The tells served as physical manifestations of a social system that worked well, which legitimized the social position of the elites and their right to lead.” An important conclusion drawn by Uhnér is that the sources of power could be used in strategies where they supported each other. Economic power made it possible to master military and ideological means of power. Military power was utilized to safeguard economic and ideological resources, while ideology legitimized the social system. This was largely possible because the tell settlements served as political power centers. Redistribution of staples and specialized production was attached to these sites, and they had key military and ideological significance. “By controlling tells and the activities carried out in them, leaders had an organizational advantage over the rest of the population, and others found it very difficult to build up competing power positions,” says Uhnér. The thesis has been successfully defended.

On the Net:

Dangers of Smoking in Car with Kids

(Ivanhoe Newswire) — Authors of a new research article say there is enough evidence to support legislation that bans adults from smoking in cars with children.

The researchers conducted their study to determine how harmful secondhand smoke is to children riding in vehicles.

“We hope to show that, though the relevant data are rich and complex, a simple conclusion is possible,” Dr. Ray Pawson, from the University of Leeds in the United Kingdom, writes. “The evidence does not show an absolute risk threshold because a range of environmental, biological and social factors contribute to the risk equation. The evidence does, however, show conditional truths, and the careful enunciation of each contributory condition is the task of public health science.”

Investigators looked at the following factors in determining the risks involved:

*The mixture of chemicals that make up secondhand smoke and its concentration in cars under different conditions such as volume, speed and ventilation

*How long a person would be inside a car

*How long a person would be exposed to secondhand smoke

*The extent of difference between how secondhand smoke affects children and adults

*The health impact of secondhand smoke

“Policy based on science and evidence has to exist amid uncertainty, and this is managed by acknowledging the contingencies,” write the authors. “Thus, because of the confirmed cabin space, and under the worst ventilation conditions, and in terms of peak contamination, the evidence permits us to say that smoking in cars generates fine particulate concentrations that are very rarely experienced in the realm of air-quality studies and that will thus constitute a significant health risk because exposure to smoking in cars is still commonplace, and children are particularly susceptible and are open to further contamination if their parents are smokers.”

The authors conclude that while the evidence is incomplete, there is enough to make a decision to legislate against smoking in cars with children.

SOURCE: Canadian Medical Association Journal, Jan. 10, 2011

Coughs And Sneezes Go The Distance

Little is known about the distances a cough or sneeze can travel, but researchers in Singapore are attempting to find out how airborne transmission of flu viruses takes place using a giant mirror and a high-speed camera.

“It’s really to inform infection control teams, because there is controversy now about which pathogens, e.g. flu, are airborne and if so, how significant this route is compared to others, such as direct contact,” said team leader Julian Tang, a virologist and consultant with Singapore’s National University Hospital, according to Reuters.

By observing in real time a person’s spray of minute liquid droplets when coughing, sneezing, laughing and talking, the scientists hope to produce better guidelines for infection control.

While it is likely a flu sufferer can infect others by coughing or sneezing, how significant is airborne transmission compared with other routes? And what about talking or even laughing? Previous infection control guidelines are mostly based on modeling studies and expert estimates, not hard scientific data.

In their S$1.08 million ($833,000 USD) study, funded by the National Medical Research Council of Singapore, Tang and colleagues designed a large concave mirror, akin to those used in astronomical telescopes. Along with a camera that can capture up to 250,000 frames per second, the scientists can observe the aerosol, or spray, produced by a cough or sneeze across the mirror.

From images seen so far, whistling and laughing appear to spread exhaled droplets very effectively. “However, whether they will lead to infection and disease depends on many other factors, such as virus survival and host immune responses – which other teams are studying,” Tang said.

Using volunteers, Tang and his colleagues will study the velocity and distance of exhaled airflows, or plumes, produced by coughs and sneezes, and even laughing, crying, singing, whistling, talking, snoring and breathing. “We will be studying these other forms of plumes, where possible, as all forms of exhaled jets have the potential to carry infectious agents over greater distances,” Tang said.

They will evaluate interventions such as coughing into a loosely clenched fist, a tissue and different types of face masks to see how effective they are in containing airflows.

“What people do every day, we can visualize in real-time. Studying intervention is very important because we want to know how effective they are,” Tang said. “This may have budgetary implications when planning for the next pandemic.”

Scientists hope that better knowledge of airflows will lead to improved recommendations for infection control, such as how far apart to place hospital beds and what quarantine measures to take when a location is found to be housing a person with an airborne infection such as measles, flu or drug-resistant tuberculosis.

On the Net:

H1N1 Survivors May Hold Key To Universal Vaccine

The search for a universal flu vaccine has received a boost from a surprising source: the 2009 H1N1 pandemic flu strain.

Several patients infected with the 2009 H1N1 strain developed antibodies that are protective against a variety of flu strains, scientists from Emory University School of Medicine and the University of Chicago have found. The results were published online Monday in the Journal of Experimental Medicine.

“Our data shows that infection with the 2009 pandemic influenza strain could induce broadly protective antibodies that are only rarely seen after seasonal flu infections or flu shots,” says first author Jens Wrammert, PhD, assistant professor of microbiology and immunology at Emory University School of Medicine and the Emory Vaccine Center.

“These findings show that these types of antibodies can be induced in humans, if the immune system has the right stimulation, and suggest that a pan-influenza vaccine might be feasible.”

The antibodies isolated from a group of patients who were infected with the 2009 H1N1 strain could guide researchers in efforts to design a vaccine that gives people long-lasting protection against a wide spectrum of flu viruses, say the researchers. Next, the research team is planning to examine the immune responses of people who were vaccinated against the 2009 H1N1 strain but did not get sick.

The research comes from a collaboration between the laboratories of Rafi Ahmed, PhD, at Emory and Patrick Wilson, PhD, at the University of Chicago. Ahmed is director of the Emory Vaccine Center and a Georgia Research Alliance Eminent Scholar. Wilson is assistant professor of medicine at the University of Chicago’s Knapp Center for Lupus and Immunology Research.

Scientists from Columbia, Harvard and the National Institutes of Health (NIH) also contributed to the study, which was funded by the National Institute of Allergy and Infectious Diseases, part of the NIH, and by the American Recovery and Reinvestment Act of 2009.

The nine patients studied were recruited through the Hope Clinic, the clinical division of the Emory Vaccine Center. They had a range of disease severities, from mild illness that waned after a few days to a severe case that required a two-month hospital stay including ventilator support. Most of the participants were in their 20s or 30s. Blood samples were usually taken about 10 days after the onset of symptoms.

The team of researchers identified white blood cells from the patients that made antibodies against flu virus, and then isolated the antibody genes from individual cells. They used the genes to produce antibodies in cell culture — a total of 86 varieties — and then tested which flu strains they reacted against.

Five antibodies isolated by the team could bind all the seasonal H1N1 flu strains from the last decade, the devastating “Spanish flu” strain from 1918 and also a pathogenic H5N1 avian flu strain.

Seasonal flu shots contain three inactivated viral strains, each grown in chicken eggs. Over the last decade, one of the three has typically been an H1N1 strain. However, vaccination with any one H1N1 strain doesn’t usually result in protection against all of them, which is why the 2009 strain could make so many people sick.

Some of the antibodies the team identified stick to the “stalk” region of a viral surface protein called hemagglutinin. Because this part of the virus doesn’t change as much as other regions, scientists have proposed making it the basis of a vaccine that could provide broader protection.

“Previously, this type of broadly protective, stalk-reactive antibody was thought to be very rare,” Wrammert says. “In contrast, in the patients we studied, these stalk-reactive antibodies were surprisingly abundant.”

The team tested whether three of the antibodies they isolated could protect mice against the 2009 H1N1 strain or two other common lab strains. Two antibodies could protect mice against an otherwise lethal dose of any of the three strains, even when the antibody was given 60 hours after infection. However, one antibody only protected against the 2009 H1N1 strain.

The antibody that only reacted to the 2009 H1N1 strain came from the patient with the most severe illness. The antibody genes from that patient suggest a complete lack of preexisting immunity to H1N1 viruses, the authors write. In cases where patients experienced a milder illness, it appears that immune cells that developed in response to previous seasonal flu shots or infections formed a foundation for the response to the 2009 strain.

“The result is something like the Holy Grail for flu-vaccine research,” says study author Patrick Wilson, PhD, assistant professor of medicine at the University of Chicago. “It demonstrates how to make a single vaccine that could potentially provide permanent immunity to all influenza. The surprise was that such a very different influenza strain, as opposed to the most common strains, could lead us to something so widely applicable.”

Additional authors include Dimitrios Koutsonanos, Gui-Mei Li, Srilatha Edupuganti, Megan McCausland, Ioanna Skountzou, Behzad Razavi, Carlos Del Rio, Rama Rao Amara, Youliang Wang, Mark Mulligan, Richard Compans, and Aneesh Mehta from Emory University; Michael Morrissey, Nai-Ying Zheng, Jane-Hwei Lee, Min Huang, Zahida Ali, Kaval Kaur, and Sara Andrews from the University of Chicago; Mady Hornig and Ian Lipkin of Columbia University; Jinhua Sui and Wayne Marasco of Harvard Medical School; and Suman Das, Christopher O’Donnell, Jon Yewdell and Kanta Subbarao of the NIH.

Drs. Ahmed and Wrammert and Emory University are entitled to royalties derived from the sale of products related to the research described in this paper. This study could affect their personal financial status. The terms of this arrangement have been reviewed and approved by Emory University in accordance with its conflict of interest policies.

On the Net:

Experts Call On UN To Prepare For Extraterrestrials

The United Nations (UN) should prepare a course of action, just in case the Earth should ever be contacted by extraterrestrials, scientists say in a new, extraterrestrial-focused edition of the journal Philosophical Transactions of the Royal Society A.

The special issue, according to Guardian Science Correspondent Alok Jha, focuses on “all aspects of the search for extraterrestrial life, from astronomy and biology to the political and religious fallout that would result from alien contact.”

The issue is currently available online and is dated February 13, 2011.

In one article, Professor John Zarnecki of the Open University and Dr Martin Dominik of the University of St. Andrews say that “a lack of co-ordination can be avoided by creating an overarching framework in a truly global effort governed by an international politically legitimated body.” The UN’s Committee on the Peaceful Uses of Outer Space (COPUOS) is ready made for the task, they argue.

According to the organization’s official website, COPUOS was established by the UN General Assembly in 1959 in order to “review the scope of international cooperation in peaceful uses of outer space, to devise programs in this field to be undertaken under United Nations auspices, to encourage continued research and the dissemination of information on outer space matters, and to study legal problems arising from the exploration of outer space.”

In their paper, Zarnecki and Dominik assert that COPUOS member states should begin focusing on potential extraterrestrial affairs as well, following a format similar to their subcommittees already focused on dealing with threats from asteroids and outer space objects, according to Jha.

Other studies included in the Philosophical Transactions of the Royal Society A include a paper by Cambridge University Evolutionary Paleobiology Professor Simon Conway Morris advising those preparing for alien contact to expect a species that shares human-like tendencies “towards violence and exploitation,” and Pacific Lutheran Theological Seminary Professor Ted Peters reflecting upon the impact of extraterrestrial contact on the world’s religions.

“Because our religious traditions formulated their key beliefs within an ancient world view now out of date, would shocking new knowledge dislodge our pre-modern dogmas? Are religious believers Earth-centric, so that contact with ET would de-centre and marginalize our sense of self-importance?” Peters, a professor of systematic theology, wrote, according to the Guardian.

“Do our traditional religions rank us human beings on top of life’s hierarchy, so if we meet ETI who are smarter than us will we lose our superior rank? If we are created in God’s image, as the biblical traditions teach, will we have to share that divine image with our new neighbors?” he adds, noting that he believed traditional theologians would evolve into “astrotheologians” and that “contact with extraterrestrial intelligence will expand the existing religious vision that all of creation… is the gift of a loving and gracious God.”

Image Caption: This shows Kenneth Arnold holding a picture of a drawing of the crescent-shaped UFO he saw in 1947.

On the Net:

Music Can Trigger Dopamine Release

A Montreal study reports that music can trigger the release of dopamine in the brains of those who become euphoric listening to it, just as food, money and psychoactive drugs do.

McGill University researchers in Montreal, Canada recruited eight volunteers aged 19-24 who responded to advertisements requesting people who experienced “chills,” a marker of extreme pleasure, when listening to music.

Listening to their favorite piece of spine-tingling music, the eight volunteers showed a rush of physical activity and also unlocked a release of dopamine in the striatum area of the brain. The effect occurred even in anticipation, before the “chill” peak occurred.

The eight volunteers were put into a positron emission tomography (PET) scanner, which is able to spot a tagged chemical, raclopride, that works on dopamine receptors in brain cells.

A part of the striatum known as the caudate was involved during the anticipation phase. During the peak emotional response, a different striatum area known as the nucleus accumbens was involved.

The results shed light on the exclusive regard that humans have for music, say the researchers. This reward sensation may help explain why in every society music is appreciated and also why appreciation of it is a subjective or cultural thing.

No such dopamine surge was witnessed when volunteers listened to neutral music which, previous tests showed, was known to leave subjects emotionally cold. Researchers consider dopamine to be an early brain chemical that is essential for survival.

Seeking to find out more, the scientists then put the volunteers in a functional magnetic resonance imaging (fMRI) scanner, which highlights flows of blood in the head, thus showing which part of the brain is being activated.

Dopamine dishes out feel-good jolts in response to life-supporting actions such as eating and for acquiring “secondary” tangibles such as money. The mechanism can also be triggered by drugs.

But music is abstract, not directly essential for survival, and not one of these “secondary” or conditioned sources of reward, says the study. “(Abstract) stimuli have persisted through cultures and generations, and are pre-eminent in most people’s lives,” it says.

“Notably, the experience of pleasure to these abstract stimuli is highly specific to cultural and personal preferences, which can vary tremendously across individuals,” the study adds. One possible explanation lies in the emotions invoked by music: “expectations, delay, tension, resolution, prediction, surprise and anticipation,” among others.

The paper, headed by Valorie Salimpoor and Robert Zatorre, is published online in the journal Nature Neuroscience.

On the Net:

Grape Ingredient Resveratrol Increases Beneficial Fat Hormone

Resveratrol, a compound in grapes, displays antioxidant and other positive properties. In a study published this week, researchers at the UT Health Science Center San Antonio describe a novel way in which resveratrol exerts these beneficial health effects.

Resveratrol stimulates the expression of adiponectin, a hormone derived from cells that manufacture and store fat, the team found. Adiponectin has a wide range of beneficial effects on obesity-related medical complications, said senior author Feng Liu, Ph.D., professor of pharmacology and member of the Barshop Institute of Longevity and Aging Studies at the Health Science Center.

Both adiponectin and resveratrol display anti-obesity, anti-insulin resistance and anti-aging properties.

“Results from these studies should be of interest to those who are obese, diabetic and growing older,” Dr. Liu said. “The findings should also provide important information on the development of novel therapeutic drugs for the treatment of these diseases.”

The researchers confirmed the finding in cells and animal models. The study is in the Jan. 7 issue of the Journal of Biological Chemistry.

Previous studies

In July 2009 in the journal Nature, the Barshop Institute and collaborators reported that the compound rapamycin extended life in mice. Rapamycin, like resveratrol, is under scrutiny for its beneficial health effects.

In 2010, Dr. Liu and colleagues announced that resveratrol inhibits activity of the mammalian target of rapamycin (mTOR). This discovery was included in the prestigious Faculty of 1000 (F1000), a service that identifies and evaluates the most important articles in biology and medical research publications. The selection process involves a peer-nominated global “faculty” of the world’s leading scientists and clinicians who rate the best of the articles they read and explain their importance.

A reviewer said the study, which appeared in the Journal of Biological Chemistry, would open up work in a new area: explaining how resveratrol and rapamycin synergistically achieve their results.

On the Net:

Biofuel Grasslands Better For Birds Than Ethanol Staple Corn

Developing biofuel from native perennials instead of corn in the Midwest’s rolling grasslands would better protect threatened bird populations, Michigan State University research suggests.

Federal mandates and market forces both are expected to promote rising biofuel production, MSU biologist Bruce Robertson says, but the environmental consequences of turning more acreage over to row crops for fuel are a serious concern.

Ethanol in America is chiefly made from corn, but research is focusing on how to cost-effectively process cellulosic sources such as wood, corn stalks and grasses. Perennial grasses promise low cost and energy inputs (planting, fertilizing, watering), and the new study quantifies substantial environmental benefits.

“Native perennial grasses might provide an opportunity to produce biomass in ways that are compatible with the conservation of biodiversity and important ecosystem services such as pest control,” Robertson said. “This work demonstrates that next-generation biofuel crops have potential to provide a new source of habitat for a threatened group of birds.”

With its rich variety of ecosystems, including historic prairie, southern Michigan provided a convenient place to compare bird populations in 20 sites of varying size for each of the three fuel feedstocks. Grassland birds are of special concern, Robertson said, having suffered more dramatic population losses than any other group of North American birds.

In the first such empirical comparison and the first to simultaneously study grassland bird communities across habitat scales, Robertson and colleagues found that bugs and the birds that feed on them thrive more in mixed prairie grasses than in corn. Almost twice as many species made their homes in grasses, while plots of switchgrass, a federally designated model fuel crop, fell between the two in their ability to sustain biodiversity.

The larger the plot of any type, researchers found, the greater the concentration of birds supported. But if grasslands offer conservation and biofuel opportunities, Robertson said, the biodiversity benefits could decrease as biofuel grass feedstocks are bred and cultivated for commercial uniformity.

Robertson was a research associate at MSU’s W.K. Kellogg Biological Station in Kalamazoo County during the two-year research project. Today he is an MSU adjunct entomology professor and a postdoctoral fellow at the Smithsonian Conservation Biology Institute Migratory Bird Center in Washington, D.C. His research colleagues included John A. Hannah Distinguished Professor of plant biology Douglas Schemske and research associate Liz Loomis, both at the Kellogg Biological Station; Patrick Doran of The Nature Conservancy in Lansing; and statistician J. Roy Robertson of Battle Creek.

The research was funded by the U.S. Department of Energy Great Lakes Bioenergy Research Center with support from The Nature Conservancy’s Great Lakes Fund for Partnership in Conservation Science and Economics. Results were recently published in the scientific journal GCB (Global Change Biology) Bioenergy.

On the Net:

Alcoholism Vaccine To Be Tested Soon

A vaccine to combat alcoholism will begin human testing next year, claim Chilean researchers.

The genetic therapy is based on aldehyde dehydrogenase, a group of enzymes that metabolize alcohol and are thus responsible for alcohol tolerance, Professor Juan Asenjo told the AFP news agency. Asenjo heads a team of researchers at Chile’s Faculty of Sciences and Mathematics and the private lab Recalcine.

In October, US researchers announced they had discovered a gene variation known as CYP2E1 that can protect against alcoholism and could lead to a preventative treatment. University of North Carolina researchers at the Chapel Hill School of Medicine reported that the variant is linked to people’s response to alcohol: for the 10 to 20 percent of people who carry it, just a few glasses leave them feeling more drunk than the rest of the population.

This CYP2E1 gene (located in the brain, not the liver) has long been known to encode an enzyme for metabolizing alcohol, and it generates molecules known as free radicals. A specific variant of the gene, however, makes people more sensitive to alcohol, the researchers found.

Drugs created to induce the CYP2E1 gene could eventually make people more sensitive to alcohol or help sober them up if they have had too much, according to that research team.

Professor Asenjo told Radio Cooperativa: “The vaccine would similarly increase unease, nausea and tachycardia (accelerated heart beat). About 20 percent of the Asian population lacks this enzyme and thus experience such a strong reaction that it discourages consumption. With the vaccine, the desire to consume alcohol will be greatly reduced thanks to these reactions.”

Researchers have already successfully tested the vaccine on rats that were dependent on alcohol, getting them to halve their consumption. “The idea is to have 90-95 percent reduction of consumption for humans,” Asenjo continued.

Magnetic Pole Shift Forces Runway Closure At Florida Airport

The shifting of the planet’s northern magnetic pole forced Tampa International Airport to readjust their runways on Thursday, according to a report by Jeremy A. Kaplan of Foxnews.com.

Kaplan reports that the shifting of the Earth’s magnetic fields, spurred by the drifting of the north pole towards Russia, has prompted officials at the Florida airport to shut down their primary runway until January 13. The temporary closure will give them time to change their taxiway signs to account for the magnetic changes, Federal Aviation Administration (FAA) officials told Fox News.

“The poles are generated by movements within the Earth’s inner and outer cores, though the exact process isn’t exactly understood. They’re also constantly in flux, moving a few degrees every year, but the changes are almost never of such a magnitude that runways require adjusting,” Kaplan reported, citing FAA spokesman Paul Takemoto as a source.

The runway’s listing on aviation charts will be changed from 18R/36L (the two ends of the runway, flown at magnetic headings of roughly 180 and 360 degrees) to 19R/1L, according to various media sources.
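
The renumbering itself is simple arithmetic: a runway designator is the magnetic heading of the approach rounded to the nearest 10 degrees and divided by 10, with 0 written as 36. The short sketch below illustrates how a few degrees of magnetic drift can tip the designators from 18/36 to 19/1; the example headings are round illustrative values, not Tampa’s surveyed headings.

    def runway_designator(magnetic_heading_deg):
        """Runway number: magnetic heading rounded to the nearest 10 degrees,
        divided by 10; a result of 0 (i.e. 360 degrees) is written as 36."""
        number = int(magnetic_heading_deg / 10 + 0.5) % 36  # round half up, wrap
        return number if number != 0 else 36

    # Once magnetic drift carries the heading past the rounding boundary,
    # 180/360 reads as 185/5 and the painted numbers have to change.
    for heading in (180, 360, 185, 5):
        print(f"magnetic heading {heading:3d} deg -> runway {runway_designator(heading):02d}")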

When Kaplan asked Takemoto how often these kinds of adjustments were needed at airports, the FAA spokesman told him, “It happens so infrequently that they wouldn’t venture a guess… In fact, you’re the first journalist to ever ask me about it.” He was also quick to point out that passenger safety will not be an issue, but that the changes were needed “to make sure the precision is there that we need.”

According to a Wednesday article in the Tampa Tribune, late this month the airport’s east parallel runway and a seldom-used east-west runway will also be closed so that officials can change signage to reflect their new designations as well.

“The Federal Aviation Administration required the runway designation change to account for what a National Geographic News report described as a gradual shift of the Earth’s magnetic pole at nearly 40 miles a year toward Russia because of magnetic changes in the core of the planet,” the Florida newspaper’s website also said.

On the Net:

Proba-2 Gets Unique View Of Eclipse

ESA’s Proba-2 microsatellite experienced a conjunction of the spheres on Tuesday, as the Sun, Moon and Earth all lined up in front of it.
 
As people on the ground observed the 4 January partial solar eclipse, Proba-2 provided a privileged top-of-atmosphere view, at least briefly.

Shortly after the Moon partially blocked Proba-2’s view of the Sun, the Sun-watching satellite flew into Earth’s shadow. At that point, when the video seen here goes dark, the Sun, Moon, Earth and Proba-2 were all on the same line in space.

“This is a notable event,” said Bogdan Nicula of the Royal Observatory of Belgium (ROB), who calculated where and when this double-eclipse would happen. “It is a nice exercise to model the orbit and relative positions of all three celestial bodies.”

The images making up this video were observed by Proba-2 with its SWAP imager, designed and operated by ROB, which operates at extreme-ultraviolet (EUV) wavelengths to monitor the swirling layer of the solar corona just above the Sun’s surface.

During the eclipse event, SWAP’s view of the Sun and Moon faded as EUV was progressively blocked by Earth’s atmosphere: an EUV sunset. After passing through Earth’s shadow, Proba-2 saw a brightening Sun: an EUV sunrise. At that point of the orbit the Moon was no longer eclipsing the Sun.

“We had to work very hard to get this high-resolution pointing needed for these images,” explained David Berghmans, SWAP’s principal investigator, adding that with the whole of Proba-2 less than a cubic meter in volume, SWAP is only the size of a large shoe box.

“And, as far as I am aware, the Mayans did not predict this alignment should cause concerns!”

The event proved scientifically useful for LYRA, Proba-2’s other Sun-monitoring instrument, normally used to track solar radiation intensity, explained LYRA principal investigator Marie Dominique: “While the EUV sunset-sunrise season blinds SWAP, it allows LYRA to track the amount of solar EUV light passing through Earth’s atmosphere, which helps determine its particle content.”

Proba-2’s eclipse season
 
Proba-2’s orbit is optimized for solar observation, but for part of the winter season it experiences sunsets and sunrises, with Earth starting to obstruct Proba-2’s view of the Sun for a few minutes per orbit.

Because both SWAP and LYRA are observing in particular areas of the EUV spectrum, these instruments experience gradually progressing EUV sunsets (and sunrises), as the light in question is absorbed by lower layers of the terrestrial atmosphere.

The satellite continues to operate well during this eclipse season, and in some cases scientifically useful data can be gathered: by tracking how much EUV light is blocked, for example, LYRA gains insight into atmospheric composition.
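
The physics behind that composition measurement can be sketched with the Beer-Lambert law: the transmitted fraction of EUV light falls off exponentially with the column of absorbing gas along the line of sight, so a measured transmission can be inverted to give a column density. The cross-section value below is an assumed order-of-magnitude placeholder, not a LYRA calibration value.

    import math

    # Beer-Lambert occultation sketch (illustrative, not LYRA's pipeline):
    #   T = exp(-sigma * N)
    # where T is the measured transmission, sigma the absorption cross-section
    # of a species (cm^2) and N the column density along the line of sight
    # (cm^-2). Inverting a measured T yields N.

    def column_density(transmission, sigma_cm2):
        """Invert Beer-Lambert: N = -ln(T) / sigma."""
        return -math.log(transmission) / sigma_cm2

    SIGMA_O2 = 1e-17  # assumed EUV cross-section for molecular oxygen, cm^2

    for T in (0.9, 0.5, 0.1):
        print(f"transmission {T:.1f} -> column density {column_density(T, SIGMA_O2):.2e} cm^-2")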

Proba-3: blotting out the Sun
 
Another mission in ESA’s technology-testing Proba series will manufacture its own artificial solar eclipses.

Scheduled for 2015-16, Proba-3 will comprise two formation flying satellites, with one casting the other into shadow to allow ongoing observation of the faint outer layers of the still-mysterious solar corona.

Image 1: The 4 January 2011 eclipse as seen from ESA’s ESTEC technical center in Noordwijk, the Netherlands, photographed using a 500 mm f/5.6 Maksutov-Cassegrain telescope with a Canon EOS 30D DSLR camera. No filter was used; the Sun was already too dim due to the clouds and thick atmosphere. Exposure time varied with the cloudiness; a typical value was 1/100 s at ISO 200. Photographer Kosmas Gazeas comments: “This is the typical photographic equipment I use for solar and lunar eclipses when I travel around the world, since it provides portable, solid and light observing and recording setup.” Credits: Kosmas Gazeas

Image 2: Proba-2 is flight-testing a total of 17 technology demonstrators for future ESA missions. It also serves as a scientific platform for solar and space weather observations. Credits: ESA/Pierre Carril

On the Net:

FDA To Review New, Altered Tobacco Products

Tobacco companies must submit any products introduced or changed since early 2007 for review by the Food and Drug Administration (FDA), under new guidelines introduced by the US regulatory agency on Wednesday.

Under the new regulations, manufacturers have until March 22 to prove that the cigarettes and other tobacco products that they market are “substantially equivalent” to the goods they offered for sale prior to February 15, 2007, the Associated Press (AP) is reporting.

“That means the ingredients and design are similar and do not raise different public health concerns,” the AP report said. “The FDA said it may deny an application if the product poses an increased health risk to users or causes nonusers to start using tobacco.”

Under the new policy, companies who introduce new products after March 22 must obtain a market order from the FDA before selling their cigarettes, roll-your-own tobacco, and smokeless products, according to various media reports.

“Manufacturers frequently alter ingredients without anyone knowing what they’re consuming,” Lawrence Deyton, Director of the FDA’s Center for Tobacco Products, told Bloomberg’s Molly Peterson and other reporters in a conference call Wednesday. “No longer will changes to products consumed by millions of Americans be made without anyone knowing.”

“For a new product to be a substantial equivalent, it must be the same in terms of ingredients, design, composition, heating source and other characteristics to an existing single predicate product,” added David Ashley, director of the Office of Science at the FDA’s Center for Tobacco Products. “If it has different characteristics, they must not raise different questions of public health.”

In a statement responding to the FDA announcement, Matthew L. Myers, President of the Campaign for Tobacco-Free Kids, said: “We applaud the FDA for quickly and effectively implementing its new authority over tobacco products. The FDA has seized the opportunity presented by the new law to protect our children and reduce the death and disease caused by tobacco use, the nation’s number one cause of preventable death.”

According to Centers for Disease Control and Prevention (CDC) statistics, 46.6 million people, or approximately one out of every five American adults, currently smoke cigarettes. Furthermore, the CDC states that up to 3% of the US adult population uses smokeless tobacco, and tobacco use as a whole is responsible for more than 440,000 deaths annually.


Mass Animal Deaths Leading To End Times Panic

What started with reports of unusual blackbird deaths in the southern United States earlier this week has now snowballed into multiple reports of mass bird and fish deaths from around the globe, prompting some to theorize that they may be signs of the end times.

“When the term ‘dead fish’ became a top Google search Wednesday, soaring past the likes of Lindsay Lohan and leaving Justin Bieber in its scaly wake, it looked as if the end were near,” Jill Rosen of the Los Angeles Times reported on Thursday. “That’s what everyone was saying, anyway.”

“After millions of tiny fish went belly up in Chesapeake Bay this week, much of the populace immediately dismissed the official scientific explanation (the water was just too darn cold),” she added. “What made more sense, they reasoned? The approaching apocalypse. Of course.”

The Chesapeake Bay incident, which according to the L.A. Times saw millions of tiny fish wash up dead in Maryland, is the latest in a series of seemingly inexplicable mass animal deaths being reported worldwide.

As previously reported here on RedOrbit, thousands of red-winged blackbirds were found dead in a small Arkansas town over the weekend, and on Tuesday, 500 additional birds were found dead in Louisiana. Arkansas officials said that testing showed no sign of disease and that the likely cause of death was “acute physical trauma,” probably inflicted when the birds, which have poor night vision, were frightened by New Year’s Eve fireworks and collided with objects in the dark, according to reports.

Then on Wednesday, between 50 and 100 jackdaws were discovered dead on a snow-covered street in the Swedish town of Falkoeping. According to AFP reports, the birds were initially discovered by police around midnight, and five of them were taken by experts from the National Veterinary Institute, who planned to test them for bacterial and viral infections, including swine flu.

Wednesday also saw reports of 200 dead birds discovered on a highway near Tyler, Texas, and thousands of dead fish in Florida. Elsewhere, some 100 tons of sardines, croaker and catfish washed up dead on the coast of Brazil, hundreds more dead fish turned up in New Zealand, hundreds of dead robins and starlings were found in the Kentucky town of Gilbertsville, and an estimated 40,000 dead devil crabs appeared in England, according to Daily Mail reports.

These incidents “are the latest in a spate of incidents which are being blamed on New Year fireworks, thunderstorms, cold weather, parasites and even poisoning,” Daily Mail reporter Wil Longbottom said in a Wednesday evening article, in which he dubbed the phenomenon “Aflockalypse.”

“The internet has been abuzz with conspiracy theories about secret government experiments being behind the deaths, or it being a sign of a looming Armageddon at the end of the Mayan calendar next year.”

Longbottom notes that tests are being carried out on the dead animals, but the results will most likely not be available for several weeks. In the meantime, panicked people around the world have begun wondering whether they could be experiencing the end times.

“George Washington University religion professor Paul Duff, who has studied the Book of Revelation and the apocalypse, didn’t seem particularly alarmed about all this when reached for comment Wednesday,” Rosen said, noting that the professor told her, “There has not been a generation that has not cried, ‘The end is near.'”

“Duff said the disturbing nature of the wildlife deaths, combined with the unanswered questions behind some of them, create the perfect climate for a doomsday plot,” the L.A. Times reporter said. “Even if all the poor birds and rotting fish portend nothing in the end, Duff has little doubt that the apocalyptically inclined will not drop their case.”

“When they expect [doomsday] to come and it doesn’t, they don’t give up that belief,” Duff told Rosen on Thursday. “They’ll just recalculate. And push [the date] forward again.”


Majority Of YouTube CPR Videos Wrong Or Incomplete

A new study suggests that YouTube — a popular site for watching videos of anything from kittens playing to kids singing — is not the most reliable source for learning how to perform CPR (cardiopulmonary resuscitation).

Researchers found that of the 52 CPR instruction videos they located on YouTube, half were uploaded by individuals with no apparent health qualifications.

Of the rest, most were posted either by a private group (not a government agency or medical group with official CPR guidelines) or by people claiming to be certified CPR instructors, doctors or paramedics.

Lead researcher Dr. Karthik Murugiah told Reuters Health in an e-mail that many of the videos gave accurate information on how to perform CPR.

However, many others showed an incorrect or incomplete picture, said Murugiah, an assistant professor at the Medical College of Wisconsin in Milwaukee.

Nearly 65 percent, for example, either incorrectly described the rate of CPR chest compressions or did not cover the detail at all. And 57 percent fell short on showing viewers how deep the chest compressions should be.

The ideal rate is at least 100 compressions per minute, according to the American Heart Association (AHA). Each compression should be about two inches deep in adults and children, and about one and a half inches in infants. It is important to let the chest return to its starting position after each compression, so rescuers should not lean on the chest between compressions.
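
Those AHA figures reduce to simple pacing arithmetic. The short Python sketch below, offered purely as illustration and not as medical guidance, converts the minimum compression rate into a per-compression interval; the variable names and printout are hypothetical.

```python
# Illustrative arithmetic only -- not medical guidance. At the AHA's
# minimum rate of 100 compressions per minute, each compression cycle
# must take no more than 60 / 100 = 0.6 seconds.

MIN_RATE_PER_MINUTE = 100   # AHA minimum compression rate
ADULT_DEPTH_IN = 2.0        # target compression depth for adults and children
INFANT_DEPTH_IN = 1.5       # target compression depth for infants

interval_s = 60 / MIN_RATE_PER_MINUTE
print(f"At {MIN_RATE_PER_MINUTE}/min, start a compression every {interval_s:.1f} s")
print(f"Depth targets: {ADULT_DEPTH_IN} in (adults/children), {INFANT_DEPTH_IN} in (infants)")
```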

Also, only a handful of the videos dealt with “hands-only” CPR, where bystanders skip the traditional mouth-to-mouth breathing and perform chest compressions only. That is important because the AHA and other medical groups now recommend that whenever an adult suddenly collapses and is unresponsive, bystanders perform hands-only CPR — unless they are confident in their ability to do traditional CPR.

“I would say although there is very accurate information out there on YouTube, it is difficult for the lay person to wade through all the content and watch the right videos,” Murugiah said. “And there is a risk of dissemination of incorrect information.”

He said the findings, reported in the journal Resuscitation, suggest that guideline-making groups, like the AHA and Red Cross, should get more CPR information out to online venues such as YouTube.

Of course, content on YouTube changes rather quickly. And since the time of the study, which began in February 2010, the AHA has added a couple of CPR-teaching videos to its YouTube channel (http://www.youtube.com/user/americanheartassoc).

The Red Cross also has a video demonstrating hands-only CPR on its channel (http://www.youtube.com/user/AmRedCross).

Murugiah said that both the AHA and the Red Cross videos are “certainly good, reliable sources of information on CPR.” But even their videos, he added, are somewhat vague on detail and show “room for improvement.”


IVF Pregnancy Prediction App Coming To Smartphones

Researchers from universities in Scotland and England say that they have developed a new, highly accurate assessment for couples hoping to have a child through in-vitro fertilization (IVF).

The experimental model, which was created by Scott Nelson from the University of Glasgow in Scotland and Debbie Lawlor from the University of Bristol in England, is currently available online and will soon be released as a smartphone app as well. The prediction model is also discussed in a paper published this week by the journal Public Library of Science (PLoS) Medicine.

The new IVF prediction model “provides a more accurate and contemporary assessment of likely outcomes after IVF than a previously established model, partly because the new model includes intracytoplasmic sperm injection outcomes,” the PLoS said in a Tuesday press release.

According to an article by Kate Kelland of Reuters, Nelson and Lawlor analyzed more than 144,000 IVF cycles while developing their prediction model, and using those data the tool can reportedly predict the chances of a live birth with 99% accuracy.

“However, before this new prediction model can be used to guide clinical decisions globally and be used to counsel patients outwith the UK, it needs to be validated using independent IVF data,” the PLoS study said. That is why the tool is currently available for use, free of charge, at www.ivfpredict.com and in the forthcoming IVFpredict application for Apple’s iPhone and Android devices.

The formula developed by Nelson and Lawlor “takes into account the woman’s age, number of years trying to get pregnant, whether she is using her own eggs, the cause of infertility, the number of previous IVF cycles and whether she has previously been pregnant or had a baby,” Kelland added in her January 4 article.
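
To make the shape of such a model concrete, here is a minimal Python sketch of a logistic-regression-style predictor over the inputs Kelland lists. To be clear, the coefficients, the feature encoding and the predict_live_birth helper are hypothetical illustrations of the general technique, not the published IVFpredict formula.

```python
import math

# Hypothetical coefficients for illustration only -- NOT the published
# IVFpredict model, which was fitted to more than 144,000 real IVF cycles.
COEFFS = {
    "intercept": -0.5,
    "age_over_35": -0.08,      # per year of maternal age above 35 (assumed)
    "years_trying": -0.05,     # per year spent trying to conceive (assumed)
    "own_eggs": 0.3,           # using the woman's own eggs (assumed)
    "prev_pregnancy": 0.2,     # any previous pregnancy (assumed)
    "prev_ivf_cycles": -0.1,   # per prior unsuccessful IVF cycle (assumed)
}

def predict_live_birth(age, years_trying, own_eggs, prev_pregnancy, prev_cycles):
    """Return an illustrative probability of a live birth for one IVF cycle."""
    z = (COEFFS["intercept"]
         + COEFFS["age_over_35"] * max(0, age - 35)
         + COEFFS["years_trying"] * years_trying
         + COEFFS["own_eggs"] * (1 if own_eggs else 0)
         + COEFFS["prev_pregnancy"] * (1 if prev_pregnancy else 0)
         + COEFFS["prev_ivf_cycles"] * prev_cycles)
    return 1 / (1 + math.exp(-z))  # logistic link maps any score into (0, 1)

# Example: a 38-year-old using her own eggs, 3 years trying, no prior cycles
print(f"Illustrative estimate: {predict_live_birth(38, 3, True, False, 0):.0%}")
```

A real model of this kind derives its accuracy from being fitted to large registry datasets, which is precisely why the authors stress independent validation before it guides clinical decisions.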


Media Portrayal of Oceanic ‘Garbage Patch’ Misleading

There is a lot of plastic trash floating in the Pacific Ocean, but claims that the “Great Garbage Patch” between California and Japan is twice the size of Texas are grossly exaggerated, according to an analysis by an Oregon State University scientist.

Further claims that the oceans are filled with more plastic than plankton and that the patch has been growing tenfold each decade since the 1950s are equally misleading, pointed out Angelicque “Angel” White, an assistant professor of oceanography at Oregon State.

“There is no doubt that the amount of plastic in the world’s oceans is troubling, but this kind of exaggeration undermines the credibility of scientists,” White said. “We have data that allow us to make reasonable estimates; we don’t need the hyperbole. Given the observed concentration of plastic in the North Pacific, it is simply inaccurate to state that plastic outweighs plankton, or that we have observed an exponential increase in plastic.”

White has pored over published literature and participated in one of the few expeditions solely aimed at understanding the abundance of plastic debris and the associated impact of plastic on microbial communities. That expedition was part of research funded by the National Science Foundation through C-MORE, the Center for Microbial Oceanography: Research and Education.

What the studies have shown is that if you look at the actual area of the plastic itself, rather than the entire North Pacific subtropical gyre, the hypothetically “cohesive” plastic patch is actually less than 1 percent of the geographic size of Texas.

“The amount of plastic out there isn’t trivial,” White said. “But using the highest concentrations ever reported by scientists produces a patch that is a small fraction of the state of Texas, not twice the size.”

Another way to look at it, White said, is to compare the amount of plastic found to the amount of water in which it was found. “If we were to filter the surface area of the ocean equivalent to a football field in waters having the highest concentration (of plastic) ever recorded,” she said, “the amount of plastic recovered would not even extend to the 1-inch line.”
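
White’s comparisons can be checked with back-of-the-envelope arithmetic. The Python sketch below reproduces the logic of the Texas comparison; the gyre area and surface-cover fraction used here are loudly flagged assumptions chosen for illustration, not figures from the study.

```python
# Back-of-the-envelope version of the Texas comparison. The two values
# marked "assumed" are illustrative placeholders, not the study's data.

TEXAS_AREA_KM2 = 695_662      # approximate total area of Texas
GYRE_AREA_KM2 = 10_000_000    # assumed: extent of the North Pacific subtropical gyre
COVER_FRACTION = 1e-6         # assumed: fraction of sea surface covered by plastic

# If every floating piece were consolidated into one contiguous sheet:
patch_area_km2 = GYRE_AREA_KM2 * COVER_FRACTION
share_of_texas = patch_area_km2 / TEXAS_AREA_KM2

print(f"Consolidated 'patch': {patch_area_km2:.0f} km^2")
print(f"Share of Texas: {share_of_texas:.6%}")  # orders of magnitude below 1 percent
```

The methodological point survives any particular choice of numbers: the “twice the size of Texas” claim measures the whole gyre, while White measures the area of the plastic itself.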

Recent research by scientists at the Woods Hole Oceanographic Institution found that the amount of plastic, at least in the Atlantic Ocean, hasn’t increased since the mid-1980s, despite greater production and consumption of materials made from plastic, she pointed out.

“Are we doing a better job of preventing plastics from getting into the ocean?” White said. “Is more plastic sinking out of the surface waters? Or is it being more efficiently broken down? We just don’t know. But the data on hand simply do not suggest that ‘plastic patches’ have increased in size. This is certainly an unexpected conclusion, but it may in part reflect the high spatial and temporal variability of plastic concentrations in the ocean and the limited number of samples that have been collected.”

The hyperbole about plastic patches that saturates the media rankles White, who says such exaggeration can drive a wedge between the public and the scientific community. One recent claim, that the garbage patch is as deep as the Golden Gate Bridge is tall, is completely unfounded, she said.

“Most plastics either sink or float,” White pointed out. “Plastic isn’t likely to be evenly distributed through the top 100 feet of the water column.”

White says there is growing interest in removing plastic from the ocean, but such efforts would be costly and inefficient, and may have unforeseen consequences. It would be difficult, for example, to “corral” and remove plastic particles from ocean waters without inadvertently removing phytoplankton, zooplankton, and small surface-dwelling aquatic creatures.

“These small organisms are the heartbeat of the ocean,” she said. “They are the foundation of healthy ocean food chains and immensely more abundant than plastic debris.”

The relationship between microbes and plastic is what drew White and her C-MORE colleagues to their analysis in the first place. During a recent expedition, they discovered that photosynthetic microbes were thriving on many plastic particles, in essence confirming that plastic is prime real estate for certain microbes.

White also noted that while plastic may be beneficial to some organisms, it can also be toxic. Specifically, it is well known that plastic debris can absorb toxins such as PCBs.

“On one hand, these plastics may help remove toxins from the water,” she said. “On the other hand, these same toxin-laden particles may be ingested by fish and seabirds. Plastic clearly does not belong in the ocean.”

Among other findings, which White believes should be part of the public dialogue on ocean trash:

* Calculations show that the amount of energy it would take to remove plastics from the ocean is roughly 250 times the mass of the plastic itself;

* Plastic also covers the ocean floor, particularly offshore of large population centers. A recent survey from the state of California found that 3 percent of the southern California Bight’s ocean floor was covered with plastic, roughly half the amount of ocean floor covered by lost fishing gear in the same location. But little, overall, is known about how much plastic has accumulated at the bottom of the ocean, and how far offshore this debris field extends;

* It is a common misperception that you can see or quantify plastic from space. There are no tropical plastic islands out there and, in fact, most of the plastic isn’t even visible from the deck of a boat;

* There are areas of the ocean largely unpolluted by plastic. A recent trawl White conducted in a remote section of water between Easter Island and Chile pulled in no plastic at all.

There are other issues with plastic, White said, including the possibility that floating debris may act as a vector for introducing invasive species into sensitive habitats.

“If there is a takeaway message, it’s that we should consider it good news that the ‘garbage patch’ doesn’t seem to be as bad as advertised,” White said, “but since it would be prohibitively costly to remove the plastic, we need to focus our efforts on preventing more trash from fouling our oceans in the first place.”

Image Caption: Larger plastic pieces can harbor microbes, both beneficial and harmful, scientists have discovered. (photo courtesy of C-MORE project)
