Dinosaurs Had Night Vision For Stalking Prey

A new study by researchers at the University of California, Davis finds that velociraptors had night vision that helped them stalk prey after dark.

The study reverses conventional wisdom that dinosaurs were active by day while early mammals scurried around at night, said Ryosuke Motani, professor of geology at UC Davis and co-author of the report.

“It was a surprise, but it makes sense,” he said.

Plant-eating dinosaurs also had some limited night vision, likely to satisfy their round-the-clock appetites, while flying dinosaurs, like birds, were active only during the day, the researchers said.

The study also sheds light on how ecology influences the evolution of animal shape and form over tens of millions of years, said Motani and collaborator Lars Schmitz, a postdoctoral researcher in the Department of Evolution and Ecology at UC Davis.

Motani and Schmitz inferred the dinosaurs’ daily habits by studying their eyes.

Dinosaurs, lizards and birds all have a bony ring known as a “scleral ring” in their eyes, a feature lacking in mammals and crocodiles. Schmitz and Motani measured the inner and outer dimensions of this ring, plus the size of the eye socket, in 33 fossils of dinosaurs, ancestral birds and pterosaurs. They also obtained the same measurements in 164 living species.

Day-active, or diurnal, animals have a small opening in the middle of the ring. In nocturnal animals, the opening is much larger, while cathemeral animals, those active both day and night, tend to be in between.

The size of these features is affected by a species’ environment as well as by ancestry.  For example, two closely related animals might have a similar eye shape, something controlled by ancestry, even though one is active by day and the other by night.

By analyzing 164 living species, the researchers were able to confirm that eye measurements are quite accurate in predicting whether animals are active by day, by night or around the clock.
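
For illustration, the prediction step can be sketched in a few lines of code. The ratio, cutoffs and measurements below are hypothetical placeholders rather than values from the study, and Schmitz and Motani's actual analysis used statistical methods that also correct for shared ancestry:

```python
# Illustrative only: guess a diel activity pattern from eye geometry.
# The real study fitted a formal classification model to 164 living
# species; the ratio and cutoffs here are invented placeholders.

def classify_activity(ring_inner_diameter_mm, orbit_length_mm,
                      nocturnal_cutoff=0.55, diurnal_cutoff=0.45):
    """Classify activity from the relative size of the scleral ring's
    central opening compared with the eye socket."""
    ratio = ring_inner_diameter_mm / orbit_length_mm
    if ratio >= nocturnal_cutoff:   # large opening gathers more light
        return "nocturnal"
    if ratio <= diurnal_cutoff:     # small opening suits daylight
        return "diurnal"
    return "cathemeral"             # in between: active day and night

# Hypothetical measurements for a small predatory dinosaur's fossil:
print(classify_activity(ring_inner_diameter_mm=14.0, orbit_length_mm=22.0))
```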

Next, the scientists applied the technique to fossils from plant-eating and carnivorous dinosaurs, flying reptiles called pterosaurs, and ancestral birds.

The measurements revealed that the large, plant-eating dinosaurs were active day and night, likely because they had to eat most of the time, except for the warmest hours of the day when they needed to avoid overheating.

Modern megaherbivores like elephants show the same activity pattern, Motani said.

Velociraptors and other small carnivores were night hunters, the study revealed. 

The researchers were not able to study large carnivores such as Tyrannosaurus rex, because there are no fossils with sufficiently well-preserved scleral rings.

Flying creatures, including early birds and pterosaurs, were mostly day-active, although some of the pterosaurs were apparently night-active.

The ability to separate out the effects of ancestry gives researchers a new tool to understand how animals lived in their environment and how changes in the environment influenced their evolution over millions of years, Motani said.

The study was published online April 14 in the journal Science.

Image 1: Close-up of the eye socket and ring of the dinosaur Protoceratops, active by day and night. Credit: Ryosuke Motani and Lars Schmitz

Image 2: The small carnivorous dinosaur Juravenator starki was nocturnal. Credit: Ryosuke Motani and Lars Schmitz

Image 3: The pterosaur Scaphognathus crassirostris was a day-active archosaur, evidenced by its eye. Credit: Ryosuke Motani and Lars Schmitz

Mapping Of Maya’s ‘Holtun’ Site In Central Lakes Region Of Guatemala Locates Triadic Pyramid, Astronomical Observatory, Ritual Ball Court, Residential Mounds, Plazas

Archaeologists have made the first three-dimensional topographical map of ancient monumental buildings long buried under centuries of jungle at the Maya site “Head of Stone” in Guatemala.

The map puts into 3-D perspective the location and size of Head of Stone’s many buildings and architectural patterns, which are typical of Maya sites: a 70-foot-tall “triadic pyramid,” an astronomical observatory, a ritual ball court, numerous plazas and residential mounds that would have been the homes of elites and commoners, according to archaeologist Brigitte Kovacevich of Southern Methodist University in Dallas.

The map situates the primary buildings relative to one another and also places them within the context of the site’s hills and valleys in the Central Lakes agricultural region of north-central Guatemala.

The buildings date from 800 B.C. to 900 A.D., says Kovacevich, an expert in Meso-American cultures and co-leader of an international scientific team that has been granted permission by the Guatemalan government to work the site, which has never before been excavated.

Movement to understand early periods, how kingship developed

Known for its far-reaching state-level government, Maya civilization during the “Classic” period, from 200 A.D. to 900 A.D., consisted of huge monumental cities of tens of thousands of people, ruled by powerful kings and marked by palaces, pyramidal temples and complex political and economic alliances, Kovacevich says.

The ancient culture at its peak during the Classic period has been well-documented by archaeologists studying the civilization’s large urban centers, such as Tikal, which was one of the most powerful and long-lasting of the Maya kingdoms.

In contrast, “Head of Stone,” called “Holtun” in Maya, is a modest site from the “Pre-Classic” period, 600 B.C. to 250 A.D., she says. The small city had no more than 2,000 people at its peak. Situated about 35 kilometers south of Tikal, “Head of Stone” in its heyday preceded the celebrated vast city-states and kingship culture for which the Maya are known.

By excavating a small city, Kovacevich says, the archaeologists hope to understand early Maya trade routes and alliances, the importance of ritual for developing political power, how political power emerged, and how kingship lines evolved and solidified.

“There is a movement toward a greater understanding of these early periods, with smaller sites and common people,” says Kovacevich, an assistant professor in SMU’s Anthropology Department. “Little is known about how kingship developed, how individuals grabbed political power within the society, how the state-level society evolved and why it then was followed by a mini-collapse between 100 A.D. and 250 A.D.”

Kovacevich presented “‘Head of Stone’: Archaeological Investigation at the Maya Site of Holtun, Guatemala” during the 76th annual meeting of the Society for American Archaeology in Sacramento, Calif., March 30 to April 3.

Besides Kovacevich, archaeologists on the team and co-authors of the paper are Michael G. Callaghan, University of Texas at Arlington; Patricia R. Castillo, Universidad San Carlos, Guatemala; and Rodrigo Guzman, Universidad del Valle, Guatemala. The 3-D topographic map expands surveys from 1995 and 2002 by Guatemalan archaeologist Vilma Fialko and Guatemala’s Institute of Anthropology and History, which were documented by Fialko and archaeologist Erick M. Ponciano.

Situated in a patch of rainforest on defensible escarpment

Head of Stone today sits in a patch of rainforest surrounded by cow pastures and cornfields on a limestone escarpment, which would have made it highly defensible, Kovacevich says.

Holtun’s structures, more than 100 of them, now are overgrown with a thin layer of centuries-old jungle foliage and soil. The site is about one kilometer long and half a kilometer wide, or about three-fifths of a mile long and one-third of a mile wide. The large mounds protruding here and there from the jungle floor signal to archaeologists the familiar building arrangements customary at a Maya site, Kovacevich says.

As with most Maya sites, looters have tunneled into many of the important structures. Kovacevich and her colleagues will dig more tunnels to further explore the buildings with the help of Guatemalan experts skilled at working Maya sites.

Key structures: “E Group,” residential group

The 3-D mapping has confirmed an “E Group,” a key Maya architectural structure. Holtun’s “E Group” dates from 600 B.C. to 600 A.D. and consists of stair-step pyramids and elongated buildings that likely served as astronomical observatories central to Maya rituals. A stepped pyramid to the west of a long narrow building directly oriented north-south served as the observational structure and was related to veneration of sacred ancestors, Kovacevich says.

“From the observational structure you can see the sun rising at the different solstices throughout the year, which is very important agriculturally, to know the timing of the seasons and when to plant and when to harvest,” she says. “So the people creating this are harnessing that knowledge to show their followers and constituents that they possibly are even controlling the change of seasons.”

Adjacent to the “E Group” are four structures that face one another around a central patio. The pattern usually indicates a residential group, where cooking and food processing were carried out on the patio, Kovacevich says.

“The closeness of the residential structure to the ‘E Group’ suggests these were very early elites, and possibly kings,” she says. “Kingship was just being established during this period.”

The Maya often left offerings to their ancestors, such as jade or ceramics, at the base of structures.

Triadic pyramid represents Maya mythology?

Besides the “E Group,” a triadic pyramid dating from 300 B.C. to 300 A.D. sits at the north end of the site. As is typical at Maya sites, three pyramids about 10 feet tall sit atop a high platform that rises about 60 feet from the jungle floor, Kovacevich says. One of the pyramids faces south, flanked on either side by the other two, which face inward around a central patio. The platform sits atop, and obscures, an earlier sub-structure platform, buried underground and decorated with monumental masks that are visible from the looters’ tunnels.

“Some archaeologists argue that this configuration represents elements of Maya mythology: the three hearthstones of creation that were set down by the gods to create the first home and hearth, thereby civilizing humanity,” Kovacevich says. “Re-creation of that by the people at Holtun would show piousness and connection to ancestors.”

During the Classic period, kings were typically buried in Maya pyramids. During the Pre-Classic period, however, they were typically buried in their residences. It’s possible an early king of Holtun was buried in one of the residential structures, Kovacevich says.

“Ancestors are buried beneath the floor and kept very close and venerated,” she says. “The more ancestors a residence has, the more times the family redoes their floor, making a new floor, and so their mound gets higher and higher. A person with more ties, more ancestors, has more status.”

Another familiar structure is a ball court, signified by two long mounds that are exactly parallel, said Kovacevich.

“Those are the two sides of the ball court, and the ball would have been bounced in the center off of the sides,” she said. “Almost all Maya sites had a ball court.”

Algae Could Replace 17% Of US Oil Imports

Choosing optimal growing locations limits algal biofuel’s water use

High oil prices and environmental and economic security concerns have triggered interest in using algae-derived oils as an alternative to fossil fuels. But growing algae, or any other biofuel source, can require a lot of water.

However, a new study shows that being smart about where we grow algae can drastically reduce how much water is needed for algal biofuel. Growing algae for biofuel, while being water-wise, could also help meet congressionally mandated renewable fuel targets by replacing 17 percent of the nation’s imported oil for transportation, according to a paper published in the journal Water Resources Research.

Researchers at the Department of Energy’s Pacific Northwest National Laboratory found that water use is much less if algae are grown in the U.S. regions that have the sunniest and most humid climates: the Gulf Coast, the Southeastern Seaboard and the Great Lakes.

“Algae has been a hot topic of biofuel discussions recently, but no one has taken such a detailed look at how much America could make, and how much water and land it would require, until now,” said Mark Wigmosta, lead author and a PNNL hydrologist. “This research provides the groundwork and initial estimates needed to better inform renewable energy decisions.”

Algal biofuel can be made by extracting and refining the oils, called lipids, that algae produce as they grow. Policy makers and researchers are interested in developing biofuels because they can create fewer overall greenhouse gas emissions than fossil fuels. And biofuels can be made here in the United States. In 2009, slightly more than half of the petroleum consumed by the U.S. was from foreign oil.

Wigmosta and his co-authors provide the first in-depth assessment of America’s algal biofuel potential given available land and water. The study also estimated how much water would need to be replaced due to evaporation over 30 years. The team analyzed previously published data to determine how much algae can be grown in open, outdoor ponds of fresh water while using current technologies. Algae can also be grown in salt water and covered ponds. But the authors focused on open, freshwater ponds as a benchmark for this study. Much of today’s commercial algae production is done in open ponds.

Crunching the numbers

First, the scientists developed a comprehensive national geographic information system database that evaluated topography, population, land use and other information about the contiguous United States. That database contained information spaced every 100 feet throughout the U.S., a much more detailed view than in previous research. This data allowed them to identify available areas better suited for algae growth, such as flat land that isn’t used for farming and isn’t near cities or environmentally sensitive areas like wetlands or national parks.

Next, the researchers gathered 30 years of meteorological information. That helped them determine the amount of sunlight that algae could realistically photosynthesize and how warm the ponds would become. Combined with a mathematical model on how much typical algae could grow under those specific conditions, the weather data allowed Wigmosta and team to calculate the amount of algae that could realistically be produced hourly at each specific site.
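
As a rough illustration of the kind of per-site calculation described, the sketch below turns sunlight and pond temperature into an estimated oil yield. Every coefficient is an invented placeholder; the actual PNNL model used 30 years of hourly weather data and a detailed pond-temperature simulation:

```python
# Toy version of the site-screening calculation described above.
# All coefficients are invented for illustration only.

PHOTOSYNTHETIC_EFFICIENCY = 0.03   # fraction of sunlight fixed as biomass (assumed)
ENERGY_PER_KG_BIOMASS = 21.0e6     # J/kg, rough energy content of algal biomass
LIPID_FRACTION = 0.25              # fraction of biomass recoverable as oil (assumed)

def daily_oil_yield_kg_per_m2(solar_mj_per_m2, pond_temp_c):
    """Estimate oil produced per square meter of pond per day."""
    if not 10.0 <= pond_temp_c <= 35.0:   # outside growth range: no growth
        return 0.0
    solar_j = solar_mj_per_m2 * 1e6
    biomass_kg = PHOTOSYNTHETIC_EFFICIENCY * solar_j / ENERGY_PER_KG_BIOMASS
    return biomass_kg * LIPID_FRACTION

# A sunny, warm Gulf Coast day: ~20 MJ/m2 of sunlight, 28 C pond.
print(daily_oil_yield_kg_per_m2(20.0, 28.0))   # ~0.007 kg of oil per m2
```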

Water for oil

The researchers found that 21 billion gallons of algal oil, equal to the 2022 advanced biofuels goal set out by the Energy Independence and Security Act, can be produced with American-grown algae. That’s 17 percent of the petroleum that the U.S. imported in 2008 for transportation fuels, and it could be grown on land roughly the size of South Carolina. But the authors also found that 350 gallons of water per gallon of oil, a quarter of what the country currently uses for irrigated agriculture, would be needed to produce that much algal biofuel.

The study also showed that up to 48 percent of the current transportation oil imports could be replaced with algae, though that higher production level would require significantly more water and land. So the authors focused their research on the U.S. regions that would use less water to grow algae, those with the nation’s sunniest and most humid climates.

But the authors also found that algae’s water use isn’t that different from most other biofuel sources. Considering the gas efficiency of a standard light-utility vehicle, they estimated growing algae uses anywhere between 8.6 and 50.2 gallons of water per mile driven on algal biofuel. In comparison, data from previously published research indicated that corn ethanol can be made with less water, but showed a larger usage range: between 0.6 and 61.9 gallons of water per mile driven. Several factors, including the differing water needs of specific growing regions and the different assumptions and methods used by various researchers, cause the estimates to range greatly, they found.

Because conventional petroleum gas doesn’t need to be grown like algae or corn, it doesn’t need as much water. Previously published data indicated conventional gas uses between about 0.09 and 0.3 gallons of water per mile.
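
The per-mile figures above follow from a simple division: the water needed to grow a gallon of fuel, divided by the miles a vehicle drives on that gallon. A back-of-the-envelope check, assuming (hypothetically) 20 miles per gallon and a one-to-one conversion of algal oil into fuel:

```python
# Back-of-the-envelope check on the water-per-mile figures quoted above.
# Assumptions (not from the paper): 20 mpg fuel economy, and one gallon
# of algal oil refines into one gallon of fuel.

water_per_gallon_oil = 350.0   # gallons of water per gallon of algal oil (study)
mpg = 20.0                     # assumed light-utility vehicle fuel economy

water_per_mile = water_per_gallon_oil / mpg
print(f"{water_per_mile:.1f} gallons of water per mile")  # 17.5, inside 8.6-50.2
```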

More to consider

Looking beyond freshwater, the authors noted algae has several advantages over other biofuel sources. For example, algae can produce more than 80 times more oil than corn per hectare a year. And unlike corn and soybeans, algae aren’t a widespread food source that many people depend on for nutrition. As carbon dioxide-consuming organisms, algae are considered a carbon-neutral energy source. Algae can feed off carbon emissions from power plants, delaying the emissions’ entry into the atmosphere. Algae also digest nitrogen and phosphorous, which are common water pollutants. That means algae can also grow in, and clean, municipal waste water.

“Water is an important consideration when choosing a biofuel source,” Wigmosta said. “And so are many other factors. Algae could be part of the solution to the nation’s energy puzzle, if we’re smart about where we place growth ponds and the technical challenges to achieving commercial-scale algal biofuel production are met.”

Next up for Wigmosta and his colleagues is to examine non-freshwater sources like salt water and waste water. They are also researching greenhouse ponds for use in colder climates, as well as economic considerations for algal biofuel production.

The paper describes research funded by DOE’s Office of Energy Efficiency and Renewable Energy.

REFERENCE: Mark S. Wigmosta, Andre M. Coleman, Richard J. Skaggs, Michael H. Huesemann, Leonard J. Lane. National Microalgae Biofuel Production Potential and Resource Demand. Water Resources Research. Published online April 13, 2011. http://www.agu.org/journals/wr/wr1104/2010WR009966/. DOI: 10.1029/2010WR009966

Image Caption: A new PNNL study shows that 17 percent of the United States’ imported oil for transportation could be replaced by biofuel made from algae grown in outdoor raceway ponds located in the Gulf Coast, the Southeastern Seaboard and the Great Lakes. This June 2010 photo shows raceway ponds in Southern California and was taken by the QuickBird satellite. (PNNL)

Insulin: Predictor for Alzheimer’s?

Fernanda Barros, Ivanhoe Health Correspondent

(Ivanhoe Newswire) — Could Alzheimer’s be a form of diabetes? Brain levels of insulin and its related cellular receptors fall during the early stages of Alzheimer’s, and as insulin levels continue to drop, the disease becomes more severe. Now, doctors are looking at memory problems like Alzheimer’s disease as a form of brain starvation, and one doctor says glucose metabolism can be the key to helping prevent this deadly disease.

Alzheimer’s disease is the most common form of dementia. Most often, it is diagnosed in people over 65, although the less-prevalent, early-onset Alzheimer’s can occur much earlier. In 2006, there were over 26 million sufferers worldwide. Alzheimer’s is predicted to affect 1 in 85 people globally by 2050. A recent study showed the inability of the brain to properly use glucose might be a key factor in the development of the disease.

“Type 1 and 2 diabetes are diabetes of the body, which means the body can’t handle sugar properly. Type 3 diabetes means the brain can’t handle sugar properly,” Larry McCleary, M.D., a neurosurgeon and author of Feed Your Brain, Lose Your Belly, told Ivanhoe.

Dr. McCleary says diabetics have four times the risk of developing Alzheimer’s, and those with prediabetes have triple the risk. Insulin and its related protein, insulin-like growth factor-I, lose the ability to bind to cell receptors. This creates a resistance to the insulin growth factors, causing the cells to malfunction and die.

“If you can’t handle your primary fuel source, then you can’t generate energy, and you lose function, and that’s pretty much what happens in Alzheimer’s disease,” Dr. McCleary explained. “Changes in brain glucose metabolism can occur in people who have no symptoms. Their brains are functioning normally in their 20’s and 30’s, but yet if you do scan, you can see subtle changes in glucose metabolism in the brain and not just anywhere in the brain. They are actually in the regions where Alzheimer’s disease develops when you’re 65 or 75 years old.”

He says to prevent diabetes of the brain and the body, it’s important to make lifestyle changes that feed the brain while maintaining stable blood sugar and insulin levels.

“If your brain is functioning normally, but it’s starting not to work normally, that’s the time to start thinking about doing something about it,” Dr. McCleary said. “If you lose weight, you can get the glucose metabolism back to normal. If you can do that before you injure brain cells permanently, I predict that you should be able to reverse the changes in your brain.”

He says people with a family history of Alzheimer’s disease, or those who have had a head injury that leads to memory loss, should get tested with a simple glucose tolerance test once in a while.

“If your insulin glucose improves, then probably your brain health will improve as well, but it’s better to do it early on even if everything is still functioning than waiting until the nerve cells are starting to die because once they do, they don’t get replaced,” Dr. McCleary said.

Dr. McCleary estimates that if insulin resistance could be minimized by making proper food choices, 40 percent of Alzheimer’s disease cases could be prevented.

SOURCE: Interview with Dr. Larry McCleary, 19th Annual World Congress on Anti-Aging and Aesthetic Medicine, held in Orlando, FL, April 7-9, 2011

The World’s Smallest Wedding Rings

Two interlocking rings of DNA are visible only through a scanning force microscope

Creating artificial structures from DNA is the objective of DNA nanotechnology. This new discipline, which combines biology, physics, chemistry and materials science, makes use of natural DNA strands’ capacity for self-assembly. Smileys and small boxes measuring only tens of nanometers (billionths of a meter) have been created from DNA in a drop of water. Prof Alexander Heckel and his doctoral student Thorsten Schmidt from the “Cluster of Excellence for Macromolecular Complexes” at Goethe University were able to create two rings of DNA only 18 nanometers in size, and to interlock them like two links in a chain. Such a structure is called a catenane, a term derived from the Latin word catena (chain). Schmidt, who got married during the time he was working on the nano-rings, believes that they are probably the world’s smallest wedding rings.

From a scientific perspective, the structure is a milestone in the field of DNA nanotechnology: unlike the majority of the DNA nanoarchitectures realized so far, the two rings of the catenane are not fixed formations but, depending on the environmental conditions, freely pivotable. They are therefore suitable as components of molecular machines or of a molecular motor. “We still have a long way to go before DNA structures such as the catenane can be used in everyday items,” says Prof Alexander Heckel, “but structures of DNA can, in the near future, be used to arrange and study proteins or other molecules that are too small for a direct manipulation, by means of auto-organization.” This way, DNA nano-architectures could become a versatile tool for the nanometer world, to which access is difficult.

In the manufacture of DNA nano-architecture, the scientists take advantage of the pairing rules of the four DNA nucleobases, according to which two natural DNA strands find each other (in DNA nano-architecture, the base order is without biological significance). An A on one strand pairs with T on the other strand, and C is complementary to G. The trick is to design the sequences of the DNA strands involved so that the desired structure builds up on its own, without direct intervention on the experimenter’s part. If only certain parts of the strands used complement each other, branches and junctions can be created.
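
The pairing rule itself is simple enough to express in code. This minimal sketch, which is not the authors' design software and uses a made-up example sequence, builds the strand that will pair with a given sequence and checks complementarity:

```python
# Minimal illustration of the Watson-Crick pairing rules used to design
# self-assembling DNA strands. Real design tools also model folding
# energetics; this only checks sequence complementarity.

PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(strand):
    """Return the sequence that pairs with `strand` (read 5'->3')."""
    return "".join(PAIR[base] for base in reversed(strand))

def hybridizes(segment_a, segment_b):
    """True if the two segments can pair with each other."""
    return segment_b == reverse_complement(segment_a)

sticky_end = "ATGGC"                          # hypothetical example sequence
partner = reverse_complement(sticky_end)      # "GCCAT"
print(partner, hybridizes(sticky_end, partner))   # GCCAT True
```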

As reported by Schmidt and Heckel in the journal “Nano Letters” (online advance publication, dx.doi.org/10.1021/nl200303m), they first created two C-shaped DNA fragments for the catenane. With the help of special molecules that act as a sequence-specific glue for the double helix, they arranged the “Cs” in such a way as to create two junctions, with the open ends of the “Cs” pointing away from each other (see images). The catenane was created by adding two strands that attach to the still-open ends of the two ring fragments. Thorsten Schmidt dedicated the publication to his wife, Dr Diana Gonçalves Schmidt, who also appreciates the work on a scientific level, since she was also a part of Alexander Heckel’s work group.

Since they are much smaller than the wavelengths of visible light, the rings cannot be seen with a standard microscope. “You would have to string together about 4000 such rings to even achieve the diameter of a human hair,” says Thorsten Schmidt. He therefore images the catenanes with a scanning force microscope, which uses an extremely fine tip to scan the rings placed on a surface.

Image Caption: The world’s smallest wedding rings are built up by two interlocked DNA-strands. Credit: Alexander Heckel

Acne Antibiotic Use Not Linked To Bacterial Resistance

A new study suggests that people who use certain prescribed antibiotics to treat acne are unlikely to develop bacterial resistance to those drugs, even when they use the antibiotics for months at a time.

Researchers were surprised by the findings, given what is known about bacteria’s ability to adapt to common antibiotics and become immune to their effects. The findings are also notable because the bacterium the researchers studied, Staphylococcus aureus, is the species that gives rise to MRSA.

MRSA is an infection resistant to multiple antibiotics; it is extremely difficult to treat and more dangerous than non-resistant infections, the researchers note in the journal Archives of Dermatology.

“A lot of the work that we’ve done over the years has often shown problems with long-term antibiotic use,” Dr. David Margolis, a co-author of the study from the University of Pennsylvania School of Medicine in Philadelphia, told Reuters Health.

Tetracycline, one of the most common antibiotics for acne treatment, has been around for a long time and “it’s interesting to notice that Staphylococcus aureus hasn’t become overtly resistant to it,” said Margolis.

He and his colleagues took nose and throat swabs from 83 acne patients to search for signs of staph bacteria. Nearly half of the patients in the study had been treated with antibiotics, some for up to a year.

A total of 36 of the 83 patients in the study were colonized with S. aureus. Two of the 36 patients had MRSA; 20 had S. aureus solely in their throats; nine had S. aureus solely in their noses; and seven had S. aureus in both their noses and throats, the authors reported.

“Long-term use of antibiotics decreased the prevalence of S. aureus colonization by nearly 70 percent,” the authors report. “A decreased rate of colonization was noted with the use of both oral and topical antibiotics.”

Patients who took antibiotics to treat their acne were less likely to have the bacteria, perhaps because the drugs were killing nose and throat bacteria in addition to the bacteria that cause acne, said Margolis.

The researchers noted that only about 10 percent of all staph bacteria sampled were resistant to tetracycline antibiotics. Using the antibiotics to treat acne didn’t increase the chance that a patient would have staph bacteria that tested positive for tetracycline resistance.

While patients most often use antibiotics only for a week or so to fight off an infection, acne patients may be on antibiotics for months or even years, Margolis said.

This raises concern because these patients may be more likely to harbor bacteria that are resistant to tetracycline. It could become a major problem if the normally harmless staph bacteria develop into a more serious infection or if those patients spread the resistant bacteria to others.

But this research adds to data suggesting that the odds of acne patients getting a serious, drug-resistant infection are low, noted Dr. Guy Webster, a dermatologist at Jefferson Medical College in Philadelphia.

“A lot of the public panic about treating acne with antibiotics is unwarranted, at least as far as Staph aureus and resistance goes,” Webster told Reuters Health.

Webster, who did not participate in the current study, said it “affirms what we kind of already know.” Dermatologists have long used tetracyclines in acne patients and yet they are still some of the few drugs that treat MRSA successfully, he added.

Webster said tetracycline resistance is still a possibility in acne patients who use antibiotics long-term. And there are other negative effects of tetracycline treatment — including the likelihood of yeast infections in women using the drugs.

The new study results contradict the current belief about long-term use of antibiotics.

“Specifically, in our study, the prolonged use of antibiotics from the tetracycline class that are commonly used to treat acne lowered the prevalence of colonization by S. aureus and did not increase resistance to the tetracycline antibiotics,” the authors explained.

“Future research should be conducted with respect to other organisms and antibiotics,” Margolis added.

But for now, when doctors are considering treatment for acne, the risks of these antibiotics seem to be smaller than the benefits, said Webster.

Staphylococcus aureus is found both in hospitals and the community. “While S. aureus colonizes the skin, it can also be responsible for localized skin infections and life-threatening systemic infections,” the authors wrote. “At one time, it was sensitive to many antibiotics and antimicrobial agents. However, because of its ability to adapt to these therapies and become resistant, clinical scenarios now exist in which few therapeutic options remain to treat this organism. Therefore, methicillin-resistant S. aureus (MRSA) has become commonplace.”

The report of the new study was published online Monday and will appear in the August print edition of Archives of Dermatology.

UCSF Analysis Shows Newer Surgery For Neck Pain May Be Better

A new surgery for cervical disc disease in the neck may restore range of motion and reduce repeat surgeries in some younger patients, according to a team of neurosurgeons from the University of California, San Francisco (UCSF) and several other medical centers that analyzed three large, randomized clinical trials comparing two different surgeries.

More than 200,000 Americans undergo surgery every year to alleviate pain and muscle weakness from the debilitating condition caused by herniated discs in the neck. For some, the team found, arthroplasty may work better.

The results do not suggest that the older surgery is ineffective or unsafe, but that arthroplasty is a viable option for some.

“For people younger than 50 who have cervical disc disease, arthroplasty is a good option,” said Praveen Mummaneni, MD, of the Department of Neurosurgery at UCSF.

Mummaneni and his colleagues are presenting their analysis today at the 79th Annual Scientific Meeting of the American Association of Neurological Surgeons in Denver.

Why Fewer Is Better

Neck surgery is not cheap: it requires the patient to be placed under general anesthesia and a surgical team to perform the operation in a sterile room. Such operations are typically reserved for patients who have failed to respond to other measures, such as physical therapy or drugs like steroids.

For decades, the standard of care in this country was a procedure called anterior cervical discectomy and fusion. In this surgery, a surgeon cuts through the front of the neck, accessing the spine and removing the herniated disc, then replacing it with a piece of bone and a plate in the neck. That creates a solid union, or fusion, between two or more vertebrae to strengthen the spine.

Arthroplasty also begins with a surgery to remove the herniated disc. But instead of fusing the spine, the surgeon replaces the missing disc with an artificial one made of steel, plastic or titanium. The idea is that the artificial disc will provide more spine mobility after surgery and less stress on adjacent discs.

While arthroplasty has become more widely used in the United States since the U.S. Food & Drug Administration approved several models of artificial discs in the last few years, it is still performed less often than in Europe, where the procedure has been available for more than a decade.

Here in the United States, the older surgical fusion technique remains more common, in part because not all insurance companies pay for the newer procedure, as is the case in California.

Both techniques have occasional failures. In the fusion surgery, the bone may not heal, requiring further fusion surgery months or years later. In arthroplasty, the artificial disc may loosen or fit poorly and may need to be replaced.

“Surprising” Results

The new analysis looked at three randomized clinical trials that enrolled 1,213 patients with cervical disc disease at medical centers across the United States, including UCSF.

In the trials, 621 patients received an artificial cervical disc and 592 patients were treated with spinal fusion. The analysis looked at outcomes two years after surgery.

The results were surprising, Mummaneni said: “While the two-year surgical results for both techniques were excellent, the rate of repeat surgery is lower for arthroplasty than for fusion at the two-year timepoint.”

Childless Women Eat Healthier Than Mothers

A study found that mothers of young children were heavier and ate more calories, sugary drinks and fatty foods than childless women.

The study authors said that parents who choose quick and easy prepared foods may end up serving them to their children, perpetuating a cycle of unhealthy eating.

“This isn’t a study about blame,” co-author Jerica Berge, a University of Minnesota researcher, told the Associated Press (AP). “This is about identifying … a very high-risk time period” for parents that doctors should be aware of so they can offer solutions, she said.

The researchers said that may include diet advice, parent-child exercise classes, or just getting parents to take walks with their children.

The study questioned 1,520 adults, aged 25 on average, including parents with children younger than 5 years old.

The study found that mothers ate more fatty foods and drank about seven sugary drinks weekly, versus about four among childless women. Moms ate an average of 2,360 calories a day, 368 more than women who did not have kids.

Mothers who were questioned reported a little more than two hours of moderate activity each week, versus three hours weekly for childless women. Mothers also had a slightly higher average body-mass index than childless women.

The study found that fathers ate about the same amount of daily calories as childless men and both had an average BMI of about 25.  However, fathers got about five hours of physical activity a week, versus the seven that men without children had.

Sarah Krieger, an American Dietetic Association spokeswoman and dietitian in St. Petersburg, Florida, said in a statement that some of the mothers may have had postpartum depression, which might affect their eating and exercise habits.

The study was published online Monday in the journal Pediatrics.

New Chemo for Elderly Patients

(Ivanhoe Newswire) — Lymphoma is often difficult to treat in elderly patients because they cannot always tolerate chemotherapy. Now, a new study reveals a modified treatment approach may be an option for these patients.

The new approach uses a decreased dose of conventional chemotherapy combined with a standard dose of the drug rituximab. Between 2006 and 2009, 150 patients over 80 years of age were enrolled from 38 centers across France and Belgium. The patients all had diffuse large B-cell lymphoma, a common cancer in the elderly. They were given six cycles of the modified therapy — known as R-miniCHOP — at three-week intervals.

Results showed the median overall survival was 29 months, and the two-year overall survival rate was 59 percent. The researchers say the R-miniCHOP regimen was well-tolerated, as the full planned dose was achieved in 72 percent of patients. They say these findings suggest that a large proportion of patients older than age 80 can be cured of B-cell lymphoma.

“R-miniCHOP offers a good compromise between efficacy and safety … and should be the standard treatment for patients older than 80 years who have diffuse large B-cell lymphoma and a good performance status,” the researchers were quoted as saying.

SOURCE: Lancet Oncology, April 7, 2011

Do Organic Food Labels Mislead Consumers?

Does labeling food as organic mislead the brain into thinking it is better tasting and healthier than it actually is?

Jenny Wan-Chen Lee, a graduate student at Cornell University in New York, offers evidence that an organic label alone can sway how tasty, healthy and caloric people judge a food to be.

Lee asked 144 subjects at a local store to compare what they were told were conventionally and organically produced chocolate sandwich cookies, plain yogurt, and potato chips. In fact, all of the food products were organic; they were simply labeled as either regular or organic.

Participants were then asked to rate each food on a scale of 1 to 9 for 10 different attributes (e.g., overall taste, perception of fat content). She also asked them to estimate the number of calories in each food item.

The study, released at the American Society for Nutrition annual conference, revealed that on average consumers rated the organically labeled items a full point higher on the scale when it came to health. They also estimated them to contain, on average, 60 fewer calories.

An increasing number of studies are indicating that a ‘halo’ effect may apply to foods, and ultimately influence what and how much we eat.

For instance, research has shown that people tend to consume more calories at fast-food restaurants claiming to serve so-called healthier foods, compared to the amount they eat at a typical burger place.

A halo effect describes the positive feelings attributed to a product or person that we associate with other favorable impressions. In other words, people who perceive a food to be more nutritious may let their guard down when it comes to counting calories, ultimately leading them to overeat or feel entitled to indulge.

Specifically, some people mistakenly assume that organic foods will be more nutritious simply because they carry an organic label, an assumption that has long been debated among nutrition researchers.

Subjects reported preferring the taste characteristics of the organically-labeled foods, even though they were actually identical to their conventionally-labeled counterparts.

Foods labeled organic were also perceived to be much lower in calories, and participants expected to pay a higher price for them. Overall, organically labeled chips and cookies were considered more nutritious than their “non-organic” counterparts.

Dolphin Deaths Designated An ‘Unusual Mortality Event’

Scientists are baffled by the continuing numbers of dead baby bottlenose dolphins washing up on the shore of the Gulf of Mexico.

A total of 406 dolphins were found either stranded or dead between February 2010 and April 2011, prompting the National Oceanic and Atmospheric Administration (NOAA) to designate the deaths an “unusual mortality event” (UME). The agency defines such an event as a stranding that is unexpected or involves significant losses of any marine mammal population.

Blair Mase, the agency’s marine mammal investigations coordinator, told CNN: “This is quite a complex event and requires a lot of analysis.”

Mase said NOAA is working with several agencies to determine not only why the dolphins are turning up dead in such large numbers but also why the mammals are so young. “These were mostly very young dolphins, either pre-term, neonatal or very young and less than 115 centimeters,” Mase said.

A number of factors could be at play including harmful algal blooms, infectious diseases, water temperatures, environmental changes, and human impacts. “The Gulf of Mexico is no stranger to unusual mortality events,” Mase noted.

Much sensitivity surrounding marine life has come about since last year’s BP oil disaster, which spewed millions of barrels of crude oil into Gulf waters.

As recently as two weeks ago, scientists documented a dead dolphin with oil on its remains, said Mase. Since the start of the oil spill on April 20, 2010, a total of 15 bottlenose dolphins have been found with either confirmed or suspected oil on their carcasses. Nine oiled dolphins have been found since the gushing oil well was capped.

But of those nine, one was found with oil that did not match that of samples from the Deepwater Horizon.

Mase said the dolphin deaths could be completely unrelated to the oil spill. “Even though they have oil on them, it may not be the cause of death,” she said. “We want to look at the gamut of all the possibilities.”

Scientists are also concerned over the number of sea turtles being stranded. Similar to the dolphin deaths, an abnormally high number of turtles have been found floating close to shore or washed up along Gulf Coast shores.

“The vast majority of these are dead, in states of moderate to severe decomposition,” Barbara Schroeder, NOAA Fisheries national sea turtle coordinator, told CNN’s Vivian Kuo.

The majority of the turtles found have been Kemp’s Ridley sea turtles, but some have been loggerheads, which along with the Kemp’s are endangered.

“Since January 1st, we’ve had just under 100 strandings,” said Schroeder. “About 87 of those have been documented since the middle of March.”

Necropsies were performed on about a third of the turtles, Schroeder said. Seven of them showed indications that they had been in accidents with watercraft, while another displayed injuries consistent with being caught on a hook.

The others appeared to have died by drowning near the bottom of the Gulf, either from forced submergence or an acute toxic event.

Tissue samples from both turtles and dolphins are being documented because of the ongoing civil and criminal litigation with BP, according to Dr. Teri Rowles, a coordinator with the NOAA Fisheries Stranding Program.

“We are looking at what is the impact of the oil spill and the response activities to the oil spill event, and what impact they had on the Gulf of Mexico ecosystem,” said Rowles. “We did not say that the dolphins have died because of the oil, just that they have come back with oil on them.”

Vegans Less Prone To Cataracts

A British study has found that eating less meat and more vegetables is tied to a lower risk of cataracts.

Researchers found that about three in 50 meat eaters had cataracts, compared to about two in 50 vegans and vegetarians.

The results translated to a 30 to 40 percent lower cataract risk among vegetarians and vegans compared with the biggest meat eaters.

“People who don’t eat meat have a significantly lower risk of developing cataracts,” Naomi Allen, an epidemiologist at the UK’s University of Oxford who coauthored the study, told Reuters.

A cataract occurs when the lens of the eye becomes cloudy and blurs vision. According to the National Eye Institute, cataracts are more common in older people, and over half of Americans either have cataracts by the time they are 80 or have had surgery for them.

Allen told Reuters that the researchers’ findings do not mean that people should become vegetarians to avoid getting cataracts.

Smoking, diabetes, and exposure to bright sunlight are all linked to an increased risk for cataracts.

Dr. Jack Dodick, who chairs the department of ophthalmology at New York University Langone Medical Center, told Reuters that the new findings actually contradict a study done in India, where a vegetarian diet was associated with high numbers of cataracts.

“It means that still to this day we don’t know what influences cataracts. It may be more lifestyle. There may be other factors in causing cataract other than diet,” Dodick, who did not work on the current study, told Reuters Health.

The British researchers asked over 27,600 people older than 40 to fill out dietary surveys between 1993 and 1999.  The team then monitored the participants’ medical records between 2008 and 2009 to see if they developed cataracts.  About half of the participants had cataracts during the follow-up period.

The highest risk was seen among the heaviest meat-eaters, which were those who consumed over 3.5 ounces of meat a day.  Fish eaters’ risk was 15 percent lower than that of the heavy meat eaters, while vegetarians’ risk dropped 30 percent and vegans 40 percent.

Dodick said the study was well done, but there are “still a lot of questions that need to be answered.”

He said whether nutrition really plays a role in cataract risk is still unclear.

Cataract surgery can typically cost between $1,500 and $3,000.

“It’s the most performed operation in the U.S.,” Dodick told Reuters. “Approximately 3.5 million cataract surgeries are performed a year.”

To decrease the probability of early onset cataracts, “the top of my list would be always protect eyes against ultraviolet rays when outdoors (by wearing sunglasses),” Dodick said.

“The moral of the story is, live life in moderation,” Dodick added. “A healthy active lifestyle with exercise might decrease the risk of cataracts.”

The study was published in the American Journal of Clinical Nutrition.

When Deprived Of Online Media, Teens Suffer Withdrawal

A new study confirms that many young people, when deprived of their gadgets, show withdrawal symptoms comparable to those of drug addicts going cold turkey.

Researchers at the International Center for Media & the Public Agenda (ICMPA) at the University of Maryland found 79 percent of students relieved of their smartphones, computers and any online access for one day reported adverse reactions ranging from distress to confusion and isolation.

One in five reported feelings of withdrawal akin to an addiction, while 11 percent said they were confused or felt like a failure. Some students even reported stress from simply not being able to touch their phones.

Nearly one in five (19 percent) reported feelings of distress and 11 percent felt isolated. Only 21 percent said they appreciated the benefits of being unplugged from the online network.

“I am an addict. I don’t need alcohol, cocaine or any other derailing form of social depravity. Media is my drug; without it I was lost,” one participant reported.

Another wrote: “I literally didn’t know what to do with myself. Going down to the kitchen to pointlessly look in the cupboards became regular routine, as did getting a drink.”

Susan Moeller, lead researcher of the study, explains, “Technology provides the social network for young people today and they have spent their entire lives being plugged in. When the students did not have their mobile phones and other gadgets they reported they did get into more in-depth conversations.”

“Some said they wanted to go without technology for a while but they could not, as they could be ostracized by their friends,” she added, with students claiming that technology “absolutely” changed relationships.

The ICMPA study asked around 1,000 students to give up all media for 24 hours and record their experiences. If you are under 25 and living in almost any country, you not only can’t imagine life without your cell phone, laptop and mp3 player, you can’t function without them.

‘The World Unplugged’ study concluded that most college students, in any country, are strikingly similar in how, and how often, they use media. Story after story highlighted how thoroughly online media consumes their generation’s time and focus.

“My dependence on media is absolutely sickening,” said a student from Lebanon. “I felt like there was a problem with me,” wrote a student from Uganda.

One student from Hong Kong shared, “Because I became so addicted. I have less time for my studies and face-to-face meetings with my friends.”

Students were shocked by how much media dominates their lives. What was once thought of as a convenience, a way to communicate with friends and get news, turned out to be something far more: after the study, they came to recognize that they literally construct their identities through media. Going unplugged, therefore, was like losing part of themselves.

Risk Of Brain Damage From Long-term Ecstasy Use

Dutch researchers have found that long-term use of the drug ecstasy can cause structural brain damage, according to a report published in the Journal of Neurology, Neurosurgery and Psychiatry.

The study focused on the hippocampus, the area of the brain responsible for long-term memory.

MRI scans were used to measure the volume of the hippocampus in two groups of young men. Ten were long-term ecstasy users in their mid-20s, while the other group consisted of seven men in their early 20s with no history of ecstasy use.

Both groups of men had used similar amounts of recreational drugs apart from ecstasy and drank alcohol regularly, but the ecstasy group had used more amphetamine and cocaine.

Before the start of the study, the ecstasy group had abstained from the drug for an average of more than two months, having taken an average of 281 ecstasy tablets over the previous six and a half years, the journal reports.

The hippocampal volume in the ecstasy group was 10.5% smaller than in the non-ecstasy group, and the overall proportion of grey matter was about 4.6% lower on average. These figures were adjusted for total brain volume, the study found.
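
A minimal sketch of the kind of adjustment described, using invented numbers: each hippocampal volume is expressed relative to total brain volume before group means are compared. The paper's actual statistical correction may differ:

```python
# Illustrative normalization for the comparison described above.
# All volumes are invented; the study adjusted hippocampal volume
# for total brain volume before comparing users with controls.

def relative_volume(hippocampus_ml, total_brain_ml):
    """Hippocampal volume as a fraction of total brain volume."""
    return hippocampus_ml / total_brain_ml

def percent_smaller(users, controls):
    """Percent by which the users' mean falls below the controls' mean."""
    mean = lambda xs: sum(xs) / len(xs)
    return 100.0 * (mean(controls) - mean(users)) / mean(controls)

# Hypothetical adjusted volumes (fractions of total brain volume):
users = [relative_volume(3.2, 1150.0), relative_volume(3.1, 1100.0)]
controls = [relative_volume(3.6, 1160.0), relative_volume(3.5, 1120.0)]
print(f"{percent_smaller(users, controls):.1f}% smaller in users")  # ~10.1%
```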

Taken together with previous research that suggested people who use ecstasy develop significant memory problems, “these data provide preliminary evidence suggesting that ecstasy users may be prone to incurring hippocampal damage, following chronic use of this drug,” state the authors of the study.

“Hippocampal atrophy is a hallmark for disease of progressive cognitive impairment in older patients, such as Alzheimer’s disease,” the researchers wrote.

However, the Dutch government’s former lead advisor on drugs misuse, Professor David Nutt, says that “the interesting pilot study … is underpowered to provide definitive evidence of an effect of ecstasy.” The Guardian also reports Nutt as saying that the “evidence suggests that many drugs, including alcohol, can damage someone’s memory.”

Scientists Use Stem Cells To Grow Retina

In a major breakthrough in the field of regenerative medicine, scientists have for the first time created a part of the eye critical for vision using animal stem cells, according to a study published Wednesday in the journal Nature.

The research could pave the way to new treatments for blindness and human eye diseases, and experts say it may even be possible to one day restore vision with transplanted retinas generated from a patient’s own stem cells.

The researchers, led by Yoshiki Sasai of the RIKEN Center for Developmental Biology in Japan, conducted lab experiments using mice.  They started with pluripotent stem cells, the universal stem cells for nearly every specialized cell in an organism.  Until now, stem cells have mainly been viewed as a potential source of replacement tissue composed of a single type of cell, like muscle cells, for example.   The ability to generate a more complex set of cells, or even an entire organ, was thought to require intricate chemical interactions with neighboring tissues during gestation, and therefore impossible in the absence of the natural process of cell division and growth.

However, Sasai and colleagues used new techniques and were able to set in motion the transformation of mouse embryonic stem cells into an optic cup — the layered, three-dimensional structure that becomes the retina in an eye.

Most importantly, the cells did the work themselves, without being pushed into any particular shape.

“What we’ve been able to do is resolve a nearly century-old problem in embryology by showing that retinal precursors have the inherent ability to give rise to the complex structure of the optic cup,” said Sasai in a statement.

Starting as a disorganized mass, the stem cells formed themselves into the two-walled structure that corresponds to the inner and outer layers of the retina during the development of an embryo.

“We are now well on our way to becoming able to generate not only differentiated cell types, but organized tissues,” Sasai said.

The breakthrough is particularly applicable to a group of genetic eye conditions known as retinitis pigmentosa, which attack vision by damaging the retina, leading to blindness, Sasai said. People with retinitis pigmentosa experience a slow, gradual decline in vision as photoreceptor cells degenerate and die.

“As a step forward in the lead-up to cell replacement or even organ therapy, this is a really significant piece of work,” said Richard Lang, director of the visual systems group at the Cincinnati Children’s Hospital, during an interview with the AFP news agency.

“It shows that relatively simple culture conditions can be used to generate whole organ primordia,” he said, referring to the early, embryonic stage of organ development.

Lang said that while the goal of generating human eye tissue remains far off, other scientists are making parallel progress on other types of tissue.

“It feels like it won’t be long before the first opportunity for experimental clinical use comes along,” he said.

In a commentary about the study, Robin Ali and Jane Snowden of University College London wrote that the new self-generating proto-eye had the signature molecular markers of both the neural retina, which is linked to the brain, and the retinal pigmented epithelium, which helps keep the eye free of debris.

“An even more striking proof that these are genuine retinas is that, in culture, the synthetic optic cups undergo cell differentiation … into all the main retinal cell types, including photoreceptors,” they wrote.

The study was published online April 6, 2011 in the journal Nature.

Image Caption: An ES cell-derived optic cup virtually inserted into a test tube. Credit: RIKEN CDB/HO/M. Eiraku and Y. Sasai

Meditation More Powerful Than Drugs For Pain Management

A study published in the latest issue of the Journal of Neuroscience finds that meditation can deliver powerful pain-relieving effects to the brain, even with just 80 minutes of training for a beginner using an exercise called focused attention, AFP reports.

Fadel Zeidan, Ph.D., lead author of the study, explained, “This is the first study to show that only a little over an hour of meditation training can dramatically reduce both the experience of pain and pain-related brain activation.”

“We found a big effect — about a 40 percent reduction in pain intensity and a 57 percent reduction in pain unpleasantness. Meditation produced a greater reduction in pain than even morphine or other pain-relieving drugs, which typically reduce pain ratings by about 25 percent,” Zeidan, from Wake Forest Baptist Medical Center, added.

Fifteen volunteers who had never meditated attended four 20-minute sessions to learn how to control their breathing and ignore emotions and thoughts. Brain activity was monitored with a special type of magnetic resonance imaging called “arterial spin labeling magnetic resonance imaging” (ASL MRI) and recorded before and after the sessions.

ASL MRI can give more precise readings of longer-duration brain processes, such as meditation, than a standard MRI scan.

A device was placed on the participants’ right legs, heating a small area of their skin to 120° F, a temperature that most people find painful, but not damaging, over a five-minute period. Scans taken after meditation training showed that every participant’s pain ratings were reduced, with decreases ranging from 11 to 93 percent, Zeidan told the Telegraph.

The meditation was also shown to significantly reduce brain activity in the primary somatosensory cortex, an area involved in creating the feeling of where a painful stimulus is and how intense it is.

Scans taken before meditation training showed very high activity in this area. When participants meditated during the scans, no activity was detected in this important pain-processing region.

The research also showed that meditation increased brain activity in areas including the anterior cingulate cortex, anterior insula and the orbito-frontal cortex. This is where the brain stores its experience of pain and comes up with coping mechanisms.

“Consistent with this function, the more that these areas were activated by meditation, the more that pain was reduced. One of the reasons that meditation may have been so effective in blocking pain was that it did not work at just one place in the brain, but instead reduced pain at multiple levels of processing,” Zeidan said.

With so little training required to produce such dramatic pain-relieving effects, Zeidan and colleagues believe that meditation has great potential for clinical use. “This study shows that meditation produces real effects in the brain and can provide an effective way for people to substantially reduce their pain without medications,” Zeidan said.

On the Net:

New Technology For Stroke Rehabilitation

Devices which could be used to rehabilitate the arms and hands of people who have experienced a stroke have been developed by researchers at the University of Southampton.

In a paper to be presented this week (6 April) at the Institution of Engineering and Technology (IET) Assisted Living Conference, Dr Geoff Merrett, a lecturer in electronic systems and devices, will describe the design and evaluation of three technologies which could help people who are affected by stroke to regain movement in their hand and arm.

Dr Merrett worked with Dr Sara Demain, a lecturer in physiotherapy, and Dr Cheryl Metcalf, a researcher in electronic systems and devices, to develop three ‘tactile’ devices which generate a realistic ‘sense of touch’ and sensation – mimicking those involved in everyday activities.

Dr Demain says: “Most stroke rehabilitation systems ignore the role of sensation and they only allow people repetitive movement. Our aim is to develop technology which provides people with a sense of holding something or of feeling something, like, for example, holding a hot cup of tea, and we want to integrate this with improving motor function.”

Three tactile devices were developed and tested on patients who had had a stroke and on healthy participants. The devices were: a ‘vibration’ tactile device, which users felt provided a good indication of touch but did not really feel as if they were holding anything; a ‘motor-driven squeezer’ device, which users said felt like they were holding something, a bit like catching a ball; and a ‘shape memory alloy’ device which has thermal properties and creates a sensation like picking up a cup of tea.

Dr Merrett adds: “We now have a number of technologies, which we can use to develop sensation. This technology can be used on its own as a stand-alone system to help with sensory rehabilitation or it could be used alongside existing health technologies such as rehabilitation robots or gaming technologies which help patient rehabilitation.”

The academics’ paper, “Design and Qualitative Evaluation of Tactile Devices for Stroke Rehabilitation,” will be presented at the Institution of Engineering and Technology (IET) Assisted Living Conference.

Image Caption: The new technologies will help patient rehabilitation. Credit: University of Southampton

On the Net:

First Polymer Solar-Thermal Device Heats Home, Saves Money

A new polymer-based solar-thermal device is the first to generate power from both heat and visible sunlight, an advance that could shave the cost of heating a home by as much as 40 percent.

Geothermal add-ons for heat pumps on the market today collect heat from the air or the ground. This new device uses a fluid that flows through a roof-mounted module to collect heat from the sun while an integrated solar cell generates electricity from the sun’s visible light.

“It’s a systems approach to making your home ultra-efficient because the device collects both solar energy and heat,” said David Carroll, Ph.D., director of the Center for Nanotechnology and Molecular Materials at Wake Forest University. “Our solar-thermal device takes better advantage of the broad range of power delivered from the sun each day.”

Research showing the effectiveness of the device appears in the March issue of the peer-reviewed journal Solar Energy Materials and Solar Cells.

A standard rooftop solar cell will miss about 75 percent of the energy provided by the sun at any given time because it can’t collect the longest wavelengths of light: infrared heat. Such cells miss an even greater amount of the available daily solar power because they collect sunlight most efficiently only between 10 a.m. and 2 p.m.

“On a rooftop, you have a lot of visible sunlight and heat from the infrared radiation,” Carroll said. “The solar-cell industry has for the most part ignored the heat.”

The design of the new solar-thermal device takes advantage of this heat through an integrated array of clear tubes, five millimeters in diameter. They lie flat, and an oil blended with a proprietary dye flows through them. The visible sunlight shines into the clear tube and the oil inside, and is converted to electricity by a spray-on polymer photovoltaic on the back of the tubes. This process superheats the oil, which would then flow into the heat pump, for example, to transfer the heat inside a home.

Unlike the flat solar cells used today, the curve of the tubes inside the new device allows for the collection of both visible light and infrared heat from nearly sunrise to sunset. This means it provides power for a much greater part of the day than does a normal solar cell.
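
The midday bias of a flat panel can be seen in a toy model. The sketch below is a rough illustration, not the Wake Forest team's model: it assumes a cloudless 12-hour day, approximates the sun's elevation as a linear ramp, and scores a horizontal panel by the sine of that elevation, which is why output bunches up around noon while a collector that also accepts oblique light keeps producing toward sunrise and sunset.

```python
import math

def flat_panel_output(hour, solar_noon=12.0, day_length=12.0):
    """Relative output of a horizontal flat panel (toy model).

    Solar elevation is approximated as a linear ramp from 0 at sunrise
    to 90 degrees at solar noon; output scales with the sine of the
    elevation (the cosine of the zenith angle).
    """
    t = (hour - solar_noon) / (day_length / 2.0)  # -1 at sunrise, +1 at sunset
    if abs(t) >= 1.0:
        return 0.0
    elevation = (math.pi / 2.0) * (1.0 - abs(t))
    return math.sin(elevation)

for hour in range(6, 19):
    bar = "#" * round(20 * flat_panel_output(hour))
    print(f"{hour:02d}:00 {bar}")
```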

Because of the general structure and the ability to capture light at oblique angles, this is also the first solar-thermal device that can be truly building-integrated: it can be made to look nearly identical to roofing tiles used today.

Tests of the solar-thermal device have shown 30 percent efficiency in converting solar energy to power. By comparison, a standard solar cell with a polymer absorber has shown no greater than 8 percent conversion efficiency.

The research team will build the first square-meter-size solar-thermal cell this summer, a key step in getting the technology ready for market.

On the Net:

Self-cooling Seen In Graphene Electronics

With the first observation of thermoelectric effects at graphene contacts, University of Illinois researchers found that graphene transistors have a nanoscale cooling effect that reduces their temperature.

Led by mechanical science and engineering professor William King and electrical and computer engineering professor Eric Pop, the team will publish its findings in the April 3 advance online edition of the journal Nature Nanotechnology.

The speed and size of computer chips are limited by how much heat they dissipate. All electronics dissipate heat as a result of the electrons in the current colliding with the device material, a phenomenon called resistive heating. This heating outweighs other smaller thermoelectric effects that can locally cool a device. Computers with silicon chips use fans or flowing water to cool the transistors, a process that consumes much of the energy required to power a device.

Future computer chips made out of graphene, carbon sheets 1 atom thick, could be faster than silicon chips and operate at lower power. However, a thorough understanding of heat generation and distribution in graphene devices has eluded researchers because of the tiny dimensions involved.

The Illinois team used an atomic force microscope tip as a temperature probe to make the first nanometer-scale temperature measurements of a working graphene transistor. The measurements revealed surprising temperature phenomena at the points where the graphene transistor touches the metal connections. They found that thermoelectric cooling effects can be stronger at graphene contacts than resistive heating, actually lowering the temperature of the transistor.

“In silicon and most materials, the electronic heating is much larger than the self-cooling,” King said. “However, we found that in these graphene transistors, there are regions where the thermoelectric cooling can be larger than the resistive heating, which allows these devices to cool themselves. This self-cooling has not previously been seen for graphene devices.”
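
The balance King describes can be sketched with a back-of-the-envelope comparison. All of the numbers below are hypothetical placeholders, not measurements from the paper: resistive (Joule) heating grows with the square of the current, while Peltier-type thermoelectric cooling at a contact grows only linearly, so at sufficiently small currents the cooling term can win.

```python
# Hypothetical contact parameters (placeholders, not the paper's data).
current = 50e-6             # device current, amperes
contact_resistance = 200.0  # contact resistance, ohms
peltier_coefficient = 0.05  # effective Peltier coefficient, volts

joule_heating = current ** 2 * contact_resistance  # P = I^2 * R  (watts)
peltier_cooling = peltier_coefficient * current    # Q = Pi * I   (watts)

print(f"Joule heating:   {joule_heating * 1e6:.2f} microwatts")
print(f"Peltier cooling: {peltier_cooling * 1e6:.2f} microwatts")
print("Net effect:", "cooling" if peltier_cooling > joule_heating else "heating")
```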

This self-cooling effect means that graphene-based electronics could require little or no cooling, yielding even greater energy efficiency and increasing graphene’s attractiveness as a silicon replacement.

“Graphene electronics are still in their infancy; however, our measurements and simulations project that thermoelectric effects will become enhanced as graphene transistor technology and contacts improve,” said Pop, who is also affiliated with the Beckman Institute for Advanced Science and Technology, and the Micro and Nanotechnology Laboratory at the U. of I.

Next, the researchers plan to use the AFM temperature probe to study heating and cooling in carbon nanotubes and other nanomaterials.

King also is affiliated with the department of materials science and engineering, the Frederick Seitz Materials Research Laboratory, the Beckman Institute, and the Micro and Nanotechnology Laboratory.

The Air Force Office of Scientific Research and the Office of Naval Research supported this work. Co-authors of the paper included graduate student Kyle Grosse, undergraduate Feifei Lian and postdoctoral researcher Myung-Ho Bae.

Image Caption: An atomic force microscope tip scans the surface of a graphene-metal contact to measure temperature with spatial resolution of about 10 nm and temperature resolution of about 250 mK. Color represents temperature data. Credit: Alex Jerez, Beckman Institute for Advanced Science and Technology

On the Net:

New Implant Better Than Open Heart Surgery?

Some patients can avoid open heart surgery with a new type of heart valve that is placed through a tube threaded into an artery, cardiologists report in a new study.

However, the downsides are a higher risk of stroke and uncertainty about how long these valves will last.

Edwards Lifesciences Corp. developed the valve replacement technique, which spares patients open heart surgery and showed a lower death rate than surgery at one year.

After one year, 24.2% of patients with the Edwards valve had died, compared with 26.8% of patients who had open heart surgery.

Patients suffering from severe aortic stenosis can benefit from the Edwards valve. Aortic stenosis is described as “a clogged valve that impedes the pathway of oxygen-rich blood by making the heart work harder to pump blood through a narrowing opening,” reports AFP.

About 9% of Americans over the age of 65 suffer from this condition, and up to half of patients die within two years if no treatment is given.

Dr. Edward McNulty, a cardiologist at the University of California, San Francisco, explained how this method works to the AP.

“Through an artery in the groin or the chest, a new heart valve is literally crimped on a balloon and advanced across the narrowed, older, diseased heart valve. The balloon is inflated and the new valve left in place.”

The survival rate with the new valve is similar to that of conventional surgery, although the risks of stroke and other major complications are elevated. The new method is less invasive, however, and gives patients who are too sick for surgery a higher chance of survival.

The AP reports that Dr. Elliot Antman, a Brigham and Women’s Hospital cardiologist and American Heart Association spokesperson, says that this would be a great option for those who are too sick to have surgery, but that it is too soon to use it for patients who are less sick.

Even though it is less invasive, some patients may still prefer open heart surgery because they fear the new method’s higher stroke rate. The study found that 5.1% of those who had the valve implanted experienced a stroke, while open heart surgery patients had a stroke rate of only 2.4%.
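
Put side by side, the trial numbers quoted in this article trade a modest absolute survival benefit for roughly double the stroke risk; the short sketch below simply restates that arithmetic.

```python
# One-year outcome figures as reported in the article.
valve_death, surgery_death = 0.242, 0.268
valve_stroke, surgery_stroke = 0.051, 0.024

mortality_benefit = (surgery_death - valve_death) * 100
relative_stroke_risk = valve_stroke / surgery_stroke

print(f"Absolute mortality benefit: {mortality_benefit:.1f} percentage points")
print(f"Relative stroke risk (valve vs. surgery): {relative_stroke_risk:.1f}x")
```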

Despite the higher risk of stroke, cardiologists and heart experts believe that the Edwards valve used in what is called the Sapien method is the next major turning point in heart disease treatment.

AFP reports that this method is already being used in Europe, but it has not gained approval from the U.S. Food and Drug Administration. The agency considers the valve an investigational device.

“This probably will be seen as one of the biggest steps in cardiovascular medicine, as far as intervention is concerned, potentially in our lifetime,” says David Moliterno, professor of medicine at the University of Kentucky.

Dr. Craig Smith, heart surgery chief at Columbia University and New York-Presbyterian Hospital, led the study that consisted of 699 patients. The results were presented at an American College of Cardiology conference in New Orleans.

This clinical trial study, called Partner, was paid for by Edwards Lifesciences Corp. of Irvine, California. The company is currently seeking federal approval to sell the valve for patients who cannot be operated on, and has plans to ask that the valves be used for less sick patients.

One patient in the study, 89-year-old Charles Cohen, received an artery-placed valve two years ago at Cedars-Sinai Medical Center in Los Angeles.

The AP reports that Cohen was “a little leery” about either option, but he was hoping for the catheter method.

He says, “Now I walk half a mile,” whereas before he could only walk a block or so before having to stop and rest.

On the Net:

500,000 US Deaths Each Year From Smoking

A new study finds that smoking kills half a million Americans each year, with slightly more men than women dying from tobacco-related causes. 

Although the rates of smoking-related deaths in men were comparable to that found in previous studies, the numbers for women were higher than expected, the researchers said.

Dr. Brian Rostron, who at the time of the study worked at the University of California, Berkeley, used data from a national health survey that queried a quarter million people about whether they were current or former smokers, and how often they had smoked.

He then followed the participants for 2 to 9 years, and found that about 17,000 had died by 2006, when the study concluded.

Dr. Rostron, who now works for the U.S. Food and Drug Administration, calculated the mortality rates for smokers and non-smokers of different ages and genders.  He then applied the additional risks due to smoking to the total U.S. population.  

The calculations showed that there were 291,000 smoking-related deaths in men each year between 2002 and 2006, and 229,000 in women.
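
The calculation behind such totals is conceptually simple: for each age and sex group, multiply the extra death rate among smokers by the number of smokers in that group, then sum across groups. The sketch below illustrates the idea with made-up numbers, not Dr. Rostron's actual data.

```python
# Hypothetical strata: (never-smoker annual death rate,
# smoker annual death rate, number of smokers in the stratum).
strata = [
    (0.005, 0.012, 4_000_000),  # e.g., men aged 55-64 (made-up values)
    (0.012, 0.036, 3_000_000),  # e.g., men aged 65-74 (made-up values)
]

excess_deaths = sum((smoker - never) * n for never, smoker, n in strata)
print(f"Smoking-attributable deaths across these strata: {excess_deaths:,.0f}")
```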

Some 2.5 million people die each year in the U.S., according to the U.S. Centers for Disease Control and Prevention. 

Among all current and former smokers, the greatest increase in risk of a tobacco-related death occurred between the ages of 65 and 74. After accounting for other factors such as weight and alcohol consumption, people in that age group were three times as likely to die from any cause if they currently smoked between one and two packs of cigarettes a day, compared to those who had never smoked.

“These figures are generally consistent with, but somewhat higher than, published estimates from the Centers for Disease Control and Prevention, particularly for women,” Dr. Rostron said.

The number of current U.S. smokers has declined in recent decades, with about 7 percent of U.S. adults now smoking more than 20 cigarettes per day, compared with 23 percent in 1965, according to a recent study published in the Journal of the American Medical Association. 

The current study was published online March 3, 2011 in the journal Epidemiology.

Bacteria Outbreak Claims Nine Lives In Alabama

Contaminated total parenteral nutrition (TPN) fed intravenously to 19 patients at six Alabama hospitals is a possible culprit in the deaths of nine patients; 10 others were left seriously ill, reports AFP.

The Alabama Department of Public Heath (ADPH) released a statement saying, “TPN is a liquid nutrition fed through an IV using a catheter. Use of contaminated products may lead to bacterial infection of the blood.”

The affected hospitals include Baptist Princeton, Baptist Shelby, Baptist Prattville, Medical West, Cooper Green Mercy and Select Specialty Hospital in Birmingham.

An investigation by the Centers for Disease Control and Prevention (CDC) is underway for a possible outbreak of Serratia marcescens bacteremia, a bacterial infection of the blood. Although all the stricken patients were fed intravenously, it is not yet clear whether the bacteria were contracted through the TPN.

“Of the 19 that received the substance, nine of those are no longer living,” says Dr. Jim McVay, a senior official with the ADPH.

He also mentions, “These were very fragile individuals and it’s not clear whether the bacteria contributed to their deaths.”

The bacteria were first identified in the patients, and cultures of the TPN were then tested, McVay says.

The TPN in question was identified as being produced by a single pharmacy, Meds IV. All of the hospitals involved were determined to have received the TPN from this pharmacy, the ADPH stated, and all have since stopped using it. Meds IV’s production of the TPN could be the common link to the contamination.

As of March 24, the pharmacy has informed its customers of the possibility of contamination, recalled all of its IV compounded products and halted production.

The pharmacy and all six hospitals are cooperating with the investigation, and the U.S. Food and Drug Administration is aware of the voluntary recall, reports Reuters.

On the Net:

Chinese Herb Whitens Skin

(Ivanhoe Newswire) — Scientists have discovered a more effective alternative to skin whitening creams, and it comes from an ancient Chinese herb.

Skin whitening is popular in countries like China, Japan, Korea and India, where many women view whiter skin as a symbol of beauty, good health and high social status. According to one study, about half the women in Asian countries use lightening creams. This adds up to billions of dollars a year.

Many whitening creams and lotions contain substances like toxic mercury, hydroquinone and other potentially harmful chemicals. Some whitening creams are even thought to increase the risk of skin cancer.

Researchers isolated two chemicals from the evergreen bush known as Cinnamomum subavenium. These two chemicals have the ability to block tyrosinase, which is an enzyme that controls the synthesis of melanin — a dark pigment responsible for coloring skin, hair and eyes.

They tested the “melanogenesis inhibitors” on embryos of zebrafish that had a highly visible band of black pigment. Exposure to low levels of the two chemicals reduced melanin production in the fish embryos by almost 50 percent within four days, turning the embryos white.

“When we saw the results, we were amazed,” Hui-Min Wang, who is with Kaohsiung Medical University in Taiwan, was quoted as saying. “My first thought was, ‘Well, if these herbal whiteners can transform zebrafish embryos from black to white, maybe they can also lighten women’s skin.'”

Wang estimated that the chemicals are 100 times as effective in reducing melanin pigmentation as common skin-whitening agents. The substances did not appear to be toxic when tested in low doses on both human skin cells and zebrafish embryos.

SOURCE: 241st National Meeting and Exposition of the American Chemical Society in Anaheim, March 29, 2011

Small Birds Making A Comeback In UK After Harsh Winter

After a dramatic decline in their numbers last spring, a survey by the Royal Society for the Protection of Birds (RSPB) called “Big Garden Birdwatch” found small bird numbers bouncing back.

The winter of 2009 and 2010 was one of the coldest in 30 years, and experts feared the worst for small birds. However, the RSPB’s survey shows some of the species that were devastated by the long, harsh winter returning.

More than half a million people participated in counting birds in their garden for an hour on January 29 and 30. Over 70 species were recorded in 300,780 gardens.

“We expected last year’s trend to continue, and we were really concerned that this decline in small birds would continue,” says Richard Bashford, project manager of the Big Garden Birdwatch. Despite the U.K.’s coldest winter in 100 years, Bashford was surprised by the increase in small birds.

The survey saw the U.K.’s smallest bird, the goldcrest, double in numbers. Long-tailed tits increased by a third, and coal tits by a quarter. Some 7,000 waxwings were recorded in almost 1,000 gardens. These birds flew in from Scandinavia in an influx known as a “waxwing winter” that occurs only every few years, reports AFP.

“We knew this was going to be a bumper year for waxwings as we’d had so many reports from all over the UK. But the Big Garden Birdwatch is the first indicator of exactly how many were seen in gardens, and we’re pleased that so many people got to enjoy sightings of these beautiful birds,” says Mark Eaton, a scientist at the RSPB and co-author of the survey.

A good breeding season put the house sparrow at the top of the list of birds spotted in gardens, followed by the starling, the blackbird, the blue tit, the chaffinch, the wood pigeon, the great tit, the goldfinch and the robin, with the collared dove rounding out the top 10.

Although house sparrows are at the top of the list, their numbers are far lower than the first Birdwatch survey that took place in 1979. Starling numbers have also fallen by three quarters since the first survey, reports AFP.

On the other hand, the blue tits, wood pigeons and collared doves are spotted in greater numbers than the original survey. In addition, the goldfinch, which was not spotted in 1979, is now eighth on the top ten list.

“It appears that many may have had a decent breeding season and have been able to bounce back a little. But we mustn’t be complacent. Another hard winter could see numbers back down, so it’s important everyone continues to feed their garden birds,” says Sarah Kelly, Big Garden Birdwatch coordinator.

On the Net:

‘Real World’ Ecstasy Users Risk Brain Damage

Australian researchers studying the effects of drug use in the real world attended parties where people were using the drug known as ecstasy (whose pills often also contain a variety of other drugs), and discovered that the drug poses a far greater risk to users’ brains than previously believed.

Dr. Thomas Newton, a professor at Baylor College of Medicine, who was not involved in the study, told Reuters that what was most concerning “is that most studies looking at toxicity in people or animals look at a single drug.”

“We have no idea what happens when you start mixing like this,” he added.

For the study, 56 people who had used ecstasy at least 5 times in the past allowed researchers to attend parties with them where they used ecstasy once again.

The researchers collected pill samples and also measured users’ blood levels of MDMA — a chemical found in the drug — every hour for five hours after the drug was taken. For participating and following through with the study, each user received the equivalent of 205 US dollars.

The team of researchers found that the amount of MDMA in some of the users reached levels that can cause severe brain injury or death in primates.

Only half of the pills taken consisted entirely of MDMA. The other half also contained methamphetamine or other chemicals related to MDMA. Some of the pills had no MDMA at all. But in the ones that did contain MDMA, the levels ranged from 25 mg to as much as 250 mg.

“This highlights a significant public health concern, particularly regarding the existence of pills containing more than 200 mg of MDMA,” the authors wrote.

The researchers also noted that the number of pills taken by the participants varied. Most of the users ingested more than one pill, and some took as many as five in a single night.

“Taking multiple pills is likely to lead to very high blood concentration, which may be harmful,” Dr. Rod Irvine, lead author of the study, published in the journal Addiction, told Reuters Health in an email.

“We were surprised that the…concentrations continued to rise throughout the study,” Irvine, a professor at the University of Adelaide, said. “The higher levels are approaching those that have been shown to be damaging to brain cells in animal models.”

Three of the participants had blood concentrations greater than 0.7 mg/L (700 micrograms per liter), a level that lab studies have found to be highly poisonous in primates. Another three of the users had concentrations close to that level.

Irvine said that most users continued to take more ecstasy throughout the night, even though blood concentration levels from their first pill had not reached peak levels.

It is possible that users may develop a tolerance to the drug while they are using it, noted the authors, adding that it could make them feel less intoxicated even while blood concentrations continue to increase.
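
A simple one-compartment pharmacokinetic model makes the redosing problem concrete. The sketch below is purely illustrative, with made-up absorption and elimination rates rather than MDMA's actual kinetics: because each pill is swallowed before the previous one has peaked, the combined blood concentration keeps climbing for hours.

```python
import math

# Hypothetical first-order rate constants (per hour), not MDMA's real values.
ka, ke = 1.2, 0.12  # absorption, elimination

def concentration(t, dose=1.0):
    """Single-dose blood level at time t (Bateman equation, arbitrary units)."""
    if t < 0:
        return 0.0
    return dose * ka / (ka - ke) * (math.exp(-ke * t) - math.exp(-ka * t))

dose_times = [0.0, 1.5, 3.0]  # three pills over one night (hypothetical)
for hour in range(7):
    total = sum(concentration(hour - t) for t in dose_times)
    print(f"hour {hour}: total concentration {total:.2f}")
```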

The authors also noted that none of the ecstasy users in the study suffered any immediate health problems after using the pills.

The US National Institute on Drug Abuse states that ecstasy can interfere with heart rate and temperature regulation and can cause brain damage.

Irvine said that collecting data in the real world is a valuable way to get a sense of what people are actually exposing themselves to. In fourteen of the ecstasy users in the study, the amount of MDMA in the blood reached levels that had never been studied in humans in the lab, he said.

In lab studies, ethical considerations prevent researchers from testing such potentially damaging and deadly doses in people, so the amounts they experiment with “do not reflect the range used naturally,” Irvine wrote in the study report.

The National Institute on Drug Abuse reports that seven in every 100 twelfth-grade students say they have tried ecstasy.

The research was funded by the National Health and Medical Research Council of Australia.

On the Net:

Amazon Losing Its Green Flair After 2010 Drought

A new study funded by NASA has found that Amazon forests are losing their greenness due to last year’s record-breaking drought.

“The greenness levels of Amazonian vegetation — a measure of its health — decreased dramatically over an area more than three and one-half times the size of Texas and did not recover to normal levels, even after the drought ended in late October 2010,” Liang Xu of Boston University and the study’s lead author said in a statement.

Researchers used computer models to predict that climate change could cause some of the rainforests to be replaced by grasslands or woody savannas. Such a change would release carbon stored in the rotting wood into the atmosphere, which could in turn accelerate global warming.

An international team of scientists used over a decade’s worth of satellite data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) and Tropical Rainfall Measuring Mission.

The team used these instruments to produce detailed maps of vegetation greenness declines from the 2010 drought.

The researchers first developed maps of drought-affected areas using thresholds of below-average rainfall as a guide.  They then identified vegetation using two different greenness indexes as surrogates for green leaf areas and physiological functioning.
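
The first of those indexes, named in the image captions below, is the Normalized Difference Vegetation Index (NDVI), which compares how strongly a surface reflects near-infrared versus red light; healthy leaves reflect near-infrared strongly and absorb red. The sketch below computes it from hypothetical reflectance values, since real inputs would come from the MODIS red and near-infrared bands.

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); higher values mean greener vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances between 0 and 1:
print(f"Dense, healthy canopy:   {ndvi(nir=0.50, red=0.08):.2f}")  # ~0.72
print(f"Drought-stressed canopy: {ndvi(nir=0.40, red=0.15):.2f}")  # ~0.45
```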

The maps showed that the 2010 drought reduced the greenness of about 965,000 square miles of vegetation in the Amazon, over four times the area affected by the last severe drought, in 2005.

“The MODIS vegetation greenness data suggest a more widespread, severe and long-lasting impact to Amazonian vegetation than what can be inferred based solely on rainfall data,” Arindam Samanta, a co-lead author from Atmospheric and Environmental Research Inc. in Lexington, Mass., said in a statement.

The 2010 drought brought record low water levels in rivers across the Amazon basin, including the Rio Negro.

“Last year was the driest year on record based on 109 years of Rio Negro water level data at the Manaus harbor. For comparison, the lowest level during the so-called once-in-a-century drought in 2005 was only the eighth lowest,” Marcos Costa, coauthor from the Federal University of Viçosa, Brazil, said in a statement.

The team also used the NASA Earth Exchange (NEX) in their research.  The scientists used NEX to obtain a large-scale view of the impact of the drought on the Amazon forests.

“Timely monitoring of our planet’s vegetation with satellites is critical, and with NEX it can be done efficiently to deliver near-real-time information, as this study demonstrates,” study coauthor Ramakrishna Nemani, a research scientist at NASA’s Ames Research Center, said in a statement.

The study will be published in Geophysical Research Letters, which is a journal of the American Geophysical Union.

Image 1: NASA satellite sensors, such as MODIS, showed an average pattern of greenness of vegetation in South America: Amazon forests, which have very high leaf area, are shown in red and purple colors; the adjacent cerrado (savannas), which have lower leaf area, are shown in shades of green; and the coastal deserts are shown in yellow. Image Credit: Boston University/NASA

Image 2: Red and orange identify areas where satellite measurements indicated reduced Normalized Difference Vegetation Index (first index of greenness) of the Amazon forest during the 2010 drought. Image Credit: Boston University/NASA

Image 3: Red and orange identify areas where satellite measurements indicated reduced Enhanced Vegetation Index (second index of greenness) of the Amazon forest during the 2010 drought. Image Credit: Boston University/NASA

On the Net:

How Heat Is Transported To Greenland Glaciers

Warmer air is only part of the story when it comes to Greenland’s rapidly melting ice sheet. New research by scientists at Woods Hole Oceanographic Institution (WHOI) highlights the role ocean circulation plays in transporting heat to glaciers.

Greenland’s ice sheet has lost mass at an accelerated rate over the last decade, dumping more ice and fresh water into the ocean. Between 2001 and 2005, Helheim Glacier, a large glacier on Greenland’s southeast coast, retreated 5 miles (8 kilometers) and its flow speed nearly doubled.

A research team led by WHOI physical oceanographer Fiamma Straneo discovered warm, subtropical waters deep inside Sermilik Fjord at the base of Helheim Glacier in 2009.  “We knew that these warm waters were reaching the fjords, but we did not know if they were reaching the glaciers or how the melting was occurring,” says Straneo, lead author of the new study on fjord dynamics published online in the March 20 edition of the journal Nature Geoscience.

The team returned to Greenland in March 2010 to do the first-ever winter survey of the fjord. Using a tiny boat and a helicopter, Straneo and her colleague, Kjetil Våge of the University of Bergen, Norway, were able to launch probes closer to the glacier than ever before, about 2.5 miles from the glacier’s edge. Coupled with data from August 2009, details began to emerge of a complicated interaction between glacier ice, freshwater runoff and warm, salty ocean waters.

“People always thought the circulation here would be simple: warm waters coming into the fjords at depth, melting the glaciers. Then the mixture of warm water and meltwater rises because it is lighter, and comes out at the top. Nice and neat,” says Straneo. “But it’s much more complex than that.”

The fjords contain cold, fresh Arctic water on top and warm, salty waters from the Gulf Stream at the bottom. Melted waters do rise somewhat, but not all the way to the top.

“It’s too dense,” Straneo says. “It actually comes out at the interface where the Arctic water and warm water meet.”  This distinction is important, adds Straneo, because it prevents the heat contained in the deep waters from melting the upper third of the glacier. Instead, the glacier develops a floating ice tongue, a shelf of ice that extends from the main body of the glacier out onto the waters of the fjord. The shape of the ice tongue influences the stability of the glacier and how quickly it flows.
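
Why the meltwater mixture stalls at that interface comes down to density, which in seawater rises with salinity and falls with temperature. The sketch below uses a simplified linear equation of state and hypothetical temperature and salinity values, not the study's measurements, to show a mixture that is lighter than the deep subtropical water but still heavier than the cold, fresh Arctic layer above it.

```python
# Linearized seawater density: an approximation, not the study's model.
RHO0, T0, S0 = 1027.0, 10.0, 35.0  # reference density (kg/m^3), temperature (C), salinity
ALPHA, BETA = 1.7e-4, 7.6e-4       # thermal expansion / haline contraction coefficients

def density(temp, sal):
    return RHO0 * (1 - ALPHA * (temp - T0) + BETA * (sal - S0))

arctic_layer = density(temp=-1.0, sal=30.0)      # cold, fresh layer on top
subtropical_layer = density(temp=4.0, sal=35.0)  # warm, salty layer at depth
melt_mixture = density(temp=1.0, sal=33.0)       # hypothetical glacier outflow

print(f"Arctic layer:      {arctic_layer:.2f} kg/m^3")
print(f"Melt mixture:      {melt_mixture:.2f} kg/m^3")
print(f"Subtropical layer: {subtropical_layer:.2f} kg/m^3")
# The mixture is denser than the surface layer but lighter than the deep
# layer, so it spreads out at the interface instead of rising to the top.
```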

In addition, the team found that vigorous currents within the fjord driven by winds and tides also play a part in melting and flow speed. “The currents in the fjord are like waves in a bath tub,” Straneo says. “This oscillation and mixing contribute to heat transport to the glaciers.”

The March 2010 trip marked the first time the researchers were able to observe winter-time conditions in the fjord, which is how the system probably works nine months out of the year.

“One surprise we found was that the warm waters in the fjord are actually 1 degree Celsius warmer in winter, which by Greenland standards is a lot,” Straneo says. “It raises the possibility that winter melt rates might be larger than those in the summer.

“Current climate models do not take these factors into account,” she adds. “We’re just beginning to understand all of the pieces. We need to know more about how the ocean changes at the glaciers edge. It’s critical to improving predictions of future ice sheet variability and sea level rise.”

Co-authors of the work include Ruth Curry and Claudia Cenedese of WHOI, David Sutherland of the University of Washington, Gordon Hamilton of the University of Maine, Leigh Stearns of the University of Kansas, and Kjetil Våge of the University of Bergen, Norway.

Funding for this research was provided by the National Science Foundation, WHOI’s Ocean and Climate Change Institute Arctic Research Initiative, and NASA’s Cryospheric Sciences Program.

Image 1: Using a tiny boat and a helicopter, the research team returned to Greenland in March 2010 to do the first-ever winter survey of Sermilik Fjord at the base of Helheim Glacier. During the trip, they were able to launch probes closer to the glacier than ever before, about 2.5 miles from the glacier’s edge. (Fiamma Straneo, Woods Hole Oceanographic Institution)

Image 2: WHOI physical oceanographer Fiamma Straneo lowers a Conductivity/Temperature/Depth Recorder (CTD) into Sermilik Fjord in March 2010. (Kjetil Vage, Woods Hole Oceanographic Institution)

On the Net:

Walnuts Are The Best Heart Healthy Antioxidant Nut

(Ivanhoe Newswire) — A new study puts walnuts in the No. 1 slot among a family of foods that lay claim to being Mother Nature’s most nearly perfect packaged foods: tree and ground nuts. According to this report, walnuts have a combination of more healthful antioxidants and higher-quality antioxidants than any other nut.

“Walnuts rank above peanuts, almonds, pecans, pistachios and other nuts,” Joe Vinson, Ph.D., who did the analysis, was quoted as saying. “A handful of walnuts contains almost twice as many antioxidants as an equivalent amount of any other commonly consumed nut. But unfortunately, people don’t eat a lot of them. This study suggests that consumers should eat more walnuts as part of a healthy diet.”

Vinson noted that nuts in general have an unusual combination of nutritional benefits, in addition to those antioxidants, wrapped into a convenient and inexpensive package. Nuts, for instance, contain plenty of high-quality protein that can substitute for meat, plus vitamins, minerals and dietary fiber, and they are dairy- and gluten-free. Years of research by scientists around the world link regular consumption of small amounts of nuts or peanut butter with decreased risk of heart disease, certain kinds of cancer, gallstones, Type 2 diabetes, and other health problems.

Vinson analyzed antioxidants in nine different types of nuts: walnuts, almonds, peanuts, pistachios, hazelnuts, Brazil nuts, cashews, macadamias, and pecans. Walnuts had the highest levels of antioxidants.

Vinson also found that the quality, or potency, of antioxidants present in walnuts was highest among the nuts. Antioxidants in walnuts were 2-15 times as potent as vitamin E, renowned for its powerful antioxidant effects that protect the body against damaging natural chemicals involved in causing disease.

“There’s another advantage in choosing walnuts as a source of antioxidants,” said Vinson, who is with the University of Scranton in Pennsylvania. “The heat from roasting nuts generally reduces the quality of the antioxidants. People usually eat walnuts raw or unroasted, and get the full effectiveness of those antioxidants.”

If nuts are so healthful and nutritious, why don’t people eat more? Vinson’s research shows, for instance, that nuts account for barely 8 percent of the daily antioxidants in the average person’s diet. Many people, he said, may not be aware that nuts are such a healthful food. Others may be concerned about gaining weight from a food so high in fat and calories. But he points out that nuts contain healthful polyunsaturated and monounsaturated fats rather than artery-clogging saturated fat. As for the calories, eating nuts does not appear to cause weight gain and even makes people feel full and less likely to overeat. In a 2009 U.S. study, nut consumption was associated with a significantly lower risk of weight gain and obesity. Still, consumers should keep the portion size small. Vinson said it takes only about seven walnuts a day, for instance, to get the potential health benefits uncovered in previous studies.

SOURCE: 241st National Meeting & Exposition of the American Chemical Society held in Anaheim, California from March 27-31, 2011

Chicken Fat Biofuel: Eco-friendly Jet Fuel Alternative?

In an RV nicknamed after an urban assault vehicle, scientists from NASA’s Langley Research Center traveled cross-country this month for an experiment with eco-friendly jet fuel.

The Langley team drove 2,600 miles (4,184 km) from Hampton, Va., to meet up with other researchers at NASA’s Dryden Flight Research Center in California.

Researchers are testing the biofuel on a NASA DC-8 to measure its performance and emissions as part of the Alternative Aviation Fuel Experiment II, or AAFEX II. The fuel is called Hydrotreated Renewable Jet Fuel.

“It’s made out of chicken fat, actually,” said Langley’s Bruce Anderson, AAFEX II project scientist. “The Air Force bought many thousands of gallons of this to burn in some of their jets and provided about 8,000 gallons (30,283 liters) to NASA for this experiment.”

Anderson and his team will test a 50-50 mix of biofuel and regular jet fuel, biofuel only, and jet fuel only. The jet fuel is Jet Propellant 8, or JP-8, a kerosene-like mix of hydrocarbons.

Two of the team members headed west in a specially equipped 32-foot (9.75 m) van on loan from Langley’s Aviation Safety Program. It’s dubbed “EM-50” by researchers after the urban assault vehicle used in the 1981 comedy “Stripes” with Bill Murray.

Collaborative Effort

Three more researchers from Langley flew to the experiment, and researchers from Dryden and NASA’s Glenn Research Center in Ohio have key roles as well. The effort includes investigators and consultants from private industry, other federal organizations, and academia. In all, 17 organizations are participating in AAFEX II.

“This is going to be a lot of hard work,” said Anderson.

Glenn researchers shipped instruments that will be used to measure particulate and gaseous emissions.

“AAFEX II will provide essential gaseous and particulate emissions data as well as engine and aircraft systems performance data from operation of the DC-8 on a fuel produced from a renewable resource,” said Glenn’s Dan Bulzan, who leads clean energy and emissions research in NASA’s Subsonic Fixed Wing Project.

“NASA Dryden is excited to continue contributing to the study of alternative fuels for aviation use,” said Frank Cutler, NASA’s DC-8 flying laboratory project manager. “These tests will assess exhaust emissions generated by modern turbine aircraft engines using man-made fuels.”

In 2009, researchers in the AAFEX I project tested two synthetic fuels, one derived from coal and one from natural gas.

Testing is being done at a time when the U.S. military has set a goal of eventually flying its aircraft using 50 percent biofuel. The Air Force is currently engaged in certifying its fleet to operate on a 50-percent blend of the same fuel being tested in AAFEX II. Some military cargo and fighter planes already use alternative fuels.

“The use of alternative fuels, including biofuels, in aircraft is a key element for substantially reducing the impact of aviation on the environment and for reducing the dependency on foreign petroleum,” said Glenn’s Ruben Del Rosario, manager of NASA’s Subsonic Fixed Wing Project, which is conducting the tests.

The tests are funded and managed by the Fundamental Aeronautics Program of NASA’s Aeronautics Research Mission Directorate in Washington.

Michael Finneran, NASA Langley Research Center

Image Caption: NASA’s DC-8 at Dryden Flight Research Center’s Aircraft Operations Facility in Palmdale, Calif. Credit: NASA Dryden/Tom Tschida

On the Net:

42 ‘Disease Clusters’ In 13 States: Study

A new report released Monday by the Natural Resources Defense Council (NRDC) documents 42 “disease clusters” in 13 states, each of which includes incidences of numerous types of cancer, birth defects or other chronic diseases.

The study, conducted by the NRDC and the National Disease Clusters Alliance, incorporated research by governments and peer-reviewed academic studies.  The authors are calling for renewed federal support to help confirm these clusters, and to determine their causes.

“The faster we can identify such clusters, and the sooner we can figure out the causes, the better we can protect residents living in the affected communities,” said Dr. Gina Solomon of the NRDC, the study’s co-author.
 
The study examined clusters that have occurred since 1976, when Congress passed the Toxic Substance Control Act to regulate the use of toxic chemicals in industrial, commercial and consumer products.

The U.S. Centers for Disease Control and Prevention defines a cluster investigation as “a review of an unusual number, real or perceived, of health events (such as reports of cancer) grouped together in a time and location.”

The NRDC study examined clusters in Texas, California, Michigan, North Carolina, Pennsylvania, Florida, Ohio, Delaware, Louisiana, Montana, Tennessee, Missouri and Arkansas.  Additional studies are planned.

Only one of the 42 clusters, in Libby, Montana, demonstrated a specific source for chemical contamination (asbestos).  Other clusters exhibited signs that documented exposure to toxic chemicals had harmed people who lived nearby.

The NRDC said their study documented confirmed clusters of:

• Birth defects in Kettleman City, California, including twenty babies born over less than two years with birth defects, and four children born with birth defects so severe that they have since died, in this town of only 1,500 people.

• Amyotrophic lateral sclerosis (Lou Gehrig’s disease) in Herculaneum, Missouri, a town affected by a major lead smelter and decades of pollution.

• Multiple sclerosis (MS) in Wellington, Ohio, where residents are three times more likely to develop MS than in the rest of the country. MS is a disease whose causes are unknown, but are believed to involve a combination of genetic and environmental factors.

• Birth defects in Dickson, Tennessee, a striking cluster that was identified by a non-profit organization called Birth Defect Research for Children (http://www.birthdefects.org/), created by the mother of a child with birth defects, which gathers information about birth defects nationally, links families, and works with scientists to identify patterns that require investigation.

• Male breast cancer, childhood cancer, and birth defects at Camp Lejeune, North Carolina. More than 60 men who lived on that base have been diagnosed with male breast cancer, a rare and alarming finding that is almost impossible to attribute to chance alone.

The Senate Environment and Public Works Committee will conduct a hearing on Tuesday on disease clusters and environmental health.

On the Net:

Interventional Radiology Y-90 Liver Cancer-Busting Treatment: Safe, Fast, Extends Life

Multi-institutional study confirms previous findings: Treatment using yttrium-90 microspheres is safe, demonstrates an anti-tumoral effect in patients where chemotherapy failed, preserves patient quality of life and extends life, all on an outpatient basis

Interventional radiologists have been the leaders in the use of intra-arterial yttrium-90 radioembolization, since its introduction in 2000, to treat liver cancer. Now, new results from a large multi-institutional study show that treating liver tumors with higher doses of Y-90 than previously tried is safe, provides results when chemotherapies have failed, preserves the patient’s quality of life, and can be done on an outpatient basis. This study, presented by researchers at the Society of Interventional Radiology’s 36th Annual Scientific Meeting in Chicago, Ill., further validates previous findings on the safety and efficacy of liver cancer treatments using Y-90.

“We knew that this unique interventional radiology treatment, done on an outpatient basis, which combines the radioactive isotope Y-90 into microspheres (small beads about the width of five red blood cells) that deliver radiation directly to a tumor, was one of the best ways to give patients a treatment that doesn’t harm healthy cells,” explained Riad Salem, M.D., MBA, FSIR, professor of radiology, medicine and surgery, and director, interventional oncology, division of interventional radiology, department of radiology at Northwestern University in Chicago, Ill. “Now we know that patients can actually tolerate much higher doses of radiation than previously thought, which provides results in patients progressing on standard chemotherapy,” noted Salem. “While patients aren’t cured, their lives are being extended with less down time and their quality of life is improving,” he emphasized.

The four-year prospective study looked at 151 patients (the group was 55 percent male, with an average age of 64 years) with liver metastases from colorectal, neuroendocrine and other cancers. In the United States, 20,000 cases of primary liver cancer are diagnosed each year. For metastatic colon cancer, that number is 150,000 per year. “The surgical removal of liver tumors offers the best chance for a cure,” explained Salem. “For many reasons, a majority of patients are not candidates for surgical resection. Liver tumors are often inoperable because the tumors may be too large or numerous or have grown into major blood vessels or other vital structures. Historically, chemotherapy drugs become less effective as the disease progresses,” he added.

Radioembolization is a palliative, not a curative, treatment, but patients benefit by having their lives extended and experiencing fewer side effects (such as the fatigue that can last for seven to 10 days after standard cancer therapy). In this study, several subgroups showed high rates of progression-free survival: 186 days for neuroendocrine patients, for example, compared to 95 days for colorectal cancer patients. “These rates are an excellent indicator of the treatment’s effectiveness,” said Salem.

For example, a 60-year-old woman with advanced liver cancer that had metastasized from neuroendocrine tumors had lesions that were progressing as she continued standard chemotherapy treatments. Salem noted, “At the study’s higher dose, we were able to reverse the progression and achieve shrinkage of the tumors without any adverse events.” He said that more research is planned, including combining Y-90 treatments with chemotherapy and increasing and fractionating the dose.

With the Y-90 radioembolization treatment, the microspheres are injected through a catheter from the groin into the liver artery supplying the tumor. The beads become lodged within the tumor vessels where they exert their local radiation that causes cell death. This technique allows for a higher, local dose of radiation to be used, with no danger from radiation to the healthy tissue in the body, said Salem. And, he says, since Y-90 radiates from within and, since it is administered in the hepatic artery, it can be viewed as “internal” radiation.

In treating cancer patients, interventional radiologists can attack the cancer tumor from inside the body without medicating or affecting other parts of the body. Y-90 treatment adds to interventional radiology’s nonsurgical advances for liver cancer, such as delivering chemotherapy directly to the affected organ (chemoembolization), killing the tumor with heat (radiofrequency ablation) or freezing the tumor (cryoablation) to treat cancer locally. Interventional radiologists are at the forefront of patient need as they discover more ways to alleviate patient fears and provide reassurance on the safety and efficacy of these kinds of targeted, minimally invasive treatments.

“This study, at several very skilled and high profile centers, including Northwestern University, the Mayo Clinic, Johns Hopkins, Albany Medical Center and the Medical College of Wisconsin in Milwaukee, is one of the initial steps prior to other international multicenter studies,” said Salem.

On the Net:

‘Facebook Depression’ A Serious Issue Among Teens

According to a group of doctors, “Facebook depression” is a condition that may affect troubled teens who obsess over the online social network.

Dr. Gwenn O’Keeffe, a Boston-area pediatrician and lead author of new American Academy of Pediatrics social media guidelines, told The Associated Press (AP) that there are unique aspects of Facebook that can make it a tough social landscape to navigate for kids already dealing with poor self esteem.

O’Keeffe said that for kids already dealing with poor self-esteem, seeing status updates and photos of happy-looking people having great times can be more painful than sitting alone in a crowded school cafeteria. She said Facebook provides a skewed view of what is really going on, with no way to see facial expressions or read the body language that helps provide context.

The guidelines urge pediatricians to encourage parents to talk with their kids about online use and to be aware of Facebook depression, cyberbullying, sexting and other online risks.

The academy guidelines say that online harassment “can cause profound psychosocial outcomes,” including suicide.

“Facebook is where all the teens are hanging out now. It’s their corner store,” O’Keeffe said.

She said the benefits of kids using social media sites like Facebook should not be overlooked.

“A lot of what’s happening is actually very healthy, but it can go too far,” she said.

Dr. Megan Moreno, a University of Wisconsin adolescent medicine specialist who studied online social networking among college students, told AP that using Facebook can enhance feelings of social connectedness among well-adjusted kids, and have the opposite effect on those prone to depression.

Parents should not get the idea that using Facebook “is going to somehow infect their kids with depression,” she said.

On the Net:

Cookie Sales For Girl Scouts Go High-Tech

Girl Scouts in northeast Ohio are adapting to modern times in how they do business. The beret- and vest-clad girls are completing more cookie sales now that they can accept credit cards using a device called GoPayment, a free credit card reader that clips onto smartphones.

Troop leaders believe paying with plastic will increase sales in a society where carrying cash is declining rapidly. As the Girl Scouts prepare to celebrate their 100th anniversary next year, keeping pace with technology is a priority.

Marianne Love, director of business services for the Girl Scouts of Northeast Ohio, told Associated Press (AP) reporter Meghan Barr, “Normally I think a lot of customers would love to buy cookies, but they have to walk by the booth because they’re not carrying cash. I know I never carry cash when I’m out shopping.”

Pending any unforeseen issues, Love plans to allow all 2,700 troops in northeast Ohio to accept transactions via GoPayment, a division of Intuit. Ten troops in San Diego, Calif., are also testing out the device this month. “I know there’s a lot of interest across the country with other Girl Scout councils,” Love said. “So I wouldn’t be surprised if you see it everywhere this time next year.”

GoPayment is just one of several mobile payment applications that sprang into public view in 2010, with hundreds of thousands of people signing up to use them, said Todd Ablowitz, president of Double Diamond Group of Centennial, Colo., a consulting company focused on the mobile payment industry.

“Everyone from delivery drivers to Girl Scouts to baby sitters are swiping cards on their phones to take a payment. I mean, this barely existed before 2010. The numbers are staggering,” Ablowitz told Barr.

“The technology has actually existed for years, but it wasn’t until San Francisco-based Square, Inc. began offering its card readers for free that the industry really gained momentum,” Ablowitz said according to Barr.

GoPayment and other similar applications are very easy to set up. They typically charge a small fee per transaction and offer various pricing plans to customers based on sale volume. GoPayment has been on the market for about two years and charges the Girl Scouts its lowest rate, at 1.7 percent plus 15 cents per transaction. Most customers pay 2.7 percent per transaction.
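
The fee math is straightforward. The sketch below applies the rates quoted above to a hypothetical three-box sale; the $4.00 box price is an assumption, not a figure from the article.

```python
def gopayment_fee(amount, rate=0.027, per_txn=0.15):
    """Fee = a percentage of the sale plus a flat per-transaction charge."""
    return amount * rate + per_txn

sale = 3 * 4.00  # three boxes at a hypothetical $4.00 each
print(f"Standard-rate fee:   ${gopayment_fee(sale):.2f}")              # $0.47
print(f"Girl Scout-rate fee: ${gopayment_fee(sale, rate=0.017):.2f}")  # $0.35
```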

Chris Hylen, vice-president of Intuit’s payments business, explained: “We saw people that wanted to take electronic payments and just didn’t have a way to do it. It’s been the fastest-growing part of our business.”

The sale of cookies by the Girl Scouts started in 1917 in Muskogee, Okla., when Girl Scouts began baking cookies at home with their mothers, said Michelle Tompkins, spokeswoman for Girl Scouts of the USA. Commercial sales began in 1935, and cookie sales this year will net nearly $714 million.

Commercial kitchens do all the baking these days, of course, and receive a small portion of the profit. The remainder goes to local troops, who are free to use the money however they like. Some troops decide to pool their funds to travel abroad, while others donate money to charity.

Mobile payments are a natural fit for most of the girls, said Gwen Kolenich, a troop leader in Parma, a Cleveland suburb. “This is something that makes it easy because we’re now in a touch generation. So being able to offer this kind of payment method and technology to girls is right up their alley.”

Linda Bellomy bought 10 boxes, donated them to the troops, and used her credit card because she never carries cash anymore. “I gave her my card, they zipped it through, and they actually were able to key in an email address that my receipt goes to,” she said.

Elsewhere, Girl Scouts pulling their wagon from door to door encountered a problem that can’t be fixed by technology: most people weren’t home to answer the door.

An iPhone application called “USA Cookie Finder” is another recently introduced digital convenience. It uses GPS technology to pinpoint the user’s location and map out the nearest cookie sales. Users can even post cookie sale locations on Twitter and Facebook.

“When it comes to technology, I think the best way to sum up Girl Scouts is: We are where the girls are,” Tompkins said. “We listen to what they say. And when they tell us that they are on Facebook, then we go on Facebook.”

On the Net:

Biodiversity Use May Co-Exist In Tropical Forests

Local participation in forest management may simultaneously promote biodiversity and sustainable resource use for household livelihoods

Contrary to popular belief, the biodiversity of a tropical forest may be conserved while its resources are used to support local household livelihoods, according to a new study published in the March 25 issue of Science. But biodiversity and resource use are most likely to successfully co-exist in forests that are managed under systems that receive inputs from local forest users or local communities.

These study results imply that one important way for governments to simultaneously promote biodiversity and forest-based livelihoods is to formalize the rights of local people to contribute to rulemaking on the management and use of local forests.

This study, which was partially funded by the National Science Foundation, was conducted by a team led by Lauren Persha of the University of Michigan.

Forest policy decentralization reforms that transfer ownership and management responsibilities to local forest user organizations have already been introduced in more than two-thirds of the developing world. However, this approach’s effectiveness has been questioned because of its potential to enable elites to dominate resource use and because of potential weaknesses in links between local decision-makers and larger governing bodies.

But despite such criticism, the Persha team found that forest management inputs from local forest users and communities may promote the growth of forests that are biologically diverse and support local household livelihoods. The researchers attribute the dual success of local inputs to their potential for generating rules that support sustainability and accommodate specific local forest conditions. Such rules thereby help foster forests that support local livelihoods over the long term and so gain legitimacy and relevance.

Study results also indicate that larger tropical forests are more likely to simultaneously support biodiversity and forest-based livelihoods than are smaller tropical forests. But for a given forest size, the probability of achieving such dual success is higher in forests where local forest users or local communities maintain a formal role in management.

Nevertheless, the researchers say that their findings are particularly relevant for small forest patches in human-dominated landscapes, which–when supporting local livelihoods–face the most difficult conservation challenges.

The results of the study challenge some scholarly research that has depicted the conservation of tropical forests and resource use to support livelihoods as mutually exclusive. However, these previous studies tended to focus on either social outcomes or ecological outcomes–not on these two potential outcomes together.

By contrast, this study is the first to identify which social, ecological and governance factors simultaneously promote biodiversity and forest-based livelihoods. Also, the study, which is based on data from 84 sites in six countries in East Africa and South Asia, is one of only a few that identify factors promoting biodiversity or forest-based livelihoods across multiple countries, rather than just in specific locations and contexts.

“One thing that is clear is that overcoming forest governance challenges is central to maintaining the diverse benefit flows of tropical forest,” said Persha. “The effort involved in funding and collecting data on the scale of this research, across so many sites and countries, is substantial–but also vital for generating a more solid evidence base to help decision-makers construct better policies for forest sustainability to meet multiple social and ecological goals.”

Team member Arun Agrawal of the University of Michigan added: “Interdisciplinary research requires giving up entrenched disciplinary biases. We are glad to see the most prestigious research journals in the world recognizing the need for such research and making it possible to pursue such work.”

“This study illustrates how research on coupled natural-human systems can inform governance policies for land use and resource management that enhance both ecological and economic sustainability,” said Alan Tessier, an NSF program director.

“This article builds on research supported by a diverse set of NSF programs,” noted Thomas Baerwald, another NSF program director. “And it demonstrates how increased knowledge about the complex interactions between people and the natural environment can help address societally significant problems.”

It is vitally important to find ways to conserve tropical forests because over one billion people depend on them for their livelihoods. In addition, tropical forests currently store more than 500 billion tons of carbon–more than all of the carbon that is currently stored in the atmosphere. If these tropical forests were lost, their vast stores of carbon would be released into the atmosphere and potentially impact climate in significant ways.

On the Net:

Bats Keep Separate Households

The use of different environments by males and females in the parti-colored bat makes population estimation and thereby the conservation of the species more difficult

The use of different resources by males and females complicates the estimation of population sizes. However, monitoring population sizes, particularly for rare and threatened species, is pivotal to quick and effective conservation action. Scientists from the Max Planck Institute for Ornithology in Radolfzell investigated the ecological niches of male and female parti-colored bats (Vespertilio murinus) and discovered that the sexes use entirely different foraging grounds. Their findings demonstrate that a finer-grained view of what different demographic subsets of a species do is essential for correct estimation of population trends, with important implications for conservation action plans.

Reliable knowledge of population sizes and changes in them, often obtained by field surveys, is essential for conservation. Differences in behavior between demographic subsets of a species, for example males and females, can lead to differences in resource use, such as in diet or roost use. These differences can lead to specialization and ultimately translate into spatial segregation within species. Reliable estimates of population sizes, however, are much hampered by sexual segregation, and for threatened and rare species, monitoring of population trends is essential for fast and appropriate conservation action.

Scientists from the Max Planck Institute for Ornithology in Radolfzell and their collaborators at the Swiss bat conservation center now propose a novel way to obtain better estimates of population size for sexually segregating species.

They investigated the parti-colored bat (Vespertilio murinus) in Switzerland. Although the distribution of this species stretches over a vast area reaching from the Netherlands all the way to China, the species is rare in Western Europe. Males and females are barely different in size and fur coloration and identical in their preferences for day roosts, yet at a closer glance they are fundamentally different. As is so often the case among mammals, female parti-colored bats carry the full load of parental care, with no support whatsoever from males in raising their twin pups. Twinning is exceptional among bats and presumably imposes even higher energetic costs on the females. These differences in investment between the sexes, the scientists argue, result in different tolerances among males and females for the quality of their foraging areas and the amount of prey they need to sustain themselves, leading to a segregation of the sexes.

Using radio telemetry data from male and female parti-colored bats in conjunction with environmental data, the scientists modeled the ecological niches of each sex within Switzerland. This approach allowed them not only to compare the amount of suitable habitat available to each sex by generating so-called habitat suitability maps, but also to estimate and compare the degree of ecological specialization and the overlap between the sexes. “Female parti-colored bats seem to be highly specialized and rely heavily on lake shores for their foraging activities,” says Mariëlle van Toor from the Max Planck Institute for Ornithology. “Male parti-colored bats, albeit also highly specialized, use a different and broader range of resources than females, such as rivers, cities, and agricultural areas. The maps, which showed no spatial overlap of foraging grounds between the sexes, revealed that suitable habitat is almost three times more abundant for males than for females. From an ecological perspective, males and females behave like two different species,” van Toor explains.
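
In outline, a habitat suitability model of this kind can be sketched as follows (a minimal, hypothetical Python illustration, not the authors’ actual code or data): telemetry fixes serve as “presence” points, random background points represent the habitat available, and a classifier maps environmental covariates to a suitability probability for each map cell.

    # Minimal sketch of a habitat suitability model. Telemetry "presence"
    # points and random "background" points are each described by
    # environmental covariates; the covariates and all values here are
    # invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Covariates per point: [distance to lake (km), urban cover fraction]
    presence = rng.normal([1.0, 0.10], [0.5, 0.05], size=(200, 2))     # foraging fixes
    background = rng.uniform([0.0, 0.0], [20.0, 1.0], size=(1000, 2))  # available habitat

    X = np.vstack([presence, background])
    y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

    model = LogisticRegression().fit(X, y)

    # Predict suitability for every cell of a (toy) map grid, then compare
    # the area above a threshold between, e.g., male and female models.
    grid = np.column_stack([np.linspace(0, 20, 500), np.full(500, 0.2)])
    suitability = model.predict_proba(grid)[:, 1]
    print(f"Share of cells deemed suitable: {(suitability > 0.5).mean():.2f}")

Fitting one such model per sex and comparing the resulting suitability maps is, in spirit, how a difference like the threefold gap in suitable habitat between males and females can be quantified.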

The study suggests that within the parti-colored bat species, females are more vulnerable to habitat change and that conservation action has to pay special attention to their needs. More importantly, monitoring efforts should take these differences into account. This may pose a particular problem for bat surveys, since the most widely used method, acoustic surveying, relies on counts of the echolocation calls bats emit for orientation during flight and cannot distinguish between males and females in the field. With the models presented in the study, probabilities of sex-specific habitat use can be attached to acoustic records based on where the recordings were made, making it possible to estimate the resulting bias.

Van Toor concludes: “This study shows that in species where the sexes segregate, it is not necessarily the typical, most common habitat that should be regarded as the vital resource in need of protection, but rather the habitat of the more specialized and thus more vulnerable demographic subsets of the entire species pool.” For the parti-colored bats in Switzerland, the availability of aquatic ecosystems such as lakes and marshes is essential to ensure that reproductive females find enough prey in the future.

Reference: M.L. van Toor, C. Jaberg, K. Safi. Integrating sex-specific habitat use for conservation using habitat suitability models. Animal Conservation, March 24, 2011. DOI: 10.1111/j.1469-1795.2011.00454.x

Image 1: Separate homesteads: scientists from the MPI for Ornithology investigating the ecological niches of parti-colored bats have discovered that males and females use entirely different foraging grounds. This means that from an ecological perspective, males and females behave like two different species. © Max Planck Institute for Ornithology

Image 2: Reliable estimates of bat population sizes, which are essential for species conservation, are hampered by sexual segregation. Scientists are now proposing novel ways to obtain more accurate data by using radio telemetry data in conjunction with environmental data to create so-called “habitat suitability maps”. Their findings suggest that it is not necessarily the typical habitat that should be regarded as the resource needing protection, but rather that of the more specialized demographic subset of the species pool. © Max Planck Institute for Ornithology

On the Net:

Molecular Combination Activates Immune System Against Cancer

A new combination molecule, iMyD88/CD40, acts as a molecular “master switch” to turn on dendritic cells (immune system cells), improving the effectiveness of tumor vaccines, said a consortium of researchers led by Baylor College of Medicine in a report that appears in The Journal of Clinical Investigation.

To accomplish this improved antitumor response, Dr. David Spencer, vice chair of pathology and immunology at BCM, and his colleagues fused the “universal” adapter molecule, MyD88, to CD40, a molecule that further activates dendritic cells when they reach the lymph nodes. MyD88 triggers the initial “danger” alarm that tells the immune system an infection has occurred.

Safer, more potent immunological treatments

To endow this chimeric protein with an on/off switch, Spencer and his colleagues fused it to a third human-derived protein domain (a part of the protein molecule that can develop a function of its own) called FKBP12. FKBP12 has a high affinity for the synthetic molecule AP1903, which dimerizes, or connects, two similar molecular subunits (e.g., MyD88/CD40), called monomers, that have been modified to contain the dimerizer-binding domain. Upon administration of AP1903, the two key signaling domains are multimerized, becoming a protein complex made up of similar molecules. This activity simulates normal activation mechanisms.

This development points the way to safer and more potent immunological treatments for a variety of cancers, Spencer said.

“We are just one step away from a vaccine that targets dendritic cells in living organisms,” said Spencer, who is also a member of the Dan L. Duncan Cancer Center at BCM. “This work illuminates the path toward a truly ‘off-the-shelf’ vaccine. This could lead to more effective cancer vaccines with fewer side effects.”

Previous work used chemicals called adjuvants as a “danger signal” to alert the protective dendritic cells to the presence of tumors. However, to avoid systemic effects, these adjuvants need to be carefully removed prior to administration, said Spencer.

Result: Control of aggressive tumors in mice

MyD88 replaces the adjuvant signal and thus eliminates adjuvant toxicity. When activated by the drug AP1903, the “inducible” iMyD88/CD40 fusion protein leads to increased levels of interleukin-12 and other factors secreted by the immune system.

This ultimately increases the number of killer T-cells while activating dendritic cells in lymph nodes.

“This resulted in the elimination or control of pre-established, aggressive tumors in mice,” said Spencer. Tumor vaccines are promising cancer treatments that seek to arouse the body’s own immune defenses against cancer. However, they have not proven long lasting or potent enough to eliminate tumors in most studies. Spencer and his colleagues seek to correct those problems.

Others who took part in this work include Priyadharshini Narayanan, Dr. Natalia Lapteva, Mamatha Seethammagari and Dr. Jonathan M. Levitt, all of BCM, and Dr. Kevin M. Slawin with the Vanguard Urologic Institute and The University of Texas Health Science Center at Houston. Seethammagari is also with the Diana Helis Henry Medical Research Foundation in New Orleans. Spencer holds the Roger D. Rossen, M.D. Endowed Professorship.

Funding for this work came from the Diana Helis Henry Medical Research Foundation, the National Institutes of Health and the Vanguard Urologic Research Foundation. Clinical applications of this vaccine and other cell technologies are being developed by Bellicum Pharmaceuticals, a privately held biotherapeutics company co-founded by Slawin and Spencer.

On the Net:

All About Labor Induction

Labor induction is the process of giving an artificial start to birth through medical intervention or other methods. When an induction is not performed for emergency or other medical reasons, it is considered elective. Elective inductions have increased in recent years because of their convenience and because they easily accommodate busy schedules.

The American College of Obstetricians and Gynecologists, however, says that labor should be induced only when it is riskier for the baby to remain in the mother’s uterus than to be born.

There are several reasons why labor induction may be performed. These include:

* Pregnancy lasting more than 42 weeks. After 42 weeks, the placenta often no longer functions well enough for the baby to receive adequate nutrition and oxygen.

* Pregnancy lasting more than 38 weeks when having twins.

* The pregnant woman has high blood pressure caused by pregnancy.

* The pregnant woman has an infection in her womb.

* The woman’s water has broken, but contractions have not begun.

* The woman has health problems, such as diabetes.

* There are health risks to the woman if pregnancy is continued.

* A growth problem is causing the baby to be too small or too big.

* Intrauterine fetal growth retardation (IUGR).

* Premature rupture of the membranes (PROM). This occurs when the membranes have ruptured, but labor does not start within a certain amount of time.

* Premature termination or abortion.

* Fetal death.

If an induction causes complications, a Caesarean section is almost always performed instead. An induction is most likely to succeed when a woman is close to or in the early stages of labor. Signs of impending labor include softening of the cervix, dilation, and increasing frequency or intensity of contractions. The Bishop score may be used to assess how suitable induction would be.

The Bishop score, which is also used to assess the odds of spontaneous preterm delivery, identifies the patients most likely to achieve a successful induction. The duration of labor is inversely correlated with the Bishop score; a score above 8 describes the patient most likely to achieve a successful vaginal birth. Scores below 6 usually call for a cervical ripening method before other induction methods are tried.
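
As an informal sketch of how the tally works (component points here follow the commonly published scoring table; this is an illustration, not medical guidance):

    # Informal sketch of summing a Bishop score from its five components.
    # Points per component follow the commonly published table: dilation,
    # effacement and station score 0-3; consistency and position score 0-2.
    def bishop_score(dilation, effacement, station, consistency, position):
        """Sum the points already assigned to each cervical exam component."""
        return dilation + effacement + station + consistency + position

    score = bishop_score(dilation=2, effacement=3, station=2,
                         consistency=1, position=1)  # totals 9

    if score > 8:
        print(f"Score {score}: successful vaginal birth most likely")
    elif score < 6:
        print(f"Score {score}: cervical ripening usually needed first")
    else:
        print(f"Score {score}: intermediate likelihood of success")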

Induction Methods

Use of medication is a common method in labor induction.

* Intravaginal, endocervical or extra-amniotic administration of a prostaglandin, such as dinoprostone or misoprostol. Extra-amniotic administration has appeared more effective than intravaginal or endocervical administration in the few controlled studies that have been done.

* Intravenous administration of synthetic oxytocin preparations, such as Pitocin.

* Natural Induction. Natural induction includes the use of herbs, castor oil or other medically unconventional agents to stimulate or advance a stalled labor.

* Mifepristone use has been described.

* Relaxin has been studied, but is not a commonly used medication.

There are also other processes and methods for inducing labor besides the use of medication.

* Stripping the membranes (separating the amniotic sac from the wall of the uterus): The amniotic sac is the lining inside the uterus that contains the baby. The doctor gently puts a gloved finger through the woman’s cervix. Using the finger, the doctor separates the sac from the uterine wall. The woman may feel some cramping or spotting with this method.

* Ripening the cervix: The doctor places a small tablet or suppository in the vagina up against the cervix. This helps to soften and thin the cervix. After receiving the suppository, the woman may start to have gentle contractions.

* Nipple Stimulation: This is a natural form of labor induction that can be done manually or with an electric breastfeeding pump. The hormone oxytocin will naturally be produced to cause contractions. The concept works the same as when a baby nurses right after birth, stimulating contractions, which slows the bleeding.

* Artificial rupture of the membrane (AROM): When the amniotic sac breaks or ruptures, production of the hormone prostaglandin increases, speeding up contractions. A doctor may suggest rupturing the amniotic sac artificially. A sterile, plastic hook is brushed against the membrane just inside the cervix. The baby’s head will move down against the cervix, which usually causes the contractions to become stronger. This method releases a gush of warm amniotic fluid from the vagina.

AROM has advantages and disadvantages.

Advantages include shortening labor by an hour or so, allowing the amniotic fluid to be examined for the presence of meconium (which can be a sign of fetal distress), and giving doctors direct access to the baby’s scalp for heart rate monitoring.

Disadvantages include the possibility of the baby turning to a breech position, making birth more difficult if the membranes are ruptured before the baby’s head is engaged, and the possibility of the umbilical cord slipping out before the baby. Infection can also occur if too much time passes between the rupture and the birth.

When to Induce

Until recently, the most common practice has been to induce labor by the end of the 42nd week of pregnancy. While this practice is still very common, recent studies have shown an increasing risk of infant mortality for births in their 41st and 42nd week of gestation, as well as higher risk of injury to the mother and child. The recommended date for induction of labor has now been moved to the end of the 41st week of gestation in many countries including Canada and Sweden.

Risks of Induction

Like any medical procedure, labor induction has potential side effects and health risks to both the mother and the child. Some common ones include:

* Oxytocin can make contractions quite strong and lower the baby’s heart rate. Throughout the induction process, it is important for the baby’s heart rate to be monitored. Adjusting the dosage can moderate the strength of the contractions and reduce the effect on the baby’s heart.

* Women who have inductions are at an increased risk of having an infection, and so are their babies.

* The umbilical cord may slip out into the vagina before the baby does. This is more likely to occur if the baby is breech. Also, the cord may become compressed, decreasing the baby’s oxygen supply.

* Sometimes the treatment does not work properly, and the mother has to have an emergency cesarean delivery.

A less common complication with induction is uterine rupture, which can cause severe bleeding. Women who have previously had a C-section are at an increased risk of uterine rupture, as cesarean deliveries leave a scar in the uterus.

There is also a risk of babies being born “late preterm.” Inductions may contribute to the growing number of “late preterm” births between 34 and 36 weeks gestation. While babies born at this time are usually considered healthy, they are more likely to have medical problems than babies born a few weeks later at full term (37-42 weeks).

While induction has risks, it is sometimes needed to protect the health of the mother and the baby. The pregnant woman needs to understand both benefits and risks of labor induction.

Expectations with Induction

In most cases, labor induction goes well, and the woman can deliver her baby through the birth canal normally. An induction can take anywhere from two or three hours to as long as two or three days, depending on how the woman’s body responds to the treatment she is receiving. An induction may take longer if the woman is pregnant for the first time or if the baby is not full term.

Every pregnancy is different. Having an induction is not a sign of failure, and it may be the best thing for the health of both the baby and the mother. Medicines used to induce labor may upset a woman’s stomach, so it is normally recommended that she eat lightly before going to the hospital; foods such as Jell-O and soup are good choices. The medicines may also cause strong contractions, and the woman should know that she can always ask for help with her pain.

Criticism of Induction

Because induced labor tends to be more intense and painful for women, it can lead to increased use of analgesics and other pain-relieving medications. These medications have been said to increase the likelihood that the delivery will end in a cesarean section.

However, studies into the issue indicate that labor induction has no effect on the rates of cesarean deliveries. Two recent studies have shown that induction may increase the risk of C-section if performed before the 40th week of gestation, but it has no effect or actually lowers the risk if performed after the 40th week of gestation.

At least one study has indicated that cesarean delivery rates increase with induction. Research published in the Journal of Perinatal and Neonatal Nursing showed induction increased a woman’s likelihood of having a C-section by two to three times.

Menthol Cigarettes Not More Harmful Than Regular Ones

On March 23, the Journal of the National Cancer Institute published findings on the effects of mentholated cigarettes on developing lung cancer in black and white smokers.

The researchers expected the study to support the idea that mentholated cigarettes are more toxic to smokers, but were surprised to find that the risk of lung cancer was no higher in menthol smokers than in non-menthol smokers.

In the study, William Blot of the Vanderbilt-Ingram Cancer Center in Nashville, Tennessee says, “In fact, it was a bit lower in mentholated compared to non-mentholated smokers and there was no significant difference in the rate of quitting smoking.”

Blot told Reuters via telephone: “It was about 30 percent lower,” which is “statistically significant,” concluding that menthol cigarettes are not more toxic than regular ones.

Blot and his colleagues led a prospective study among more than 85,000 people enrolled in the Southern Community Cohort Study – a large ongoing multiracial study conducted in 12 states. Within the cohort, the researchers identified 440 lung cancer patients to be compared with 2,213 matched controls. The control group consisted of “other people in the study with the same demographics, such as race, age and sex, but without lung cancer,” reports the Journal of the National Cancer Institute.

The study concluded that menthol cigarettes may be less harmful than non-mentholated cigarettes. Among the participants, menthol smokers showed lower lung cancer development and fewer deaths than regular cigarette smokers.

For example, menthol smokers of 20 or more cigarettes a day were 12 times more likely to develop lung cancer, while non-menthol smokers of the same number of cigarettes per day were 21 times more likely to have the disease. Both figures are relative to people who have never smoked a cigarette.

The study also found that menthol cigarettes are not more addictive; in fact, it suggested that menthol smokers smoke fewer cigarettes than regular cigarette smokers.

“Our data indicated there is no evidence that menthol smokers have a harder time quitting smoking,” says Blot.

Menthol cigarettes have been growing popular with adolescents, with the highest use among newer and younger smokers, reports Reuters. Blot’s study looked at older smokers; therefore, he said, it could not draw any conclusions about whether menthol makes it easier for younger smokers to tolerate smoking.

It has been argued by health advocates that menthol flavoring covers up the harshness of tobacco and makes it easier to start smoking and harder to quit. Manufacturers of menthol cigarettes have always defended menthol, saying it does not make a cigarette more harmful or addictive.

The U.S. Food and Drug Administration is currently considering whether to ban or regulate mentholated cigarettes, and the study could hardly be more timely.

“Cigarette smoking remains the leading cause of premature death in the United States, but undue emphasis on reduction of menthol relative to other cigarettes may distract from the ultimate health prevention message that smoking of any cigarettes is injurious to health,” the study states.

On the Net:

Arthritis Drug Could Thwart Melanoma: Study

A breakthrough discovery may offer an effective new treatment for melanoma, one of the deadliest forms of skin cancer, scientists said on Wednesday.

The researchers found that leflunomide – a drug commonly used to treat rheumatoid arthritis – also inhibits the growth of malignant melanoma. Furthermore, when leflunomide was used in combination with PLX4720, a promising new melanoma therapy currently undergoing clinical trials, the effect was even more powerful – leading to a nearly complete block of tumor growth.

Leflunomide was initially sold under the brand name Arava by Sanofi-Aventis, but is now sold generically.

If further trials prove successful — both for leflunomide alone and in combination with Roche and Plexxikon’s PLX4032 — patients could have access to new treatments in three to five years, said the scientists from Britain’s University of East Anglia (UEA) and Children’s Hospital Boston.

Japanese drugmaker Daiichi Sankyo acquired Plexxikon for $805 million in March.

“This is a really exciting discovery — making use of an existing drug specifically to target melanoma,” said researcher Grant Wheeler of the UEA.

Melanoma is a cancer of the pigment cells in our skin, and is the most aggressive form of skin cancer.  If caught early, surgery can be used to safely remove the tumor.  However, if the cancer returns and spreads there are virtually no good alternative treatments and the chances of survival are very low. Conventional chemotherapy typically works in only 10 to 20 percent of these cases.

Unlike most other cancers, the incidence of melanoma is increasing.

In the current research, scientists began examining the effects of various chemical compounds on pigments in frogs and zebrafish in a screening process.

“Cancer is a disease not only of genetic mutations, but also one determined by the identity of the cell in which the tumor arises. By studying cancer development in zebrafish and frogs, we gain a unique insight into the very earliest changes that occur in those cells,” said lead author Dr. Richard White of Children’s Hospital Boston and Harvard Medical School.

The results of the screening process eventually led the researchers to test the drug leflunomide in mice engrafted with human melanoma cells.

After observing that the drug slowed the growth of these tumors, they combined it with Roche and Plexxikon’s drug — a so-called BRAF inhibitor known as PLX4720.

“We thought combining that drug, which targets a specific oncogenic mutation, with leflunomide, which changes the cell’s lineage, could have a beneficial effect,” said Leonard Zon, one of the research team leaders from Children’s Hospital Boston.

The results proved Zon correct. Compared with each drug alone, the combination led to a dramatic decrease in melanoma, and even with low doses of each drug the tumors went away completely in 40 percent of the mice, the researchers said.

The results mean that human trials could start within the next six months, since the safety of both drugs has already been established, the researchers said.

“Deaths from melanoma skin cancer are increasing and there’s a desperate need for new, more effective treatments,” Wheeler told Reuters.

“If this does what we think it will and we find patients are responding to it in the trials, then it could be being used in patients in three to five years.”

Interestingly, some of the genes that leflunomide pauses are those controlled by myc, a well-known oncogene implicated in several cancer types, the researchers said.

The studies appear in the March 24, 2011, issue of the journal Nature. 

On the Net:

US Man Receives Full Face Transplant

A power line accident left a 25-year-old man horribly disfigured, but now he has successfully received the nation’s first full face transplant, various media outlets are reporting.

Dallas Wiens underwent over 15 hours of surgery by a team of 30 doctors, nurses and other staff at Brigham and Women’s Hospital in Boston. He received a new nose, lips, skin, muscle and nerves from an organ donor. The match was based on gender, race, age and blood type, said the hospital.

Wiens was injured in November 2008 while working in a cherry picker. His head touched a high-voltage electrical wire, and his face was practically erased.

The transplant will allow him to smile again and feel his daughter’s kisses. Wiens told the Associated Press (AP) that his daughter and his faith kept him going. She will turn 4 next month.

“Daddy has a boo boo, but God and the doctors are making Daddy’s boo boo all better,” Wiens quoted his daughter as saying.

“Dallas always said after the injury that he now had a choice: he could just choose to get bitter, or choose to get better. His choice was to get better. Thank God today he’s better,” said Wiens’s grandfather, Del Peterson.

The $300,000 operation was paid for by the U.S. Department of Defense as part of its research to help severely wounded service personnel. A total of $3.4 million was granted for five transplants, the AP reports.

Surgeons in the U.S., France, Spain and China have performed about a dozen face transplants. Wiens’s was the third in the U.S. and the Boston hospital’s second. Connie Culp became the first U.S. recipient of a partial face transplant in 2008, followed by James Maki in April 2009, who was injured in a freak accident when he fell on the electrified rail in a Boston subway station.

Charla Nash, the Connecticut woman who was left disfigured and blind by a friend’s 200-pound chimpanzee, is on the transplant waiting list, says plastic surgeon Dr. Bohdan Pomahac.

Pomahac and his team reported that the prognosis was good for the patients who have undergone these groundbreaking operations.

Wiens’ grandfather said that his grandson hopes to become an advocate for facial donations. Peterson also thanked the donor family: “You will forever remain in our hearts and our prayers and we are grateful for your selflessness.”

On the Net:

Synthetic Drug Kills Teen, Injures More

Despite Minnesota lawmakers seeking a ban on chemicals used in common synthetic drugs, revelers at a spring break party in a Minneapolis suburb overdosed this week on one of the risky substances, leaving one dead, officials told Reuters on Sunday.

Iowa Sen. Charles Grassley just two days ago proposed a similar ban on chemicals used in such synthetic drugs as Spice and K2, forms of synthetic marijuana.

Trevor Robinson, 19, died of an overdose at the spring break party while ten other people ages 16 to 22 were hospitalized, said Anoka County, Minn. Police Detective Larry Johnson.

Timothy Lamere, 21, was arrested on suspicion of third-degree murder Friday, The Minneapolis Star Tribune reported. Lamere provided his friends with 2C-E, which is not listed as a controlled substance, according to police officials. A Drug Enforcement Administration spokesman, however, told UPI it is illegal under the Federal Analog Act, since it is similar to the outlawed 2C-B.

One of the hospitalized, Jake Kruse, 19, explained that the drug “hits you hard right away and then hits you again 20 minutes after that.”

Kruse said Robinson’s “eyes were fluttering, his arms flailing. He punched a hole in the wall. He had no control.” Partygoers attempted CPR, then mouth-to-mouth resuscitation, and finally transported him to Unity Hospital in Fridley. He was removed from life support Thursday afternoon.

2C-E, combined with other drugs, including alcohol, can be fatal, said Carol Falkowski, a drug abuse official of the Minnesota Department of Human Services.

These drugs are readily available through internet sales, and their evolving chemical complexity and growing popularity make regulation difficult, Drug Enforcement Administration spokesman Rusty Payne told Reuters.

“This stuff comes in from other countries, and people just have no idea what they are getting. We have chemists and forensic scientists constantly evaluating chemicals on an ongoing basis, but there hasn’t been any federal legislation passed on those substances yet,” Payne explained.

Image Caption: 20mg Capsules of 2C-E. Credit: Wikipedia 

On the Net:

Using Lambs To Weigh-In On Obesity

(Ivanhoe Newswire)– The results of a new study on obesity in sheep and lambs may lend a hand in understanding obesity in humans. The new study is the first to show a definite link between obesity in a mother and obesity in her child in mammals that bear “mature offspring.” Since humans also bear mature offspring, the results of this study are the first that can possibly be applied to humans.

While the relationship between maternal obesity and offspring obesity has been demonstrated in rats, rats do not bear mature offspring – their young are born immature – and therefore a link between maternal and offspring obesity in rats cannot be applied to humans.

However, in the new study, by researchers at the Center for Pregnancy and Newborn Research at the University of Texas and the University of Wyoming, lambs were used for the research. Scientists regulated the diets of ewes for 60 days before conception and throughout their pregnancy: half were fed a normal diet, and half received a diet designed to produce obesity. After the ewes gave birth, the scientists monitored the appetite and weight gain of their offspring for 19 months.

One area in which the newborn lambs were monitored was hormone levels. Researchers took blood samples from the lambs and tested leptin levels. Leptin is a hormone that regulates appetite and is produced by fat cells. In lambs born to obese mothers, scientists saw no peak in leptin levels, but in lambs born to mothers with normal body weight, leptin levels peaked six to nine days after birth.

After analyzing blood samples from one-day-old lambs, scientists observed higher cortisol levels in the lambs born to obese ewes. The scientists now suspect exposure to high levels of cortisol in the womb may be the reason leptin levels did not peak in lambs of obese mothers.

Professor Peter Nathanielsz, lead author of the research, was quoted as saying, “Given the epidemic of obesity both in the developed and developing world, the search for environmental factors occurring around the time of birth which predispose the offspring of overweight mothers to lifelong obesity is important… Seeing these hormonal changes in lambs… is advancing our understanding of what programs appetite. We are getting closer to understanding what causes obesity in humans.”

SOURCE: Journal of Physiology, Published March 15, 2011

US Reports First Case Of AIDS From Live Organ Donor

A transplant patient in New York has contracted the AIDS virus from the kidney of a living donor, the nation’s first case of transmission from a living organ donor since screening for HIV began in the mid-1980s, the New York State Department of Health said on Thursday.

The donor had unprotected gay sex in the 11 weeks between the time he tested negative for the AIDS virus and the time the surgery took place in 2009, according to a report issued Thursday by the U.S. Centers for Disease Control and Prevention.

The health agency said it recommends that organ donors have repeat HIV tests one week before surgery.

“The most sensitive test needs to be done as close as possible to the time of transplant,” said Dr. Colin Shepard, who manages the tracking of HIV cases for the New York City Health Department.

The CDC also said potential organ donors should be advised to avoid behavior that can increase their chances of infection.

Although living organ donors in the U.S. are routinely tested for infectious diseases such as hepatitis and HIV, the organization that oversees organ transplants does not have a clear policy on when such screening should occur.  Instead, that decision is left to the transplant centers.

Patient confidentiality rules prevented health officials from releasing many details about the donor, recipient, their relationship or the hospital where the transplant took place, except to say that it was in New York.

Neither the donor nor the recipient was aware of their infections until a year after the transplant, the CDC report said.

Health officials said that the transplant recipient developed AIDS, perhaps due to drugs taken to suppress the immune system to prevent organ rejection, while the donor did not.

Both are receiving treatment.

“We don’t know how frequently this is happening and we need better surveillance,” said Dr. Matthew Kuehnert, co-author of the CDC report.

HIV infections in a donor or recipient may not be discovered until long after the transplant occurred.  In this case, once health authorities were notified of the AIDS infection late last year, they spent months investigating whether the transplanted kidney was the source.  Genetic analysis of the virus confirmed that the infection had originated in the donor.

There has been one confirmed report since the 1980s of a deceased donor’s organs spreading the AIDS virus to a recipient.  That case involved organs from a 38-year-old gay man that went to four recipients in 2007.

For years, transplant organizations focused mainly on screening organs taken from deceased donors, which accounted for the vast majority of transplants. But kidney transplants from live donors are becoming increasingly common. Indeed, while just 32 percent of kidney transplants came from live donors in 1988, that number had grown to more than 46 percent by last year, according to federal government data. Donors are typically relatives or friends.

Some 88,000 people are currently on the kidney waiting list, according to the United Network for Organ Sharing, a nonprofit group that oversees the United States’ organ transplant system for the federal government.

The organization is currently developing new policies for live donors, according to spokesman Joel Newman.

Transplant centers employ teams that evaluate potential donors and search for physical or psychological problems. 

CDC officials recommend a more sensitive HIV test be used, which can detect the virus within 10 days after a person is first infected.   However, the traditional test, which does not detect HIV antibodies until three to eight weeks after infection, is more commonly used.

The CDC report can be viewed at http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6010a1.htm?s_cid=mm6010a1_w

Netflix Looks To Gain Rights To Spacey TV Series

Netflix is reportedly looking to purchase the Internet streaming rights to an original television series starring Kevin Spacey, in a move that would put the video streaming service in competition with cable television.

According to online magazine Deadline.com, Netflix outbid HBO and AMC for the rights to stream “House of Cards,” a 26-episode drama from David Fincher, the director of last year’s “The Social Network.”

An anonymous industry source familiar with the negotiations confirmed Netflix’s interest in the TV series.

“They’re sort of taking a page out of the playbook that HBO and Showtime used,” Sam Craig, director of the Entertainment, Media and Technology program at NYU Stern School of Business, told ABC News.

If the deal goes through, it would be an all-new venture for Netflix, which started out as a video subscription service before making the transition to its popular streaming service. Netflix has more than 20,000 titles in its streaming library, but the majority of them are previously aired TV shows and older movies.

“By far, the majority of our delivery and our service is now streaming,” Netflix co-founder Reed Hastings told ABC’s Nightline in January. “Streaming is everything.”

A right-to-stream victory for “House of Cards,” which would allow Netflix to offer it before the series debuts on cable television, would mark Netflix as an even bigger threat to pay-TV channels such as HBO and Starz. Some analysts believe that Netflix, with 20 million-plus subscribers, could be as big as HBO by the end of this year. HBO has about 28 million subscribers.

HBO began its operations in the late 70s and showed only full-length movies. But as other pay-TV competitors began following in HBO’s footsteps, the network made a move into original programming to set itself apart from its rivals.

Now Netflix is looking to set itself apart from its rivals in much the same way as HBO. While it dominates the streaming market, delivering 3 out of every 5 movies in the first two months of the new year according to NPD group, a number of competitors have begun offering their own streaming services, including Amazon Prime and Hulu’s premium service.

“Netflix has sort of gone to the place where streaming and delivery of traditional DVD’s is pretty much a commodity and they’re getting a lot of competition. So I think it’s essential that they develop distinctive content,” said Craig.

According to Deadline.com, Netflix has committed to streaming two seasons of “House of Cards” at a cost of more than $100 million. If the figures are correct, it would be an unconventional departure from the age-old Hollywood practice of airing a pilot episode before committing to a series.

Bob Thompson, a professor of popular culture at Syracuse University, said “House of Cards” has some serious potential to become a milestone for a streaming service that is still in its infancy. It could easily be compared to “Amos ’n’ Andy,” which helped popularize radio in the 1920s, and Milton Berle’s “Texaco Star Theater,” which drove television sales in the 1950s.

“I suspect that great series are eventually going to be distributed by these alternate systems,” Thompson told ABC, adding that streaming original programming is likely to be an inevitable move. “Whether ‘House of Cards’ is really going to be the one that makes the breakthrough, we’ll have to see how good it is.”

Netflix has become a growing threat to long-established TV services because households are increasingly reducing or canceling their cable TV subscriptions as they find other ways to get their daily dose of video viewing, such as streaming over high-speed Internet.

HBO has in the past refused to sell streaming rights of its original programming to Netflix because it felt the extra money from licensing fees wouldn’t offset the cost of helping a rival company.

Netflix’s biggest streaming deal to date was last year’s agreement to pay nearly $1 billion over five years for the first rights to show movies and television shows from pay-TV channel Epix, which is jointly owned by Viacom Inc., Metro-Goldwyn-Mayer Inc. and Lions Gate Entertainment Corp.

Netflix spent more than $400 million on Internet streaming rights last year, six times what it had spent in 2009.

Media Rights Capital II LP is the production company backing the “House of Cards” series.

On the Net:

Fire At Japanese Nuke Plant, Spike In Radiation

Japanese Earthquake And Tsunami Updates

UPDATE: Wednesday, 16 March 2011 6:10am CST

Japanese authorities have been desperately trying to prevent an environmental catastrophe at the crippled Fukushima Dai-ichi nuclear complex some 140 miles north of Tokyo. 

Workers at the facility have been scrambling to keep the water covering the reactors’ radioactive cores from evaporating, which would lead to overheating and possibly a dangerous meltdown.

The tsunami that hit Japan following the devastating magnitude 9.0 earthquake knocked out the backup diesel generators required to cool the facility’s fuel rods, triggering the current crisis.

In the latest in a series of setbacks, Japan ordered workers to withdraw from the complex on Wednesday due to a spike in radiation levels.  The spike was apparently the result of a release of pressure that had accumulated in the No. 2 reactor, officials said.

Steam and pressure build up in the reactors as workers frantically scramble to cool the fuel rods.  This leads to controlled pressure releases through vents and, at times, uncontrolled explosions.

Chief Cabinet Secretary Yukio Edano said the workers had no alternative but to withdraw from the most dangerous areas.

“The workers cannot carry out even minimal work at the plant now,” said Edano.

“Because of the radiation risk we are on standby.”

Just hours before the Fukushima workers were evacuated, another fire broke out at the facility’s No. 4 reactor, sending low levels of radiation as far away as Tokyo, and generating fresh fears that the crisis may be deepening.

Hajimi Motujuku, a spokesman for the plant’s operator, Tokyo Electric Power Co., said the outer housing of the containment vessel at the No. 4 unit erupted in flames early Wednesday.

However, Japan’s nuclear safety agency said that fire and smoke could no longer be seen at Unit 4, although it was unable to confirm that the blaze had been put out.

Officials are also considering having helicopters dump water onto the facility’s No. 3 reactor in a desperate effort to cool its fuel rods.  An earlier explosion had damaged the reactor’s roof.

However, Edano spoke cautiously about such a move, warning of the risks involved.

“It’s not so simple that everything will be resolved by pouring in water. We are trying to avoid creating other problems,” he said.

“We are actually supplying water from the ground, but supplying water from above involves pumping lots of water and that involves risk. We also have to consider the safety of the helicopters above.”

By late Wednesday local time, radiation levels had decreased, although it was not clear whether workers had been allowed back in.

A core team of 70 experts have been working around the clock to contain the crisis, rotating in and out of the facility to minimize their radiation exposure.

Meanwhile, officials in Ibaraki prefecture, just south of Fukushima, said radiation levels were about 300 times normal levels late Wednesday morning, the Associated Press reported.  While those levels are unhealthy for prolonged periods, they are far from lethal.

Millions of people have been left with little food, water or heat in the aftermath of the quake.  Some half a million people are now in temporary shelters, while more than 11,000 people are officially listed as dead or missing. 

Most officials expect that number to rise.

In an exceptionally rare address, Emperor Akihito, 77, expressed his condolences and encouraged the nation not to give up.

“It is important that each of us shares the difficult days that lie ahead,” said Akihito, for whom Japan’s citizens have tremendous respect.

“I pray that we will all take care of each other and overcome this tragedy.”

He also expressed his concerns about the ongoing nuclear crisis.

“With the help of those involved I hope things will not get worse,” he said.

The government has ordered some 140,000 people in the vicinity of the nuclear complex to stay indoors. Slight amounts of radiation were detected in Tokyo, triggering a panicked rush to purchase food and water.

There are six reactors at the Fukushima facility.  Reactors 1, 2 and 3 automatically shut down when the quake hit, but have each been rocked by explosions in the following days.

Units 4, 5 and 6 were not operating at the time of the quake, but still have nuclear fuel that must be kept cool.

The Nuclear and Industrial Safety Agency estimated that 70 percent of the rods have been damaged at the No. 1 reactor, while Japan’s Kyodo news agency reported that 33 percent of the fuel rods at the No. 2 reactor were damaged.  The cores of both reactors are thought to have partially melted.

“We don’t know the nature of the damage,” said Minoru Ohgoda, spokesman for Japan’s nuclear safety agency.

“It could be either melting, or there might be some holes in them,” the AP quoted him as saying.

Scores of flights to Japan have been cancelled or rerouted as air travelers avoid Tokyo out of fears of radiation exposure.  On Wednesday, France urged its citizens in Tokyo to either leave Japan altogether, or move to the southern part of the country.

UPDATE: Tuesday, 15 March 2011 12:05pm CST

The US Geological Survey (USGS) has raised the magnitude of the deadly earthquake that struck offshore northern Japan on Friday to 9.0 from 8.9. Japanese authorities have also arrived separately at the same measurement. Revised measurements of magnitude are common after earthquakes as recorded data is analyzed and refined.

“This magnitude places the earthquake as the fourth largest in the world since 1900 and the largest in Japan since modern instrumental recordings began 130 years ago,” Reuters is reporting from a USGS statement.

The updated magnitude of 9.0 means the quake was about 1.5 times more powerful than the previously reported 8.9. This makes the quake the fourth largest in the world since 1900, behind the magnitude 9.1 Sumatra quake in 2004, AP reports.

Updated estimates put the toll at 3,373 people dead and 6,746 others unaccounted for in Japan from the quake and resulting tsunami, local media report. An estimated 530,000 evacuees are now recovering in approximately 2,600 shelters near quake-hit areas.

Food, drinking water, medicine and fuel shortages are reported in quake-hit areas. Communications and other infrastructure disruptions are being felt as rescue and recovery efforts continue, Xinhua News Agency is reporting.

UPDATE: Monday, 14 March 2011 11:50am CST

NASA Imagery Of Japan Quake – http://www.nasa.gov/topics/earth/features/japanquake/index.html

UPDATE: Monday, 14 March 2011 6:05am CST

The US Geological Survey (USGS) says Friday’s 8.9 magnitude quake along the northern coast of Japan moved the island nation eastward by an estimated 8 feet. The tectonic shift was a result of “thrust faulting” along the boundary of the Pacific and North America plates, according to Paul Earle of the USGS.

The Pacific plate pushes against the North American plate at a rate of about 3.3 inches per year; however, a large enough earthquake can provide a shove that displaces the plates with dramatic results. Similar movements were recorded for the recent Indonesian and Chilean earthquakes, Earle continued.

“With an earthquake this large, you can get these huge ground shifts,” Earle said. “On the actual fault you can get 65 feet of relative movement, on the two sides of the fault.”

A tsunami resulted from this earthquake that inundated populated areas of Japan’s northeastern coast, washing away anything along low-lying areas in what Prime Minister Naoto Kan said was an “unprecedented national disaster.”

A 9.1 magnitude quake in December 2004 near the coast of Sumatra caused a tsunami that killed an estimated 228,000 people. In February 2010, an 8.8 quake off the coast of Chile killed more than 500.

No similar ground shift was noticed from the 7.0 earthquake that devastated Haiti last year. “A magnitude 7.0 is much smaller than the earthquake that just happened in Japan,” he said. “We’ve had aftershocks (in Japan) larger than the Haiti earthquake.”

“We know that one GPS station moved, and we have seen a map from GSI (Geospatial Information Authority) in Japan showing the pattern of shift over a large area is consistent with about that much shift of the land mass,” Kenneth Hudnut, a USGS geophysicist, told CNN.

NASA scientist Richard Gross reported that Earth’s rotation was also affected. Initial estimates show the length of the day has shortened by approximately 1.8 microseconds (a microsecond is one millionth of a second), and that the quake shifted Earth’s figure axis by about 6.5 inches. These are not changes that people would notice, but they can be read by satellites, Gross told CBS News.

UPDATE: Saturday, 12 March 2011 6:40am CST

Although imperceptible to humans, days will be a tiny bit shorter after Friday’s magnitude 8.9 earthquake off the coast of Japan.

NASA geophysicist Richard Gross calculated that the Earth’s rotation accelerated by 1.6 microseconds because of the shift in mass caused by the massive quake — the fifth strongest since 1900.

That change in rotation speed is slightly more than the one caused by last year’s earthquake in Chile.   However, an even larger earthquake in Sumatra in 2004 caused a 6.8-microsecond shortening of the day, the Associated Press reported.

UPDATE: Friday, 11 March 2011 1:35pm CST

A 6.2 magnitude quake has struck the mountainous Nagano area in Central Japan, apparently on a different faultline than the 8.9 magnitude quake.

Also, CNN reports that the Fukushima Daiichi nuclear reactor, roughly 160 miles north of Tokyo, “remains at a high temperature” because it “cannot cool down.”

Japan’s Kyodo News Agency reports that Trade Minister Banri Kaieda said that a small radiation leak could occur at the plant.

UPDATE: Friday, 11 March 2011 12:50pm CST

During a press conference earlier this afternoon, President Barack Obama said that he told Japanese Prime Minister Naoto Kan that the US is ready to offer whatever help it can as the Asian country attempts to recover from this morning’s massive earthquake.

Also, according to Reuters, Obama told reporters that “his administration is closely monitoring the tsunami threat to the United States” and that the Prime Minister told him that there was currently no leak at the Fukushima Daiichi nuclear power plant.

UPDATE: Friday, 11 March 2011 12:35pm CST

As reported earlier, an estimated 3,000 people were evacuated from the area surrounding the Fukushima Daiichi nuclear plant.

According to Reuters, Japan’s Chief Cabinet Secretary Yukio Edano had stated that a cooling system for a nuclear reactor was not working, but that the issue appeared to be under control.

Now CNN.com has posted an update to the story, and according to Edano, the reactor at the plant “remains at a high temperature” and that they have been unable to get it to “cool down.”

“There is no radioactive leakage at this moment outside of the facility,” he said. “At this moment, there is no danger to the environment.” However, the CNN.com Wire Staff reports that the radiation level was rising in a turbine building within the plant.

UPDATE: Friday, 11 March 2011 12:15pm CST:

Reports of the tsunami’s impact on Hawaii and the West Coast of the US mainland are trickling in. The Associated Press says that, despite reports of flooding on Maui and watery roadways on Hawaii’s big island, the post-earthquake waves “didn’t immediately cause major damage after devastating Japan and sparking evacuations throughout the Pacific.”

“Initial reports from U.S. civil defense officials and residents of coastal communities suggested the force of the tsunami, a giant wall of water, had dissipated as it sped across the Pacific Ocean toward North America,” Peter Henderson of Reuters said. “Tidal surges in the Hawaiian island chain were generally little higher than normal, officials said, and there were no reports of injuries or severe inland property damage.”

However, as Jaymes Song and Mark Niesse wrote, “Scientists and officials warned that the first tsunami waves are not always the strongest and said residents along the coast should watch for strong currents and heed calls for evacuation.”

“The tsunami warning is not over,” Hawaii Governor Neil Abercrombie said, according to the AP. “We are seeing significant adverse activity, particularly on Maui and the Big Island. By no means are we clear in the rest of the state as well.”

At 8:30am CST, President Barack Obama received a briefing on this morning’s earthquake and the resulting tsunamis. He spoke with Homeland Security Secretary Janet Napolitano and FEMA Administrator Craig Fugate via telephone, and met in person with Chief of Staff Bill Daley, Assistant to the President for Homeland Security John Brennan, National Security Advisor Tom Donilon, Deputy National Security Advisor Denis McDonough, Senior Advisor David Plouffe, Deputy Chief of Staff Alyssa Mastromonaco, National Security Staff Senior Director for Resilience Richard Reed and National Security Staff Director for Asian Affairs Daniel Russel.

According to a White House media advisory released this afternoon, “The senior officials provided the President with an update on the evolving situation stemming from the earthquake and subsequent tsunami that struck Japan early this morning, including the actions being taken to assist U.S. states and territories that could be affected by the tsunami, as the President directed earlier this morning, as well as the work being done to be prepared to assist the people of Japan.”

“The US government continues to monitor the situation closely throughout the Pacific region,” the press release added. “To support potentially impacted areas in the United States, the federal government remains in close contact and coordination with state and local officials, and stands ready to support them. The government’s message to the public is simple: listen to the instructions of state and local officials. We urge everyone in the regions who could be impacted to listen to a NOAA Weather Radio and their local news to monitor for updates and directions provided by their local officials.”

According to Patricia Zengerle and David Morgan of Reuters, “The U.S. Defense Department was preparing American forces in the Pacific to provide relief after the quake, which generated a tsunami that headed across the Pacific Ocean past Hawaii and toward the west coast of the U.S. mainland.”

To put into perspective just how severe an earthquake this was, consider this Twitter post from USGS Nebraska: “The Richter scale is exponential, logarithmic. An 8.9 like today’s is about 8,000 times stronger than the recent serious quake in New Zealand.”

That earthquake, which hit the city of Christchurch on February 22, was 6.3 in magnitude. There are 166 confirmed deaths as a result of that disaster, and as reported by CNN.com, the Kyodo News Agency, and now by Reuters as well, the death toll for Friday morning’s earthquake is expected to surpass 1,000.
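For readers who want to check that math: seismologists relate magnitude to radiated energy with the rule that each whole step on the magnitude scale corresponds to about 10^1.5, or roughly 32 times, more energy. A quick sketch of that textbook arithmetic, using the magnitudes quoted above (this is the standard formula, not the USGS tweet’s own working):

```python
# Radiated seismic energy scales as 10^(1.5 * magnitude), so the ratio
# between two quakes depends only on the magnitude difference.
def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of energy released by a magnitude-m1 quake
    to that of a magnitude-m2 quake."""
    return 10 ** (1.5 * (m1 - m2))

print(round(energy_ratio(8.9, 6.3)))  # ~7943: the "about 8,000 times" cited
print(round(energy_ratio(8.9, 7.0)))  # ~708 times the Haiti quake's energy
```

By the same rule, Friday’s 8.9 quake released roughly 700 times the energy of the magnitude 7.0 Haiti earthquake mentioned earlier.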

UPDATE: Friday, 11 March 2011 11:20am CST:

CNN.com’s live blog is reporting that a dam in Fukushima prefecture had broken and that “scores” of homes in the region had been washed away. They also say that the Defense Ministry has confirmed that approximately 1,800 homes have been destroyed there.

They also state that the Kyodo News Agency has reported that they expect the death toll from Friday morning’s 8.9 magnitude earthquake to surpass 1,000.

UPDATE: Friday, 11 March 2011 10:50am CST:

Russell Goldman and Lyneka Little of ABC News report that, in addition to Oregon, residents in northern California have “reported seeing the tell-tale sign of an impending tsunami–the waterline quickly withdrawing from the beach prior to large incoming waves.”

In an article posted at 8:19 Pacific Time, the Los Angeles Times’ L.A. Now blog reported that “Southern California beaches are expected to be hit within a half-hour with unusual ripple currents” resulting from the earthquake-caused tsunami.

“Most beaches are likely to see 1- to 3-foot waves starting around 8:30 am [local time], said Bill Patzert, an oceanographer at Jet Propulsion Laboratory in La Canada-Flintridge. Unlike typical waves, the force of the current is expected to build slowly and could last up to a half-hour,” they added.

Furthermore, Japan’s National Police have told CNN.com that “at least 133 people were killed, 722 were injured and 530 were missing” following the earthquake, in addition to the 200 to 300 bodies that were discovered “in the coastal city of Sendai in Miyagi Prefecture following the subsequent tsunami that struck that area. The death toll is likely to rise.”

According to Reuters, tsunami warnings have been lifted for Taiwan, Indonesia and the Philippines, and alerts are no longer in effect for Australia, New Zealand and Guam.

UPDATE: Friday, 11 March 2011 10:00am CST:

Jaymes Song and Mark Niesse of the AP are now reporting that the first tsunami waves have reached the US mainland, with high water reaching Port Orford, Oregon at 7:30am PST on Friday. Residents living near the coast of Oregon, California, and Washington had previously been evacuated. An approximately five-foot wave had also been spotted at Shemya in Alaska’s Aleutian Islands, Song and Niesse said.

Andrew Marshall of Reuters is reporting that tsunami warnings had been lifted for Guam, Indonesia, and Taiwan. The Pacific Tsunami Warning Center reports that warnings remained in effect for Mexico, Guatemala, El Salvador, Costa Rica, Nicaragua, Panama, Honduras, Chile, Ecuador, Colombia and Peru.

Reuters is also reporting, via their live blog, that NHK is confirming that the death toll in Japan had now risen above 300, with several hundred others missing.

CNN.com had confirmed that “a spokesman for the U.S. military bases in Japan said all service members were accounted for and there were no reports of damage to installations or ships.”

AP Reporters Jaymes Song and Mark Niesse now report that the tsunami waves that hit the beaches of Hawaii early Friday morning “didn’t cause any major damage.”

Finally, Jesse McKinley and Timothy Williams of the New York Times are reporting that waves from the tsunami “were projected to arrive at the Northern California coast at about 7:15 a.m. local time, and hit the San Francisco Bay Area at about 8 a.m., the National Weather Service said.”

“Officials in Southern California closed beaches as a precaution, prohibiting swimming, surfing and fishing off the coast,” McKinley and Williams said. “Experts said that they did not expect flooding and did not call for mandatory evacuations. But they warned that there could be large waves and unusual changes in the currents for several hours.”

“Several schools in the beach areas also planned to close Friday,” the New York Times writers continued, adding that authorities told them that wave surges “could reach three to seven feet along the California, Oregon and Washington coasts” and three feet in Southern California.

UPDATE: Friday, 11 March 2011 9:35am CST:

A televised CNN report states that all US military personnel in Japan have been accounted for.

Reuters reporter Ross Chainey, via their live earthquake coverage blog, is reporting that White House Chief of Staff Bill Daley “says fears about the tsunami on Hawaii and California seem to have passed,” and that Bristol University Seismology Professor George Helffrich tells BBC News “the speed at which the tsunami waves move across the ocean is around 1km every four seconds.”
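Helffrich’s figure of 1 km every four seconds is 250 meters per second, roughly 900 km/h. That pace is consistent with the standard shallow-water (long-wave) approximation, under which a tsunami’s deep-ocean speed depends only on water depth: v = sqrt(g × d). Here is a minimal sketch of that textbook relation (illustrative only, not the professor’s own calculation):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m: float) -> float:
    """Shallow-water (long-wave) speed in m/s for a given ocean depth."""
    return math.sqrt(G * depth_m)

for depth in (4000, 6400):  # typical open-Pacific depth vs. deeper basins
    v = tsunami_speed(depth)
    print(f"depth {depth} m: {v:.0f} m/s (~{v * 3.6:.0f} km/h)")
# depth 4000 m: 198 m/s (~713 km/h)
# depth 6400 m: 251 m/s (~902 km/h)
```

The quoted 250 m/s thus corresponds to waves crossing the deepest stretches of the Pacific; tsunamis slow dramatically, and grow in height, as they reach shallow coastal water.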

ABC News reporters Akiko Fujita, Leezel Tanglao, and Jessica Hopper are reporting that the quake itself has been called the fifth largest ever recorded by the USGS. They note that the tremors lasted five minutes, far longer than the devastating 1994 earthquake in Northridge, California, which had a duration of just six seconds, and bent the famed Tokyo Tower landmark.

As of 9:20am CST, Canada had issued tsunami advisories for parts of British Columbia, according to Reuters.

UPDATE: Friday, 11 March 2011 9:15am CST:

ABC News Channel 25 out of Waco, Texas, is reporting that the tsunami, which initially hit Hawaii approximately two hours ago, had caused damage to docks and that the waves are “leaving fish in parking lots near the beach.”

They add that a Tsunami Warning is in effect for coastal areas of California, Oregon, and Alaska–meaning that residents in those areas should immediately move to safer locations inland and towards higher ground–and that several other west coast areas were under a less-serious Tsunami Advisory.

According to CNN.com, the earthquake’s epicenter was located approximately 231 miles away from Tokyo, and between 60,000 and 70,000 people were being evacuated to shelters in the Sendai area. Fires had been reported in 80 different areas, the CNN Wire Staff added, and more than 20,000 people were left stranded at the Narita and Haneda airports.

UPDATE: Friday, 11 March 2011 8:55am CST:

The Associated Press (AP) is reporting that the death toll from the massive 8.9 magnitude earthquake that hit Japan early Friday morning is now in the hundreds.

According to the AP’s Malcolm Foster, “Police said 200 to 300 bodies were found in the northeastern coastal city of Sendai. Another 88 were confirmed killed and 349 were missing. The death toll was likely to continue climbing given the scale of the disaster.”

Meanwhile, according to AP reporters Jaymes Song and Mark Niesse, 3-foot high tsunami waves from the earthquake had begun hitting the Hawaiian islands of Oahu and Kauai early Friday morning. Song and Niesse now report that waves nearing 6 feet high had been recorded on the island of Maui, and that geophysicist Gerard Fryer said that while the waves did not appear likely to cause “major” damage, he was positive there would be “some damage” there.

“Roadways and beaches were empty as the tsunamis struck the state, which had hours to prepare,” Song and Niesse said. “Residents in coastal areas of Hawaii were sent to refuge areas at community centers and schools while tourists in Waikiki were moved to higher floors of their high-rise hotels.”

Furthermore, they report that “Waves are predicted to hit the western coast of the United States between 11 a.m. and 11:30 a.m. EST Friday” and that evacuations had been ordered “in parts of Washington and Oregon.”

“The US Geological Survey [USGS] said the 2:46 p.m. quake was a magnitude 8.9, the biggest earthquake to hit Japan since officials began keeping records in the late 1800s, and one of the biggest ever recorded in the world,” Foster said, adding that more than 50 aftershocks had been recorded.

In the wake of the disaster, President Barack Obama pledged US aid to Japan.

According to AP reporter Julie Pace, Obama has declared that the United States “stands ready to help” those affected by the earthquake, and that the Federal Emergency Management Agency (FEMA) will be ready to help those in Hawaii, as well as any other state ultimately affected by the tsunamis.

“FEMA administrator Craig Fugate said tsunami warnings and watches have been issued for the U.S. territories of Guam, the Northern Marianas Islands, and coastal areas in Hawaii, Alaska, California, Oregon and Washington,” Pace said. “Fugate urged people living in those areas to monitor their local news for instructions from their state and local officials, and evacuate if ordered to do so. And the Coast Guard said it was making preparations to provide support where necessary.”

As reported previously, Bloomberg writers Stuart Biggs and Aaron Sheldrick said that the earthquake “shook buildings across Tokyo and unleashed a tsunami as high as 10 meters, engulfing towns along the northern coast.”

Furthermore, Tokyo Electric Power told the Bloomberg reporters that over 4 million homes were without power, and in a nationally televised briefing in the wake of the disaster, Prime Minister Naoto Kan said that “major damage” had occurred in an area located north of Tokyo.

In a live blog updating the situation in real time, Mark Kolmar of Reuters posted that the Kyodo news agency is reporting that “8,000 defense force troops have been dispatched for quake relief.”

The US State Department has issued a travel alert, noting that airports in Tokyo were currently closed and asking all Americans to avoid non-essential travel to Japan, Pace is reporting. The agency added that public transportation was also closed in Tokyo and interrupted in many other locales, and that US citizens currently in Japan should “contact family and friends in the United States to confirm their well-being at the earliest opportunity.”

Meanwhile, many other nations are continuing to brace themselves for the resulting tsunamis.

“The Philippines, Indonesia and Papua New Guinea were among more than 20 countries bracing for a possible tsunami, after the Pacific Tsunami Warning Center raised an alert,” Biggs and Sheldrick had warned earlier Friday, adding that AP reports stated that “The West Coast and Alaska Tsunami Warning Center issued a warning for the entire U.S. west coast.”

Roig and Barut said that the tsunami warnings also extended from Mexico south through the Pacific coast of South America, and Kolmar had also posted that warnings had been issued for “the coastal areas of California and Oregon from Point Concepcion, California to the Oregon-Washington border; the coastal areas of Alaska from Amchitka Pass, Alaska (125 miles W of Adak) to Attu, Alaska; the coastal areas of California from the California-Mexico border to Point Concepcion, California; and the coastal areas of Washington, British Columbia and Alaska from the Oregon-Washington border to Amchitka Pass, Alaska (125 miles W of Adak).”

As they have with other disasters in the past, Google has established an online “Person Finder” for those affected by the earthquake and tsunami. According to Benny Evangelista of the San Francisco Chronicle, “The Person Finder page can be found at http://japan.person-finder.appspot.com/ and it is available both in English and Japanese.”

Earlier today, Reuters had reported that residents near the Fukushima Daiichi nuclear plant had been told to evacuate. They have since learned from Japan’s Chief Cabinet Secretary Yukio Edano that a cooling system for a nuclear reactor was not working.

Reuters is also reporting, via the Kyodo news agency, that a ship carrying a crew of 100 people had been swept away by the tsunami. More details will follow when they become available.

Image 1: Image of Fukushima I Nuclear Power Plant Unit 1 before and after explosion. Credit: Wikipedia   

Image 2: Ground rupture caused by the Sendai Earthquake 2011. Credit: Danny Choo/Wikipedia (CC BY-SA 2.0)  

Image 3: Map of the Sendai Earthquake 2011. Credit: Wikipedia

Image 4: These images show the effects of the tsunami on Japan’s coastline. The image on the left was taken on Sept. 5, 2010; the image on the right was taken on March 12, 2011, one day after an earthquake and resulting tsunami struck the island nation. Credit: German Aerospace Center (DLR)/Rapid Eye  

On the Net:

BCM Researchers Win Top Awards From Society For The Study Of Reproduction

Two veteran Baylor College of Medicine researchers will receive the two top awards from the Society for the Study of Reproduction when the group holds its annual meeting July 31-Aug. 4 in Portland, Ore.

Dr. JoAnne S. Richards, a distinguished professor in the BCM department of molecular and cellular biology, will receive the Carl G. Hartman Award, the society’s highest award given in recognition of a research career and scholarly activities in the field of reproductive biology. Dr. Francesco J. DeMayo, also a professor in the BCM department of molecular and cellular biology, will receive the SSR Research Award that recognizes an active, regular member of the society for outstanding research published during the previous six years.

“These two awards demonstrate the breadth and depth of research going on at Baylor College of Medicine in the area of reproductive medicine,” said Dr. Bert O’Malley, chair of molecular and cellular biology at BCM and a former Hartman award winner.

Working for better detection, treatments

“I am excited and humbled by this recognition,” said Richards, whose work focuses on ovarian cell differentiation, ovulation and cancer. Her current research uses several animal models of ovarian cancer and examines ovary formation, how endocrine signals and genes regulate follicular growth and follicular cell function, and how ovulation occurs. Her laboratory’s goal is to provide translational information for better detection, screening and cancer treatment strategies for women with ovarian cancer.

She received her bachelor’s degree from Oberlin College in Ohio and her master’s and Ph.D. from Brown University in Providence, RI. She did her postdoctoral work at the University of Michigan in Ann Arbor and was a member of the faculty there until 1981, when she came to BCM.

She has received the SSR Research Award, the Basic Research Award from the Society for the Advancement of Women’s Health Research, the Pioneer Lecture Award from the Frontiers in Reproduction Course of the National Institutes of Health, the Women in Endocrinology Mentor Award and the Michael E. DeBakey Award in Research Excellence in 2009.

Progesterone, uterine function

DeMayo said that the work for which he was recognized involves the study of how progesterone regulates uterine function and implantation – a goal sought by numerous investigators since the 1970s. He also has developed many novel mouse models to dissect molecular pathways and for the study of endometrial cancer.

“The strength of our research in reproductive biology has been one of Baylor’s best kept secrets,” said DeMayo.

DeMayo received his bachelor’s degree from Cornell University and his master’s and Ph.D. from Michigan State University in East Lansing in 1983, and came to BCM in 1986. He is co-director and will soon become director of BCM’s Center for Reproductive Biology Research, which is part of the Eunice Kennedy Shriver National Institute of Child Health and Human Development’s Specialized Cooperative Centers Program in Reproduction and Infertility Research. He received the Michael E. DeBakey Excellence in Research Award in 2006.

This year’s winner of the Society’s Young Investigator Award is Dr. Derek Boerboom, professor at the Université de Montréal in the department of veterinary biomedicine. Boerboom did his postdoctoral studies at BCM as well.

For more information on awards, go to Society for the Study of Reproduction.

On the Net:

Germany Shuts Down Seven Nuclear Reactors For Review

On Tuesday, Germany announced the temporary shutdown of seven of its oldest nuclear reactors pending a safety review.

“We are launching a safety review of all nuclear reactors … with all reactors in operation since the end of 1980 set to be idled for the period of the (three-month) moratorium,” Chancellor Angela Merkel said in a statement.

Germany said 10 years ago that it would be nuclear-free by 2020, which has since been postponed until the mid-2030s by Merkel’s government.

Japan’s government said that radiation levels near the Fukushima nuclear plant reached levels harmful to humans.  It has advised people to stay indoors after two explosions and a fire at the facility Tuesday.

Four of the six reactors at the Japanese plant have overheated and sparked explosions since Friday’s 8.9 magnitude earthquake.

“After this moratorium, which will run until June 15… we will know how to proceed,” Merkel said in a statement after crisis talks in Berlin took place with premiers of German states where there are nuclear plants.

She said that Berlin would use the time to discuss how to handle radioactive waste, how to boost renewable energies, and international safety standards for nuclear power.

“Safety standards in Germany are one thing, they are important, but safety standards in Europe, being able to compare them, and international safety standards are also important,” Merkel said.

Germany’s older plants include one in Bavaria, two near Frankfurt and two in Baden-Württemberg.

ARD published a survey on Tuesday in which 53 percent of respondents said all reactors should be taken out of service as soon as possible.

The poll showed that 70 percent thought that an accident similar to that in Japan could happen in Germany, and 80 percent want Merkel to reverse the government’s extension of operating times.

German citizens have been uneasy about the safety of nuclear power, with shipments of nuclear waste regularly attracting angry protests.

Germany relies on nuclear power for 23 percent of its energy needs.  It is the first European country to take these measures after explosions at Japan’s Fukushima plant sparked safety concerns.

Guenther Oettinger, the European Union energy commissioner, said in an interview with ARD television that Germany’s move raises the prospect of a nuclear-free Europe.

“It has to raise the question of whether we in Europe, in the foreseeable future, can secure our energy needs without nuclear power,” he said.

Tornado Alley Film Premieres In Theaters

Tornado Alley features never-before-captured tornado footage as scientists seek to understand the structure and origin of tornadoes, and how they can be forecast more accurately.

On March 10, at the Museum of Science and Industry in Chicago, Giant Screen Films presented the world premiere of Tornado Alley, a film that documents the effort to understand the origins and evolution of tornadoes.

The film features never-before-seen footage of Sean Casey, star of the Discovery Channel program Storm Chasers and a National Science Foundation (NSF)-funded team of scientists from the recently concluded VORTEX2 tornado research mission. The teams pursue tornadoes in “Tornado Alley,” a region in the center of the United States that is home to Earth’s most violent and destructive tornadoes.

As Casey’s team seeks to capture ground-breaking footage of a tornado from point-blank range, the VORTEX2 team strives to understand the origin and structure of tornadoes. VORTEX2 is the largest and most ambitious tornado research project to date.

The researchers hope to use their findings to improve the ability to predict when and where tornadoes will strike, ensuring greater safety for those in harm’s way.

In Tornado Alley, leading researchers Joshua Wurman, Karen Kosiba and Don Burgess represent a team of more than 100 international VORTEX2 participants, who traveled throughout the Plains region of the United States in 2009-2010, using cutting-edge weather measurement instruments to collect scientific data during the life cycle of a tornado.

The scientists deployed more than 40 scientific vehicles, including radars, tornado pods, instrumented vehicles, deployable arrays of weather stations, laser disdrometers, an unmanned aerial system, and mobile weather balloon launchers.

“Currently, the average lead time, the time between a tornado warning and the tornado occurring, is just 13 minutes, and the false alarm rate is 70 percent,” said Wurman. “In order to increase warning times, in order to reduce the false alarm rate, VORTEX2 hopes to learn the differences between which storms will make tornadoes, and which ones will not, and which storms will make the worst, most violent and potentially killer tornadoes.”

Kosiba explained that to obtain information about the tornadoes, VORTEX2 used 11 different radars to study various components of the storms. As part of that effort, the Center for Severe Weather Research operated three NSF-funded Doppler on Wheels (DOW) mobile radars that collect high-resolution radar data of the inside of the storm from the ground.

Tornado Alley also features another radar, DOW7, that takes data in the rear and flank of the storm and provides three-dimensional wind and precipitation maps of the potentially tornadic regions of storms. DOW7 has a crew of four team members, including Kosiba and Wurman. Each team member focuses on a specific task, for example, driving, navigating, operating the radar, or coordinating the other radars and the teams that deploy weather stations called “tornado pods” in front of the tornadoes.
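A single Doppler radar measures only the component of the wind blowing along its beam, which is one reason VORTEX2 surrounds a storm with several radars at once: radial velocities seen from two or more viewing angles can be combined into a wind vector. Here is a minimal sketch of that dual-Doppler idea (illustrative only; the function name and two-radar setup are our own, and the actual VORTEX2 wind syntheses involve far more radars and corrections):

```python
import numpy as np

def dual_doppler_wind(vr1: float, az1_deg: float,
                      vr2: float, az2_deg: float):
    """Recover a horizontal wind vector (u, v) from two radial velocities.

    vr1, vr2:         radial velocities (m/s) measured by radars 1 and 2
    az1_deg, az2_deg: directions (degrees) from each radar to the same point
    """
    a1, a2 = np.radians(az1_deg), np.radians(az2_deg)
    # Each radar sees only the wind component along its own beam:
    #   vr = u*cos(az) + v*sin(az)
    A = np.array([[np.cos(a1), np.sin(a1)],
                  [np.cos(a2), np.sin(a2)]])
    u, v = np.linalg.solve(A, np.array([vr1, vr2]))
    return float(u), float(v)

# A 30 m/s westerly wind viewed head-on by radar 1 and at 45 degrees
# by radar 2 is recovered almost exactly:
print(dual_doppler_wind(30.0, 0.0, 21.2, 45.0))  # ~ (30.0, 0.0)
```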

Special features of DOW7 include a 56-foot mast that at times extends its radio range to more than 30 miles. DOW7 also has 14 monitors connected to more than 15 different computers so that the team can look at weather forecasts and radar data, track its teams, and communicate with other VORTEX2 teams via the internet.

Yet working inside DOW7, let alone in the midst of a tornado, is no simple feat. “We have to be extremely ambitious, getting instruments to just the right places at the right times,” said Wurman. “But at the same time, we have to be extremely cautious, keeping all of the teams and vehicles (some led by over 80 students participating in VORTEX2) safe from lightning, flooding, 100 mph thunderstorm winds, softball-sized hail and, of course, the tornado.”

To safely capture up-close footage of the tornado, Casey and his team travel in the TIV-2, a seven-ton, armored “tornado intercept vehicle” that Casey engineered and built. To film the inside of a tornado, Casey uses a 92-pound IMAX camera that belonged to his father. The TIV-2 also carries weather instruments, whose data Wurman and his team combine with DOW data to map tornado winds.

While Tornado Alley depicts several tornadoes, its climax features a tornado intercept in Goshen County, Wyo., on June 5, 2009.

To find this particular tornado, Wurman explained, VORTEX2 made forecasts for where storms might form that day. “We drove to locations east of the Wyoming Rockies, specifically the Cheyenne Ridge,” said Wurman.

“When the forecast storms started propagating off of the Ridge, VORTEX2 drove towards the most prominent storm and began setting up its array of DOW radars [and other instrumentation]. About 22 minutes after we set up the first DOW, a tornado formed and moved into our array, allowing VORTEX2 to collect data during the entire lifecycle, from before birth, as it intensified and through the death of the tornado,” he said.

Tornado Alley will open nationally on March 18th in IMAX and other Large Format Theaters. The Tornado Alley Movie website has more details.

In addition to showing in theaters, Tornado Alley will anchor an outreach campaign in which Sean Casey and the TIV will visit museums across the country, while additional support from NSF will send DOW radar trucks and VORTEX2 scientists on tour to demonstrate the DOW’s research tools and tornado pods and to engage audiences with the exciting and innovative aspects of tornado research.

Tornado Alley is a collaboration of award-winning producers Giant Screen Films, Graphic Films and Sean Casey, with support from the Giant Dome Theater Consortium, a newly founded partnership of seven leading museum institutions.

NSF’s Informal Science Education program provided funding for VORTEX2 in partnership with NSF’s Physical and Dynamic Meteorology program and the special programs of the University Corporation for Atmospheric Research and Lower Atmospheric Facilities Oversight Section.

Image Caption: The newest Doppler-On-Wheels (DOW) observing the Goshen County, Wyoming tornado on June 5, 2009. The DOWs and VORTEX2 observed this tornado from before birth, through its intensification, until its dissipation. A scientist engaged in photogrammetry research stands behind the DOW. Credit: Herb Stein / CSWR

On the Net:

CDC Makes Reproductive Health Surveys Available Through IHME’s New Global Health Data Exchange

A wealth of maternal and child health data is being made immediately and freely accessible through a new collaboration between the Institute for Health Metrics and Evaluation (IHME) and the Centers for Disease Control and Prevention’s (CDC) Division of Reproductive Health.

The Division chose IHME’s Global Health Data Exchange (GHDx) to host its reports and datasets for an extensive series of reproductive health survey data from more than 30 countries that have received technical assistance from the Division from 1975 to the present. The datasets cover a wide range of topics, including pregnancies, births, contraceptive use, prenatal care, nutrition, delivery assistance, immunizations, behavioral risk factors, and domestic violence.

The announcement was made today at the Global Health Metrics & Evaluation (GHME) conference in Seattle.

“Prior to this partnership, the reports and datasets from our Reproductive Health Surveys have primarily been available upon request,” said Dr. Paul Stupp, the demographer who coordinates technical assistance for the Division. “We had been actively trying to find a place to make these datasets available and had explored several options when IHME created its new data catalog. The GHDx seemed like the best fit because it is global in reach, focused on health, and simple to use.”

The GHDx is a user-friendly, searchable data catalog of global health, public health, and demographic datasets. The GHDx provides free access to high-quality metadata and contains essential information about datasets, including data sources and providers, as well as all the information needed to properly cite the data. IHME is sharing all of its research results through the GHDx and invites governments and other organizations to make their health-related data publicly available at high transfer speeds with no fees.

Researchers around the world will be able to download the CDC’s reproductive health data to measure the health status of populations, to track progress in health interventions, and to design policies that will have a strong impact on improving population health.

“The purpose of the GHDx is to give people one place where they can locate a broad range of data that are available in global health, ideally with those data available for download,” said Peter Speyer, Director of Data Development at IHME. “The CDC collaboration is a great foundation for us to start making more data directly available on the GHDx, and ultimately building a community for data producers and data users.”

On the Net:

You Are What Your Mother Ate

(Ivanhoe Newswire)– Sardines and an ice cream sundae, pickles and peanut butter… those weird cravings are a common effect of pregnancy. However, new research shows a poor diet during pregnancy can have a negative effect on the baby’s long-term health. According to scientists at the University of Cambridge, children born to mothers who had an unhealthy diet during pregnancy are more prone to type 2 diabetes later in life.

New research establishes a link between the regulation of Hnf4a, a gene linked to type 2 diabetes, and a mother’s diet during pregnancy. Prior research has shown that Hnf4a plays a role in the development of the pancreas and the production of insulin, and scientists theorized that a mother’s diet during pregnancy has a far-reaching influence on the gene, affecting its expression later in a child’s life. That environmental factors such as diet affect genes throughout life is well established, and the mechanism behind these effects is epigenetics: modifications to DNA that control how strongly a gene is expressed.

To test their theory, researchers altered the protein content of rats’ diets during pregnancy to cause their offspring to develop type 2 diabetes in old age. When studying cells from the pancreas of the offspring of both well-nourished and malnourished mothers, researchers found the Hnf4a gene was expressed far less in the offspring prone to type 2 diabetes. Though the amount of Hnf4a decreased with age in both groups of rats, when scientists studied the rats’ DNA, they found the aging-related decrease in Hnf4a to be far more pronounced in rats whose mothers had poor diets during pregnancy.

The scientists then studied DNA from human pancreas cells and showed that the expression of Hnf4a in humans is controlled the same way as in rats. Therefore, in humans as well as rats, a decreased amount of Hnf4a leads to a decrease in pancreatic function, impairing the pancreas’s ability to produce insulin. Basically, a decrease in Hnf4a means an increase in the risk of diabetes.

However, these new findings do not mean expectant mothers should worry. Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, was quoted as saying, “We already know that a healthy pregnancy is important in shaping a child’s health… The reasons why are not well understood, but this study in rats adds to the evidence that a mother’s diet may sometimes alter the control of certain genes in her unborn child. This research doesn’t change our advice that pregnant women should try to eat a healthy, balanced diet.”

Source: PNAS, 07 March 2011