Super-Adhesive Material Inspired By Gecko Feet

For years, biologists have been amazed by the power of gecko feet, which let these 5-ounce lizards produce an adhesive force roughly equivalent to carrying nine pounds up a wall without slipping. Now, a team of polymer scientists and a biologist at the University of Massachusetts Amherst has discovered exactly how the gecko does it, leading them to invent “Geckskin,” a device that can hold 700 pounds on a smooth wall.

Doctoral candidate Michael Bartlett in Alfred Crosby’s polymer science and engineering lab at UMass Amherst is the lead author of their article describing the discovery in the current online issue of Advanced Materials. The group includes biologist Duncan Irschick, a functional morphologist who has studied the gecko’s climbing and clinging abilities for over 20 years. Geckos are equally at home on vertical, slanted, even backward-tilting surfaces.

“Amazingly, gecko feet can be applied and disengaged with ease, and with no sticky residue remaining on the surface,” Irschick says. These properties (high capacity, reversibility and dry adhesion) offer a tantalizing possibility for synthetic materials that can easily attach heavy everyday objects such as televisions or computers to walls and detach them just as easily, with medical and industrial applications as well, he and Crosby say.

This combination of properties at these scales has never been achieved before, the authors point out. Crosby says, “Our Geckskin device is about 16 square inches, about the size of an index card, and can hold a maximum force of about 700 pounds while adhering to a smooth surface such as glass.”
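The quoted figures imply an average adhesive strength of roughly 44 pounds per square inch. A quick back-of-the-envelope sketch (the helper functions are illustrative; the numbers come from this article, not from the paper itself):

```python
# Back-of-the-envelope arithmetic from the figures quoted above:
# a ~16 square inch Geckskin pad holding a maximum of ~700 pounds.
def adhesive_strength_psi(load_lb, area_sq_in):
    """Average holding force per unit pad area, in pounds per square inch."""
    return load_lb / area_sq_in

def pad_area_for_load(load_lb, strength_psi):
    """Pad area needed to hold a given load at a given adhesive strength."""
    return load_lb / strength_psi

strength = adhesive_strength_psi(700, 16)   # 43.75 psi on average
tv_area = pad_area_for_load(40, strength)   # a 40 lb TV needs under 1 sq in
print(f"{strength:.2f} psi; {tv_area:.2f} sq in for a 40 lb TV")
```

At that average strength, even a pad far smaller than an index card would in principle hold a 40-pound television; real-world safety margins would of course be larger.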

Beyond its impressive sticking ability, the device can be released with negligible effort and reused many times with no loss of effectiveness. For example, it can be used to stick a 42-inch television to a wall, released with a gentle tug and restuck to another surface as many times as needed, leaving no residue.

Previous efforts to synthesize the tremendous adhesive power of gecko feet and pads were based on the qualities of microscopic hairs on their toes called setae, but attempts to translate them to larger scales were unsuccessful, in part because the complexity of the entire gecko foot was not taken into account. As Irschick explains, a gecko’s foot has several interacting elements, including tendons, bones and skin, that work together to produce easily reversible adhesion.

Now he, Bartlett, Crosby and the rest of the UMass Amherst team have unlocked the simple yet elegant secret of how it’s done, creating a device that can handle exceptionally large weights. Geckskin and its supporting theory demonstrate that setae are not required for gecko-like performance, Crosby points out. “It’s a concept that has not been considered in other design strategies and one that may open up new research avenues in gecko-like adhesion in the future.”

The key innovation by Bartlett and colleagues was to create an integrated adhesive with a soft pad woven into a stiff fabric, which allows the pad to “drape” over a surface to maximize contact. Further, as in natural gecko feet, the skin is woven into a synthetic “tendon,” yielding a design that plays a key role in maintaining stiffness and rotational freedom, the researchers explain.

Importantly, the Geckskin’s adhesive pad uses simple everyday materials such as polydimethylsiloxane (PDMS), which holds promise for developing an inexpensive, strong and durable dry adhesive.

The UMass Amherst researchers are continuing to improve their Geckskin design by drawing on lessons from the evolution of gecko feet, which show remarkable variation in anatomy. “Our design for Geckskin shows the true integrative power of evolution for inspiring synthetic design that can ultimately aid humans in many ways,” says Irschick.

The work was supported by the U.S. Defense Advanced Research Projects Agency (DARPA) through a subcontract to Draper Laboratories, plus UMass Amherst research funds.

Image 2: A card-sized pad of Geckskin can firmly attach very heavy objects such as this 42-inch television weighing about 40 lbs. (18 kg) to a smooth vertical surface. The key innovation by Bartlett and colleagues was to create a soft pad woven into a stiff fabric that includes a synthetic tendon. Together these features allow the stiff yet flexible pad to “drape” over a surface to maximize contact. Photo courtesy of UMass Amherst

First EarthScope ‘Transportable Array’ Seismic Station Reaches US East Coast

Data generate 3-D ‘CT scan’ of North American continent’s interior

Yulee, Florida. Not a place one usually thinks of as an earthquake epicenter.

But this swampland not far from the Georgia state line is now home to a state-of-the-art seismic station known as 457A.

Here, within a few miles of the Atlantic Ocean, 457A has been installed to record ground motion from earthquakes. Earthquakes do happen on the East Coast of the United States, as the Virginia quake of August 2011 attests.

The new seismic station is part of EarthScope, a project funded by the National Science Foundation (NSF). It’s one of some 400 stations collectively called the Transportable Array.

One by one, the array’s stations have slowly been making their way across the country in a wave of instrumentation. Transportable Array Station 457A is the first to reach the East Coast.

The array began its eastward migration on the West Coast in 2004. As it moved, it transmitted information from more than 1,350 locations across the United States.

By the end of 2013, the array’s East Coast stations will occupy 400 sites from Florida in the south to Michigan and Maine in the north, including sites in the southernmost regions of Ontario and Quebec, Canada.

Researchers placed the stations in a grid approximately 70 kilometers (some 43 miles) apart; each station operates for about two years.
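Treating each station as covering one 70 km × 70 km grid cell gives a rough sense of the footprint of a 400-station deployment (a simplification for illustration; the real grid is not perfectly regular):

```python
# Rough footprint of the Transportable Array: ~400 stations on a grid
# with ~70 km spacing. Each station "covers" one grid cell.
SPACING_KM = 70
N_STATIONS = 400

cell_area_km2 = SPACING_KM ** 2             # 4,900 km^2 per station
total_area_km2 = N_STATIONS * cell_area_km2
print(f"{total_area_km2:,} km^2")           # about 1.96 million km^2
```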

Data recorded by the seismometers help scientists develop a better understanding of the geologic structure inside the North American continent.

“Scientists can use these data to generate 3-D images of Earth’s interior that are very similar to CT scans in medicine,” says Greg Anderson, NSF program director for EarthScope. “The images show Earth’s structure from the core to the surface in never-before-seen detail.”

“With the installation of 457A,” says Anderson, “the Transportable Array has stations active on all four coasts of the ‘lower 48’: Pacific, Atlantic, Great Lakes and Gulf of Mexico.”

Each station is self-contained, using solar panels to recharge the batteries that provide power to the seismometer and other sensors and electronic systems. The entire instrument is placed in a vault and buried six feet below the surface.

“Because the western part of the country regularly experiences earthquakes, that region has dozens of permanent seismometers to observe fault movements,” says Bob Woodward, director of the USArray, the seismic component of EarthScope.

“Seismic stations in the eastern third of the U.S. are much less common, although earthquakes do occur, as we all learned last August.”

Transportable Array seismometers are extremely sensitive. They can detect earthquakes of magnitude 5.0 or greater, “sensing” them from as far away as the opposite side of the planet, as well as record smaller quakes that occur regionally and locally.

Each station includes a high-performance barometer and an infrasound microphone, and sensors to record temperature and pressure.

Data collected by the station’s instruments are transmitted in real-time to the Array Network Facility at the University of California, San Diego, then archived at the IRIS (Incorporated Research Institutions for Seismology) Data Management Center in Seattle for use by researchers around the world.

From its underground crypt, 457A will be sending messages to geologists–and to all of us.

Image 1: Transportable Array Station 457A in Florida: first such seismic station on the U.S. East Coast. Credit: IRIS

Image 2: From the array, a CT scan-like, 3-D image of Earth’s mantle under Western North America. Credit: Richard Allen/UC-Berkeley

Image 3: A vault with the transmitting seismometer is closed, then a mound of dirt is piled on top. Credit: IRIS

Brazil Denied Request To Extend Regulation Of Fungicide In Orange Juice

The U.S. Food and Drug Administration (FDA) on Thursday denied a request by Brazilian orange juice producers to allow a higher tolerance of fungicide in juice imports.

The decision means that Brazil, the world’s top orange juice producer, will stop exporting orange juice made from concentrate to the U.S.

Once diluted, orange juice made from concentrate contains much lower levels of carbendazim, a chemical used to help fight mold on orange trees.

Carbendazim is permitted up to a limit in imports to the European Union, which is the main buyer of Brazil’s orange juice.

Brazil asked the FDA to give it until June 2013 to eradicate carbendazim from its juice, which would have extended a ban on the chemical that was put into place in 2009.

Coca Cola alerted the FDA after it had discovered that carbendazim was present in laboratory tests of Brazilian orange juice.

The FDA began to check all imports for the chemical in January, prompting Brazil to ask the regulator for an extension on the 2009 ban.

Brazil asked the FDA to distinguish between frozen concentrate and not-from-concentrate juice, proposing a 60 parts per billion (ppb) carbendazim tolerance for frozen concentrate. The FDA’s current policy holds orange juice as sold in the U.S., in diluted, ready-to-drink form, to a 10 ppb carbendazim limit.
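The two tolerances are consistent if concentrate is diluted before sale. The sketch below uses an illustrative 6-to-1 dilution factor (an assumption for illustration, not a figure from the article or the FDA) to show how a 60 ppb limit on concentrate could correspond to a 10 ppb limit on diluted juice:

```python
# Illustrative only: the 6x dilution factor is an assumption, not a
# figure from the article or the FDA.
def diluted_ppb(concentrate_ppb, dilution_factor):
    """Residue concentration after diluting concentrate with water."""
    return concentrate_ppb / dilution_factor

print(diluted_ppb(60, 6))   # 10.0 ppb, matching the limit for diluted juice
```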

The FDA refused Brazil’s proposal, but one source in the industry told Reuters that, “by the end of the year it is probable that things will be back to normal and we will be shipping concentrated juice to (the U.S.) again.”

The Juice Products Association said it was “disappointed” by the FDA decision, since the Environmental Protection Agency declared that carbendazim levels of up to 80 ppb were no threat to human health.

“The United States cannot grow enough oranges to satisfy American consumer demand for orange juice and therefore juice processors must rely on orange juice concentrate from other countries to assure adequate supplies,” it said in a statement.

The FDA said it would not accept the requests of Brazil’s juice industry because the law prohibits it from making such exceptions.

“We recognize that the use in Brazil of a pesticide not authorized in the U.S. and FDA’s detention of shipments bearing illegal residues has affected the orange juice market and has economic impacts,” the regulator wrote in a 1,600-word statement.

York Researchers Create ‘Tornados’ Inside Electron Microscopes

Researchers from the University of York are pioneering the development of electron microscopes which will allow scientists to examine a greater variety of materials in new revolutionary ways.

The team, headed by Professor Jun Yuan and Professor Mohamed Babiker from the University’s Department of Physics, has created electron beams with orbital angular momentum (electron vortex beams), which will open the way to many novel applications, including more efficient examination of magnetic materials.

Electron microscopes use a beam of electrons to illuminate a specimen and produce a magnified image, allowing scientists to investigate atomic arrangements. Compared to conventional electron beams, electron vortex beams improve the resolution and sensitivity of imaging, which is key when determining the structure of biological specimens such as proteins. They also have applications in the manipulation of nano-scale objects such as atoms and molecules.

As the electron vortex consists of moving charged particles, there is a magnetic field associated with the vortex. This magnetic field will be invaluable in examining magnetic materials, enabling the nanoscale magnetic structure to be imaged.

The York team has created a design for a holographic mask to generate an electron vortex beam and now plans to use this to improve the imaging capabilities of the electron microscope in its York-JEOL nanocentre.

Details of York’s latest work, part of research by second-year PhD student Sophia Lloyd, showing that electron beams carrying orbital angular momentum in a vortex structure are more efficient than light for probing atomic magnetism, are published in the February edition of Physical Review Letters.

Professor Yuan said: “The introduction of vortex beams into electron microscopy, with their screw-like revolving wave fronts, much like tornados, will revolutionise the study of magnetic nanostructures, as well as creating new applications in terms of nanoparticle manipulation and trapping, and edge contrast detection.”

Professor Babiker, an expert in light vortex research, added: “Optical vortex beams, created using beams of light photons, have been studied for the past 20 years. They have found a great many applications, most notably in fine scale manipulation of single molecules and nano-objects in so-called optical tweezers and optical spanners.

“Research being carried out at York is intended to further current understanding of electron vortices so that a similarly broad range of applications can be realised.”

PainACTION.com Improves Migraine Self-Management And Reduces Migraine-Related Psychological Distress

Study results show that the online behavioral intervention can be an effective adjunct to a comprehensive medical approach to managing migraine

painACTION.com is a free, non-promotional online program designed to support self-management and improve overall function in people with chronic pain. This study tested painACTION.com’s ability to increase the use of self-management skills in people with chronic migraine headaches. A total of 185 participants completed the study. Participants exposed to painACTION.com were more confident that they could prevent headaches and manage headache-related pain and disability, and reported more use of symptom-management strategies, including relaxation and seeking social support, than the control group. They also reported more daily activities and less migraine-related psychological distress, including decreased depression, anxiety, and stress.

The full report of this study, “A Randomized Trial of a Web-based Intervention to Improve Migraine Self-Management and Coping” was published in the February 2012 issue of Headache: The Journal of Head and Face Pain (Volume 52, Issue 2). Headache is the official publication of the American Headache Society.

“A critical component of comprehensive migraine treatment is engaging the patient in self-management,” says lead researcher Jonas Bromberg, PsyD, Director of Health Communications and Senior Research Scientist at Inflexxion, the company that created painACTION.com. “Self-management training should help patients learn how to identify, avoid, and manage headache triggers, and learn to perform other essential prevention, management, and coping behaviors. The integration of behavioral support in the medical care of migraine is essential in helping people with migraine to manage their condition more effectively, safely manage their prescription pain medications, avoid disease progression, and reduce the high cost of migraine and migraine-related disability to individuals and society.”

“Study participants’ improved ability to manage their pain and pain symptoms indicates that painACTION.com can be an important element of a comprehensive disease management approach,” says Kevin Zacharoff, MD, Vice President of Medical Affairs for Inflexxion. “Many people with migraine may have limited access to expert behavioral and lifestyle-change support, or are reluctant to seek mental health services. Health care providers who refer their patients to painACTION.com can help meet a critical public health need by making behavioral support available in a more timely way for larger numbers of migraine patients.”

Robot Hand Capable Of Grasping, Throwing Objects

Cornell University and University of Chicago engineers have developed a robotic hand that is capable of gripping objects and throwing them.
The robot hand features a “universal jamming gripper” that can quickly harden by evacuating air inside the membrane, like vacuum-packing.
If the gripper is placed over an object, such as a dart or coin, the hand tightly grasps it as the air is removed. Air is then pumped back into the hand, which loosens the robot’s grip and helps it “shoot” the objects into the air.
“Using a combination of positive and negative pressure, the gripper can rapidly grip and release a wide range of objects that are typically challenging for universal grippers, such as flat objects, soft objects, or objects with complex geometries,” the engineers said in a statement.
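The positive/negative pressure cycle described above can be sketched as a simple state machine. The class below is a hypothetical control sketch, not the Cornell/Chicago team’s actual software, and the pressure values are placeholders:

```python
# Minimal sketch of a jamming-gripper control cycle, assuming a
# hypothetical pump interface. Negative pressure (vacuum) jams the
# granular fill so the membrane hardens around the object; positive
# pressure loosens it and can "shoot" the object free.
class JammingGripper:
    def __init__(self):
        self.pressure = 0.0   # kPa relative to ambient; 0 = neutral
        self.holding = False

    def grip(self):
        """Press the membrane over the object, then evacuate the air."""
        self.pressure = -50.0   # placeholder vacuum level; jams the fill
        self.holding = True

    def release(self, throw=False):
        """Restore (or briefly overshoot) ambient pressure to let go."""
        self.pressure = 20.0 if throw else 0.0  # positive burst can eject
        self.holding = False

g = JammingGripper()
g.grip()
print(g.holding, g.pressure)   # True -50.0
g.release(throw=True)
print(g.holding, g.pressure)   # False 20.0
```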
The team said that the design is inspired by the human hand, and is a simple, low-cost machine that is still highly capable.
The engineers said the gripper would be useful in situations where a robot needs to grip or lift a variety of items it has not seen before.
“In the long term we are striving to apply jamming in a more general way to adaptive robots and structures that might reconfigure, locomote, or recover from damage,” the team said in a statement on its website.
They said specific applications for the robot hand could include military robotics and improvised explosive device (IED) defeat missions.
The team also said the robot hand could be used in consumer and service robotics in unstructured environments like the home.
“As robots move into increasingly unstructured environments (like the home), applications like sorting objects into bins or throwing away trash come to mind,” the researchers said in a statement.

Auto-injectors Could Help Treat Prolonged Seizures

New research, published in the New England Journal of Medicine, finds that quick medical intervention is critical when a person is experiencing a prolonged seizure: the seizure becomes harder to stop with each passing minute, placing the patient at risk of severe brain damage and death.
It is for this reason that paramedics are trained to administer anticonvulsive medications as soon as possible — typically giving them intravenously before arriving at the hospital.
But according to findings in the new study, a major clinical trial has shown that using an auto-injector (similar to an EpiPen) to inject drugs into the thigh is just as safe and may be more effective.
The new research was conducted as part of the Rapid Anticonvulsant Medication Prior to Arrival Trial (RAMPART), which included researchers from the University of Cincinnati and local paramedics studying status epilepticus (prolonged seizure lasting more than five minutes). The study was sponsored by the National Institutes of Health.
The researchers wanted to determine whether intramuscular injection was as safe and effective as giving medicine intravenously. The study compared how well delivery by each method stopped patients’ seizures by the time the ambulance arrived at the ER.
The trial spanned 17 cities and 79 hospitals around the country, with 4,314 paramedics treating 893 patients, ranging in age from several months to 103 years, between 2009 and 2011.
The researchers compared two medicines that are effective in controlling seizures: midazolam and lorazepam. Both are benzodiazepines, a class of sedating anticonvulsant drugs. Midazolam was a candidate for injection because it is rapidly absorbed from muscle. But lorazepam must be given by IV.
The study team found that 73 percent of patients in the midazolam group were seizure-free upon arrival at the hospital, compared to 63 percent of those who received the IV treatment. Patients treated with midazolam were also less likely to require hospitalization than those receiving lorazepam by IV.
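From the two rates reported above, one can derive the absolute risk difference and the number needed to treat, a standard epidemiological calculation, not one reported in the study itself:

```python
# Derived statistics for the RAMPART outcome above: 73% seizure-free
# with intramuscular midazolam vs. 63% with IV lorazepam.
def risk_difference(rate_treatment, rate_control):
    """Absolute difference in the proportion of good outcomes."""
    return rate_treatment - rate_control

def number_needed_to_treat(rate_treatment, rate_control):
    """Patients treated per additional seizure-free arrival."""
    return 1.0 / risk_difference(rate_treatment, rate_control)

diff = risk_difference(0.73, 0.63)         # ~0.10
nnt = number_needed_to_treat(0.73, 0.63)   # ~10 patients
print(f"risk difference ~{diff:.2f}, NNT ~{nnt:.0f}")
```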
“Patients with status epilepticus can suffer severe consequences if seizures are not stopped quickly. This study establishes that rapid intramuscular injection of an anticonvulsant drug is safe and effective,” said Walter Koroshetz, MD, deputy director of the National Institute of Neurological Disorders and Stroke (NINDS), part of the NIH, which funded the study.
“This project is a great example of the importance of community-based emergency research and the combined strength of a city’s entire health care system, when we all work together,” said J. Claude Hemphill III, MD, MAS, who led the San Francisco part of the clinical trial. Hemphill is Chief of Neurology at SFGH and co-director of the UCSF Brain & Spinal Injury Center.
“It’s much easier to give intramuscular injections than have to start an IV,” said Hemphill. “Given the results of RAMPART, it is time for every emergency medical system in the United States to move toward intramuscular injection of midazolam as a first treatment to stop seizures in the pre-hospital setting.”
Auto-injectors may someday be available for use by epilepsy patients and their family members, but more research is needed, the researchers said. Because of midazolam’s strong sedative effect, on-site medical supervision is required for the safety of the patient.
The RAMPART study was a unique form of clinical trial, eligible under the US Food and Drug Administration (FDA) requirement of “exception from informed consent.” The federal regulation was created to protect patients who are involved in research when consent is not possible because of their medical condition. RAMPART researchers held community consultation meetings prior to the study launch to get feedback.
As investigators planned the trial, they learned that the US Defense Department and the Department of Health and Human Services were already working on a midazolam auto-injector, and the study was an opportunity to confirm its effectiveness in patients with seizures.
“There was great synergy when we realized that RAMPART was studying a similar problem that was of concern to the chemical defense community. This led to a perfect collaboration between HHS and DoD,” said David Jett, PhD, program director for NIH CounterACT and NINDS. “The broader implication of RAMPART is that we now have critical information from studies in humans that a safe and effective tool may one day be available to enhance our public health preparedness. Auto-injectors provide a highly practical way to treat hundreds of people quickly during an emergency.”
“Few other areas of medicine are as time-dependent as injury to the brain. In epilepsy, even a few minutes can be important. With every minute the seizure continues, it becomes harder to stop. RAMPART offers first responders an important treatment tool that will have a meaningful impact on the lives of many people with epilepsy,” said Robert Silbergleit, MD, of the University of Michigan in Ann Arbor, lead author of the paper.
“The use of the auto-injectors could further improve the excellent care our paramedics provide to their patients every day,” said Jason McMullan, MD, RAMPART co-investigator and assistant professor of clinical emergency medicine at UC. “While the auto-injectors are not yet commercially available, this trial provides an opportunity to change the way that paramedics everywhere deliver time-critical treatment for status epilepticus and improve the potential outcomes for our patients.”

NASA Performs J-2X Powerpack Test

Engineers at NASA’s Stennis Space Center conducted an initial test of the J-2X engine powerpack Feb. 15, kicking off a series of key tests in development of the rocket engine that will carry humans deeper into space than ever before.
This test is the first of about a dozen various powerpack tests that will be conducted throughout the year at Stennis. The initial test was designed to ensure powerpack and facility control systems are functioning properly. It also marked the first step in establishing start sequencing for tests and was the first time cryogenic fuels were introduced into the powerpack to ensure the integrity of the facility and the test article in preparation for full power, longer duration testing.
The powerpack is a system of components on the top portion of the J-2X engine, including the gas generator, oxygen and fuel turbopumps, and related ducts and valves. On the full J-2X engine, the powerpack system feeds the thrust chamber system which produces engine thrust.
The J-2X is being developed by Pratt & Whitney Rocketdyne for NASA’s Marshall Space Flight Center in Huntsville, Ala. It is the first human-rated liquid oxygen and liquid hydrogen rocket engine to be developed in 40 years. The J-2X will provide upper-stage power for NASA’s Space Launch System, a new heavy-lift vehicle capable of missions beyond low-Earth orbit.
The new powerpack test series is the second for the J-2X engine. Testing of an Apollo-era powerpack at Stennis in 2008 provided critical data for development of the new, more advanced turbomachinery.

Image Caption: J-2X powerpack test lights up the night. (NASA/SSC)

Arsenic Levels In Some Organic Foods Surpass FDA Limits

Researchers from Dartmouth reported today that potentially high levels of arsenic have been found in brown rice syrup, a primary ingredient in many organic foods.

Environmental chemist Brian P. Jackson found what the Environmental Protection Agency (EPA) considers dangerous amounts of arsenic in several organic food products, including organic infant formula whose main ingredient is brown rice syrup.

Other products that include the sweetener are some cereal bars, energy bars and energy “shots” consumed by many athletes, according to the study published today in the journal Environmental Health Perspectives.

The list of products, not identified by brand name, follows recent reports about trace levels of arsenic discovered in apple juice and previous reports of the poison in rice. Researchers point out that rice is among a group of plants that are especially efficient at taking up arsenic from the soil.

Jackson explained to Makiko Kitamura of Bloomberg: “In the absence of regulations for levels of arsenic in food, I would certainly advise parents who are concerned about their children’s exposure to arsenic not to feed them formula where brown rice syrup is the main ingredient.”

Arsenic has long been recognized as a contaminant found primarily in drinking water, with dangerous levels pegged at a federal limit of 10 parts per billion. There are currently no federal thresholds for arsenic in juices or most foods.

Legislation was introduced earlier this month in the US House of Representatives calling on the Food and Drug Administration (FDA) to establish standards for arsenic and lead in fruit juices, writes Kitamura for Bloomberg.

The FDA has been sampling and testing a variety of “more conventional” rice products, including rice crackers and rice cereals, “to evaluate what the risk is and what the levels are in these products,” Siobhan DeLancey, a spokeswoman for the agency’s Center for Food Safety and Applied Nutrition, told Anne Allen of ABC News.

Depending on what the testing reveals, she said there was “a possibility” that the agency would set a threshold for arsenic levels in rice. The FDA previously set a “level of concern” of 23 parts per billion of arsenic for fruit juices, the only other food to have such a designated level.
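The two federal reference points mentioned here can be compared directly. In the sketch below, only the limits come from the article; the sample readings are hypothetical, for illustration:

```python
# Arsenic reference levels cited in the article, in parts per billion.
# Sample readings are hypothetical, for illustration only.
LIMITS_PPB = {
    "drinking water (federal limit)": 10,
    "fruit juice (FDA level of concern)": 23,
}

def exceeds(reading_ppb, limit_ppb):
    """True if a measured reading is above the given reference level."""
    return reading_ppb > limit_ppb

for name, limit in LIMITS_PPB.items():
    for reading in (5, 25):   # hypothetical readings
        print(f"{reading} ppb vs {name} ({limit} ppb): {exceeds(reading, limit)}")
```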

“The bottom line is this shows there’s a need for FDA to figure out some limits on this and put that out there,” Patty Lovera, assistant director of Food and Water Watch, a consumer advocacy group in Washington, D.C., told Allen. She said FDA needs to take a broader approach toward arsenic in what we eat, rather than going “food by food.”

Consumer Reports magazine published results of arsenic testing last month showing nearly 10 percent of juice samples from five brands exceeded federal drinking-water standards for arsenic. Most of the arsenic was inorganic arsenic, a known carcinogen, according to the study.

The potential presence of the chemical element in formula is “particularly worrisome for babies because they are especially vulnerable to arsenic’s toxic effects,” the Dartmouth researchers said.

What Color Do Flies Like Best?

Scientists have created a new type of housefly control device that has proven highly effective at killing an insect that carries as many as 100 types of germs, researchers from the University of Florida (UFL) announced Wednesday.

Flies have been known to spread diseases such as dysentery, typhoid fever and cholera, and they are often the first pests to occur in abundance when infrastructure is disrupted due to war or natural disasters, such as hurricanes or tsunamis.

The UFL researchers developed the new device after a need for effective fly control came from military personnel battling the persistent pests on the warfront.

The device, known as the Florida Fly-Baiter, is blue rather than the typical yellow of fly control devices on the commercial market. The color is the key to controlling the pesky pests: blue is far more effective than the traditional yellow, said Phil Koehler, a professor of urban entomology with UFL’s Institute of Food and Agricultural Sciences.

Koehler and Roberta Pereira, an IFAS associate researcher, worked with two UFL entomology graduates who are also in the Navy to develop the fly control device.

The team found that flies are three times more attracted to the color blue than to yellow, and they further found that yellow actually seemed to repel flies rather than lure them in.

The research was funded by the Department of Defense’s Deployed War-Fighter Protection Program, which seeks better ways to protect troops from insect-spread diseases.

The device works by luring flies with color, smell and other attractants. Once they reach the device, the flies eat poisonous bait that quickly kills them. The device doesn’t trap flies, making it more useful than other traps that need to be replaced often when flies fill up the devices.

Pereira said that more than 40,000 flies were killed with just one application during a recent test of the device. Additional applications of insecticide can be applied as necessary.

Researchers used behavioral tests to determine which color a fly was most likely attracted to. Using electroretinograms to measure the flies’ eye reaction, they found the insects responded more readily to blue.

Adding to its effectiveness, the device also includes black stripes covered with insecticide that line the outside. The stripes, tested by Hertz, mimic the dark crevices flies like to hide in.

UFL entomology graduate Joseph Diclaro, lead author of the study and device designer, said his time as a US Navy hospital corpsman in Cuba in 1991, when refugees were flooding in, prompted him to make a better fly control device.

“At the time, there were so many displaced people living very closely together, and the garbage and waste accumulated, producing tons of flies,” Diclaro said. “I remember walking out of my tent and just being covered with them.”

The device is effective in controlling house fly, phorid fly, and blow fly problems. It is now available through pest control distributors. Insecticide for the device is sold separately.

The research results are published in the current issue of the Journal of Medical Entomology.

China Pollution Costing Economy Billions Of Dollars

Despite improvements in air quality, air pollution in China costs the economy billions of dollars in health-related damages, report researchers from MIT.

The new study shows that the economic impact from ozone and particulates in the air in China has increased.

The study analyzed the costs associated with health impacts from ozone and particulate matter, which can lead to respiratory and cardiovascular diseases.

After quantifying costs from both lost labor and the increased need for health care, the researchers found that this air pollution cost the Chinese economy $112 billion in 2005, compared with $22 billion in similar damages in 1975.

“The results clearly indicate that ozone and particulate matter have substantially impacted the Chinese economy over the past 30 years,” Noelle Selin, an assistant professor of engineering systems and atmospheric chemistry at MIT, said in a press release.

The team found two main causes for the increase in pollution costs: rapid urbanization in conjunction with population growth, and higher incomes, which raised the costs associated with lost productivity.

“This suggests that conventional, static methods that neglect the cumulative impact of pollution-caused welfare damage or other market distortions substantially underestimate pollution’s health costs, particularly in fast-growing economies like China,” Kyung-Min Nam, one of the study's authors, said in a press release.

Nam said pollution led to a $64 billion loss in gross domestic product in China in 1995; static estimates from the World Bank found the loss was only $34 billion.

Selin said that the study represents a more accurate picture than other studies that have attempted to find the same associations.

“This important study confirms earlier estimates of major damages to the Chinese economy from air pollution, and in fact, finds that the damages are even greater than previously thought,” according to Kelly Sims Gallagher, an associate professor of energy and environmental policy at Tufts University's Fletcher School.

The team calculated the long-term impacts using atmospheric modeling tools and comprehensive global economic modeling.

They said these models were especially important when it came to assessing the cumulative impact of ozone, which China has only recently begun to monitor. The researchers simulated historical ozone levels using these models.

China’s particulate-matter concentrations were at least 10 to 16 times higher than the World Health Organization’s annual guidelines in the 1980s.

Despite significant improvements, concentrations in 2005 were still five times higher than what is considered safe.

According to 2007 World Health Organization estimates, these high levels of pollution have led to 656,000 premature deaths in China each year from ailments caused by indoor and outdoor air pollution.

“The study is evidence that more stringent air-pollution control measures may be warranted in China,” Gallagher said.

China is trying to respond to these health and economic problems, including a January move to reduce its carbon intensity by 17 percent by 2015.

The China Energy and Climate Project will analyze the impact of existing and proposed energy and climate policies in China on technology, energy use, the environment and economic welfare.

The study appears in the February edition of the journal Global Environmental Change.

Counterfeit Cancer Drug Circulating Throughout The US

Roche Genentech is warning doctors and patients that counterfeit vials of its cancer drug Avastin have been distributed in the U.S.

The fake drugs do not contain the key ingredient in Avastin, which is used to help treat cancers of the colon, lung, kidney and brain.

A spokeswoman for Roche Genentech said the counterfeit drug has been distributed to health care facilities in the U.S., but it's unclear how many vials are in circulation.

The company said it is working with the Food and Drug Administration (FDA) to track down the counterfeit vials and analyze their contents.

“We’re still analyzing what it is, we know it doesn’t contain the active ingredient in Avastin,” Genentech spokeswoman Charlotte Arnold said in a press release. “It’s an infused medicine and not something a patient would have in their hands, so it’s really health care providers who should be on the lookout.”

The counterfeit vials do not have “Genentech” printed on their packaging, which appears on all original packaging of the drug. In addition, genuine Avastin carries a six-digit lot number with no letters, and all the packaging text is in English.

The FDA said on Tuesday that it has contacted 19 medical practices that may have purchased the unapproved drugs from a company called Quality Specialty Products. The agency said the foreign supply company may also do business as Montana Health Care Solutions.

“FDA has requested that the medical practices stop using any remaining products from these suppliers,” the agency said in a statement.

Foreign health regulators originally alerted Genentech to the problem, and officials believe the counterfeits were imported from another country. Genuine Avastin is packaged in manufacturing facilities in South San Francisco, California.

Avastin works by choking off the blood supply that feeds tumors, and it was the first drug of its kind approved by the FDA. Data tracking firm IMS Health said the drug was the 14th best-selling drug in the U.S. in 2010.

This is not the first time counterfeit drugs have made their way into doctors' hands in the U.S. A contaminated blood thinner called heparin was connected with dozens of deaths and hundreds of allergic reactions across the U.S. in 2008. An FDA investigation concluded the drug had been intentionally contaminated with an ingredient that mimics heparin; the drug was imported from China.

About 80 percent of active ingredients used in U.S. prescription drugs are now manufactured overseas, according to congressional investigators.

Recent legislation could give the FDA the authority to inspect foreign drug imports. Separate legislation would create a mandatory barcode system to monitor the authenticity of all prescription drugs.

Childhood Leukemia Drug In Seriously Short Supply

Methotrexate, a lifesaving drug used to treat childhood leukemia and rheumatoid arthritis, is in such short supply that hospitals across the country fear supplies could be exhausted in just a few weeks, leaving thousands of children at risk, health officials say.

Despite looming fears, the US Food and Drug Administration (FDA) said this week that the shortage should ease before hospitals run out. But drug makers are giving few details about how they will find a long-term solution.

The FDA's drug shortage program associate director, Valerie Jensen, said officials are working with the three makers of Methotrexate to come up with a solution to the problem.

The drug, which cures up to 90 percent of children with acute lymphoblastic leukemia (ALL), has been in short supply for the past year-and-a-half, and in far shorter supply in recent weeks because a leading maker of the drug shut down some of its factories last year.

“This is dire,” said Jensen. “Supplies are just not meeting demand.”

Ben Venue Laboratories was one of the nation's largest suppliers of injectable preservative-free Methotrexate, but the company suspended operations at its Bedford, Ohio plant because of “significant manufacturing and quality concerns,” the company announced.

Since then, drug supplies have gradually dwindled.

“This is a crisis that I hope the FDA's hard work can help to avert,” Dr. Michael P. Link, president of the American Society of Clinical Oncology, told the New York Times. “We have worked very hard to take what was an incurable disease and make it curable for 90 percent of the cases. But if we can't get this drug anymore, that sets us back decades.”

Ben Venue said in a statement that it is working closely with the FDA to bring Methotrexate back to market as soon as possible, and understands “the urgent need” for the medication.

“Since we suspended the production of all products in November 2011, our team has been working around the clock to implement changes needed to ensure a more sustained supply of the medicines we produce, and to address the manufacturing related issues at our facility noted in recent inspections by the FDA and other global regulatory agencies,” the company said.

“Over the past three years, we have invested more than $250 million to upgrade our facilities, and continue to invest millions more in order to restore production as quickly as possible. … We are committed to doing all that we can to help seek a solution to this urgent need, and are hopeful that some of the other companies licensed to manufacture Methotrexate will be able to increase production while we work to restore manufacturing at our facilities,” it added.

“In the meantime, our inability to produce Methotrexate and other medicines critical to patient care weighs heavily on us all,” the statement concluded.

Jensen told The Associated Press (AP) that the three drug companies should be starting to ship doses of Methotrexate by the end of the month. She noted that federal regulations bar the FDA from discussing plans of specific companies, as it is considered proprietary information.

Elizabeth Raetz, a pediatric oncologist at the NYU Langone Medical Center, told Liz Szabo of USA TODAY that the Methotrexate shortage is a matter of life and death for the 3,500 kids diagnosed with ALL each year. They endure two to three years of exhausting therapies but are nearly always cured of their disease, and there is no replacement for Methotrexate; going without it, or even delaying it, could leave children vulnerable to a fatal relapse, she said.

FDA officials “have been reassuring in discussions that this is not going to be a prolonged shortage,” Dr. Peter Adamson, chairman of the Children's Oncology Group, a network of 200-plus North American hospitals treating children with cancer, told the AP. But until the drug is delivered, we can't be sure, he noted.

According to the AP’s Linda A. Johnson, multiple hospitals and cancer specialists say they still have enough of the drug to treat their current patients. But a survey of 204 oncologists in January found at least 40 percent believed that one or more patients in the past year either died prematurely or suffered a tumor recurrence because of the shortage of Methotrexate.

Though Link praised the FDA for working so fast, he said the US still needs a long-term solution to the problem. Methotrexate is only one of 286 drugs currently facing supply shortages.

“People are panicking,” Erin Fox, manager of the drug information service at the University of Utah, told Gardiner Harris of the New York Times. “There isn't a lot of hope that supplies will improve drastically over the next few weeks, which is why people are so worried.”

President Obama signed an executive order in October 2011 giving the FDA greater authority to manage drug shortages as well as counter price-gouging. The FDA has reversed 114 shortages in this manner since October 31, said FDA spokeswoman Shelly Burgess.

Sen. Amy Klobuchar, D-Minnesota, has introduced legislation to require manufacturers to report shortfalls of all medications to the FDA. “In the Senate, it's so hard to get an individual bill to pass,” she said, adding that she will keep fighting until the measure becomes law.

Today, drugmakers are required to notify the FDA of shortages only in scarce drugs for which they're the only supplier.

Currently there are five manufacturers of Methotrexate in the US, and they are trying to increase their production. The FDA is also seeking a foreign supplier to provide emergency imports until suppliers can meet demand in the US, said Jensen.

“We're working on many fronts, and will keep this a priority,” she added.

Bumblebees Learn To Take Cues From Honeybees

Bumblebees can use cues from their rivals the honeybees to learn where the best food resources are, according to new research from Queen Mary, University of London.

Writing in the journal PLoS ONE, the team from Queen Mary’s School of Biological and Chemical Sciences explain how they trained a colony of bumblebees (Bombus terrestris) to use cues provided by a different species, the honeybee (Apis mellifera), as well as cues provided by fellow bumblebees to locate food resources on artificial flowers.

They found that the bumblebees were able to learn the information from the honeybees just as efficiently as when the information came from their own species, demonstrating that social learning is not a unique process limited to members of the same species.

PhD student Erika Dawson explains: “Most social learning research has focused on learning between members of the same species. But in the same way that human engineers can pick up useful tricks from animals (such as using bird aerodynamics to design planes), animals might of course learn from different species where the best food is, where predation looms or where the best place to nest can be found.

“We wanted to determine whether animals can use any social cue to enhance their environment, even if it comes from another species that shares their habitat, resources or predators.”

The results show that information learnt from other species can be just as valuable to an animal like the bumblebee as information from their own species. Bees would have opportunities to learn cues from their own species and other species to an equal degree in the wild, as they often share the same flower species as a source of food. This is particularly true for large flowers such as sunflowers, which are often fed from by multiple pollinators simultaneously.

The results also show that competition between the two species may be much more severe than previously assumed, as Erika Dawson explains: “If bumblebees use individual exploration and copying of their fellow bumblebees to identify rewarding plants, but also use the information provided by a rival species (i.e. honeybees), this could have important ecological implications for community structure and formation, and may help us better understand the impact of competition within natural pollinator communities.”

Analyzing The Causes Of Obesity In The Romani Ethnic Group

Esther Rebato is a well-known figure in the field of Physical Anthropology. She not only holds the prestigious Aleš Hrdlička academic medal of the Czech Republic, but is also the Chair of the Spanish Association of Physical Anthropology. This lecturer at the Department of Genetics, Physical Anthropology and Animal Physiology of the University of the Basque Country (UPV/EHU) has carried out numerous studies on nutritional habits, quality of life and other related aspects. Over the last few years she has been looking at the Romani ethnic population, a group that suffers from a high rate of obesity and which has caught this researcher's attention. In fact, she has been working with them on several projects, in particular the one entitled Determinantes genéticos y ambientales de la obesidad en familias de etnia gitana de la CAV (Genetic and Environmental Determinants of Obesity in Families of the Romani Ethnic Group in the Basque Autonomous Community), funded by the former Ministry of Science and Innovation.

The project got going last year and is set to take three years (2011-2013). Rebato is the only lecturer from the UPV/EHU involved, but she has the collaboration of various PhD students, of Doctor Fernando Goñi Goicoechea, endocrinologist and consultant at the Hospital of Basurto (Bilbao), and of the Romani association Kale Dor Kayiko, which has so far put her in touch with over 50 Romani families and more than 380 individuals. Among other things, participants have been asked to take part in socio-economic and image-perception surveys, and have had their anthropometric measurements, blood pressure and saliva samples taken. “We do not study the metrics or phenotype alone; we also analyse some genes. These are confirmatory studies; we are not out to discover new genes. The only thing we want to see here is whether there is any particular variant in the Romani population,” adds Rebato.

Research and information

This piece of work has only just begun, but some data are illuminating: “Obesity is extremely prevalent. If the rate in the general population is 15-20%, in the Romani population it exceeds 50%, in men as well as in women. What is more, it is central or abdominal obesity. In men, in particular, it is very dangerous because it is a type of obesity linked to cardiovascular disease, diabetes, etc.”

Rebato argues that there is a “culture of obesity” that accounts for this phenomenon: “When populations that have not had much power gain access to food, power is displayed in more potbellied, chubbier children… Furthermore, there is a conception that men like well-built women; this seems to be a fertility symbol. The Romani families have gained access to an obesogenic culture and consume cheaper foods with a higher fat content…” All this has serious consequences. Without looking any further, some of the children of Romani families participating in the study are already hypertensive, and the state of many women points to possible diabetes or heart problems after the menopause. To this must be added the risks, mainly in men, of the abdominal type of obesity already mentioned.

In fact, Rebato and her colleagues are not limiting themselves to just doing research. As agreed with Kale Dor Kayiko, they have also undertaken to inform the families involved in the study, to warn them about bad habits they may have and to propose some changes. “We don't want to take away their culinary culture, but, for example, we tell them to cook with less fat, to walk a little more… They need to be informed so that they know how to use foodstuffs and so that they can modify their way of life without losing their essential features. And naturally, you have to explain to them that there are health inequalities. We want to inform them about what is available and then let them decide what to do,” explains the researcher.

This project still has a long way to go, but it has already begun to bear fruit in academic terms. For example, last year Rebato and her colleagues participated in the Conference of the European Society for Agricultural and Food Ethics (EurSafe) held in Bilbao, with a paper on obesity in ethnic minorities and in populations with a limited income. They have also sent an article, which is currently at the revision stage, to the journal Annals of Human Biology.  

Stem Cells Regrow Healthy Heart Muscle In Heart Attack Patients

Stem cells are proving themselves beneficial once again after scientists used the controversial building blocks to resurrect dead, scarred heart muscle damaged by recent heart attack.

Results from a Cedars-Sinai Heart Institute clinical trial show that treating heart attack patients with an infusion of their own heart-derived cells helps damaged hearts re-grow healthy heart muscle.

Reporting in The Lancet medical journal, the researchers said this is the clearest evidence yet that broken hearts can heal. All that is needed is a little help from one's own heart stem cells.

“We have been trying as doctors for centuries to find a treatment that actually reverses heart injury,” Eduardo Marban, MD, PhD, lead author of the study, told WebMD. “That is what we seem to have been able to achieve in this small number of patients. If so, this could change the nature of medicine. We could go to the root of disease and cure it instead of just work around it.”

Marban invented the “cardiosphere” culture technique used to create the stem cells and founded the company developing the treatment.

“These findings suggest that this therapeutic approach is feasible and has the potential to provide a treatment strategy for cardiac regeneration after [heart attack],” wrote University of Hong Kong researchers Chung-Wah Siu and Hung-Fat Tse in an accompanying editorial of Marban's paper.

The British Heart Foundation told James Gallagher of BBC News that this could “be great news for heart attack patients” in the future.

A heart attack occurs when the heart is starved of oxygen, such as when a clot is blocking the blood flow to the organ. As the heart heals, the dead muscle is replaced by scar tissue, which does not beat like heart muscle. This in turn reduces the heart's ability to pump blood around the body.

Doctors have long been searching for ways to regenerate damaged heart muscle, and now it seems heart stem cells are the answer. The Cedars-Sinai trial was designed to test the safety of using stem cells taken from a heart attack patient's own heart.

The researchers found that one year after receiving the treatment, scar size was reduced from 24 percent to 12 percent of the heart in patients treated with heart stem cells. Patients in the control group, who did not receive stem cells, did not experience a reduction in their heart attack scar tissues.

“While the primary goal of our study was to verify safety, we also looked for evidence that the treatment might dissolve scar and re-grow lost heart muscle,” Marban said in a statement. “This has never been accomplished before, despite a decade of cell therapy trials for patients with heart attacks. Now we have done it. The effects are substantial, and surprisingly larger in humans than they were in animal tests.”

“These results signal an approaching paradigm shift in the care of heart attack patients,” said Shlomo Melmed, MD, dean of the Cedars-Sinai medical faculty and the Helene A. and Philip E. Hixon Chair in Investigative Medicine. “In the past, all we could do was to try to minimize heart damage by promptly opening up an occluded artery. Now, this study shows there is a regenerative therapy that may actually reverse the damage caused by a heart attack.”

Marban cautioned that stem cells do not do what people generally think they do. The general idea has been that stem cells multiply over and over again, and, in time, they turn themselves and their daughter cells into new, working heart muscle.

But Marban said the stem cells are actually doing something more amazing.

“For reasons we didn't initially know, they stimulate the heart to fix itself,” he told Daniel J. DeNoon of WebMD. “The repair is from the heart itself and not from the cells we give them.”

Exactly how the stem cells invigorate the heart to do this remains a matter of “feverish research” in the lab.

The CArdiosphere-Derived aUtologous stem CElls to reverse ventricUlar dySfunction (CADUCEUS) clinical trial was part of a Phase I study approved by the US Food and Drug Administration (FDA) and supported by the National Heart, Lung, and Blood Institute.

The trial enrolled 25 volunteer patients, with an average age of 53, who had recently suffered a heart attack that left them with damaged heart muscle. Each patient underwent extensive imaging scans so doctors could pinpoint the exact location and severity of the scars. Patients were treated at Cedars-Sinai in Los Angeles and at Johns Hopkins Hospital in Baltimore.

Eight of the 25 patients served as a control group, receiving conventional medical treatment. The other 17 patients, who were randomized to receive the stem cell treatments, underwent a minimally invasive biopsy under local anesthesia. Using a catheter inserted through a vein in the neck, doctors removed a small sample of heart tissue, about half the size of a raisin. The tissue was then taken to the lab at Cedars-Sinai, where the cells were cultured and multiplied using specially developed tools.

The doctors then took the multiplied heart-derived cells — roughly 12 million to 25 million of them per patient — and reintroduced them into the patient's coronary arteries during another minimally invasive catheter procedure.

The process used in the trial was developed earlier by Marban when he was on the faculty at Johns Hopkins. Johns Hopkins has filed for a patent on the intellectual property and has licensed it to a company in which Marban has a financial interest. However, no funds from that company were used to support the clinical study. All funding was derived from the National Institutes of Health and Cedars-Sinai Medical Center.

This study followed another in which doctors reported using cells taken from the heart to heal the heart. That trial reported in November 2011 that cells could be used to heal the hearts of heart failure patients who were having heart bypass surgery.

And another trial is about to get underway in Europe, which will be the largest ever for stem cell therapy in heart attack patients.

The BAMI trial will inject 3,000 heart attack patients with stem cells taken from their bone marrow within five days of the heart attack.

Marban said that despite the heart's ability to re-grow muscle with the help of heart stem cells, the researchers found no increase in a significant measure of the heart's ability to pump: the left ventricular ejection fraction, the percentage of blood pumped out of the left ventricle.

Professor Anthony Mathur, a coordinating researcher for the upcoming BAMI trial, said that even if the Marban trial found an increase in ejection fraction then it would be the source of much debate. As it was a proof-of-concept study, with a small group of patients, “proving it is safe and feasible is all you can ask.”

“The findings would be very interesting, but obviously they need further clarification and evidence,” he told BBC News.

“It's the first time these scientists' potentially exciting work has been carried out in humans, and the results are very encouraging,” Professor Jeremy Pearson, associate medical director at the British Heart Foundation, told BBC News.

“These cells have been proven to form heart muscle in a petri dish but now they seem to be doing the same thing when injected back into the heart as part of an apparently safe procedure,” he added. “It's early days, and this research will certainly need following up, but it could be great news for heart attack patients who face the debilitating symptoms of heart failure.”

Moderate Air Pollution Linked To Stroke, Cognitive Decline

Chronic exposure to air pollution, even at levels typically considered safe by federal regulations, increases the risk of stroke by 34 percent and may accelerate cognitive decline in older adults, according to two separate studies published this week in the Archives of Internal Medicine.

In one study, researchers studied more than 1,700 stroke patients in the Boston area over a 10-year period, and found exposure to ambient fine particulate matter (PM), generally from vehicle traffic, was associated with a significantly higher risk of ischemic strokes on days when the EPA’s air quality index for PM was yellow instead of green.

The researchers focused on tiny particles with a diameter of 2.5 millionths of a meter, less than 1/30th the width of a human hair, referred to as PM2.5.

These particles come from a variety of sources, including power plants, factories, trucks and automobiles, and can travel deep into the lungs.  They have been associated in previous studies with increased numbers of hospital visits for cardiovascular diseases, including heart attacks.

“The link between increased stroke risk and these particulates can be observed within hours of exposure and are most strongly associated with pollution from local or transported traffic emissions,” said the study´s senior author, Dr. Murray Mittleman, a physician at Beth Israel Deaconess Medical Center and an Associate Professor of Medicine at Harvard Medical School.

“Any proposed changes in regulated pollution levels must consider the impact of lower levels on public health.”

“Considering that almost everyone is exposed to air pollution and is at risk for stroke, that’s actually a pretty large effect,” said lead author Gregory Wellenius, ScD, an Assistant Professor of Community Health at Brown University.

The researchers analyzed the medical records of more than 1,700 patients who went to the hospital for treatment of confirmed strokes between 1999 and 2008.

They matched the onset of stroke symptoms in each patient to hourly measurements of particulate air pollution taken at the nearby Harvard School of Public Health’s environmental monitoring station.

The researchers then abstracted data on the time of symptom onset and clinical characteristics to estimate the hour the stroke symptoms first occurred. The team included only strokes confirmed by attending neurologists, and did not rely on insurance billing codes, which can sometimes be vague.

They compared this data with Harvard’s hourly measurements of pollution within 13 miles of 90 percent of the stroke patients’ homes to allow for close matching in time of exposure and stroke onset.

“We think that this study is novel in that it has high-quality data on both air pollution exposure and stroke diagnosis,” Wellenius said.

The team was able to calculate that patients' stroke risk peaked 12 to 14 hours after exposure to air pollution.

Such information may be useful to researchers who want to trace how PM2.5 might be working in the body to increase the likelihood of stroke.

The researchers also found that black carbon and nitrogen dioxide, two pollutants associated with vehicle traffic, were closely linked with stroke risk, suggesting that pollution from cars and trucks may be particularly important.

Stroke is a leading cause of long-term disability and the third leading cause of death in the United States. An estimated 795,000 Americans suffer a new or recurrent stroke every year, resulting in more than 135,000 deaths and 829,000 hospital admissions.

The researchers estimate that reducing PM2.5 pollution by about 20 percent could have prevented 6,100 of the 184,000 stroke hospitalizations in the northeastern U.S. in 2007.

Although the researchers acknowledge their results need to be replicated in other cities, they note that Boston is considered to have relatively clean air.

“The levels of PM2.5 in Boston are lower than those seen in many other parts of the country, yet we still find that within these moderate levels the risk of stroke is higher on days with more particles in the air,” Mittleman said.

Meanwhile, a separate study by researchers at Rush University Medical Center suggests that chronic exposure to particulate air pollution may accelerate cognitive decline in older adults.

The large, prospective study found that women who were exposed to higher levels of ambient particulate matter over the long term experienced more decline in their cognitive functioning over a four-year period.

These associations were present at levels of PM exposure typical in many areas of the United States.

The researchers concluded that higher levels of long-term exposure to both coarse PM (PM2.5-10) and fine PM (PM2.5) were associated with a significant acceleration of cognitive decline.

The study is the first to examine changes in cognitive function over a period of time, and whether exposure to the size of particulate matter is important.

Jennifer Weuve, MPH, ScD, assistant professor at the Rush Institute for Healthy Aging and principal investigator of the study, and colleagues evaluated air pollution, both coarse and fine, in relation to cognitive decline in older women. They used a study population from the Nurses’ Health Study Cognitive Cohort, which included 19,409 U.S. women ages 70 to 81, over a 14-year period dating back to 1988.

“Very little is known about the role of particulate matter exposure and its association with cognitive decline,” said Weuve.

Exposure to particulate air pollution is known to be associated with cardiovascular risk, which may itself play a role in causing or accelerating cognitive decline, the researchers said.

“Unlike other factors that may be involved in dementia such as diet and physical activity, air pollution is something we can intervene on as a society at large through policy, regulation and technology,” said Weuve.

“Therefore, if our findings are confirmed in other research, air pollution reduction is a potential means for reducing the future population burden of age-related cognitive decline, and eventually, dementia,” she said.

Both studies are published this week in the Archives of Internal Medicine.

On the Net:

Antarctica Fish Threatened By Climate Change

A Yale-led study of the evolutionary history of Antarctic fish and their “anti-freeze” proteins illustrates how tens of millions of years ago a lineage of fish adapted to newly formed polar conditions — and how today they are endangered by a rapid rise in ocean temperatures.

“A rise of 2 degrees centigrade of water temperature will likely have a devastating impact on this Antarctic fish lineage, which is so well adapted to water at freezing temperatures,” said Thomas Near, associate professor of ecology and evolutionary biology and lead author of the study published online the week of Feb. 13 in the Proceedings of the National Academy of Sciences.

The successful origin and diversification into 100 species of fish, collectively called notothenioids, is a textbook case of how evolution operates. A period of rapid cooling led to mass extinction of fish acclimated to a warmer Southern Ocean. The acquisition of so-called antifreeze glycoproteins enabled notothenioids to survive in seas with frigid temperatures. As they adapted to vacant ecological niches, new species of notothenioids arose and contributed to the rich biodiversity of marine life found today in the waters of Antarctica.

Notothenioids account for the bulk of the fish diversity and are a major food source for larger predators, including penguins, toothed whales, and seals. Yale’s Peabody Museum of Natural History has one of the most important collections of these specimens in the world.

However, the new study suggests the acquisition of the antifreeze glycoproteins 22 to 42 million years ago was not the only reason for the successful adaptation of the Antarctic notothenioids. The largest radiation of notothenioid fish species into new habitats occurred at least 10 million years after the first appearance of glycoproteins, the study found.

“The evolution of antifreeze was often thought of as a ‘smoking gun,’ triggering the diversification of these fishes, but we found evidence that this adaptive radiation is not linked to a single trait, but to a combination of factors,” Near said.

This evolutionary success story is threatened by climate change that has made the Southern Ocean around Antarctica one of the fastest-warming regions on Earth. The same traits that enabled the fish to survive and thrive on a cooling Earth make them particularly susceptible to a warming one, notes Near.

“Given their strong polar adaptations and their inability to acclimate to warmer water temperatures, climate change could devastate this most interesting lineage of fish with a unique evolutionary history,” Near said.

Yale-affiliated authors of the study are Alex Dornburg, Kristen L. Kuhn, and Jillian N. Pennington.

Image Caption: The development of antifreeze glycoproteins by notothenioids, a fish family that adapted to newly formed polar conditions in the Antarctic millions of years ago, is an evolutionary success story. The three species of fish pictured are an example of the diversity this lineage achieved when it expanded into niches left by fish decimated by the cold-water environment. Now the same fish are endangered by warming of the Antarctic seas (in order: Chaenodraco wilsoni (common name: spiny icefish); Trematomus newnesi (common name: dusky rockcod); Vomeridens infuscipinnis (common name: Antarctic dragonfish)). Credit: Courtesy of Yale University

On the Net:

Research Finds Top-Heavy Flying Objects Maintain Balance Best

Contrary to popular belief, it is actually easier for top-heavy structures to maintain their balance while hovering in midair than those with lower centers of gravity, claim scientists from New York University (NYU).

According to UPI reports on Saturday, the findings counter commonly held theories that an even distribution of weight is the best way to achieve flight stability. The NYU researchers reportedly tested a series of “pyramid-shaped ‘bugs’ constructed from paper that hovered when placed in an oscillating column of air, mimicking the effect of flapping wings.”

“To see which types of structures best maintained their balance, the researchers created both top-heavy bugs with a weight above the pyramid and low center-of-mass bugs with the same weight below,” the news organization added. “They said they were surprised to find the top-heavy bugs hovered stably while those with a lower center of mass could not maintain their balance.”

The researchers recorded their experiment using high-speed video in order to analyze the nature of the airflow surrounding each of the bugs, NYU said in a press release. When a top-heavy bug tilted, they observed that the swirls of air ejected from the far side of the object would automatically adjust to keep it from tipping over. These aerodynamic forces provided stability for the bugs, the university said.

The researchers believe that they could use what they learned in their experiment in order to design and improve stable flying robots with flapping wings that can maneuver easily.

“It works somewhat like balancing a broomstick in your hand,” lead researcher Jun Zhang, a professor at NYU's Courant Institute, said, according to Trent Nouveau of TG Daily. “If it begins to fall to one side, you need to apply a force in this same direction to keep it upright.”

Joining Zhang on the project were postdoctoral researchers Bin Liu, Leif Ristroph, and Stephen Childress, all of the Courant Institute, and Annie Weathers, a former NYU undergraduate who now studies mechanics at the University of Texas, Austin. The university reports that funding for the study was provided via grants from the National Science Foundation (NSF) and the U.S. Department of Energy.

On the Net:

Is This Really A Woolly Mammoth?

A strangely out-of-focus video released by The Sun this week shows a lumbering animal walking across a river in Siberia. The video alleges the animal to be a live woolly mammoth, which would be a remarkable find, as the mammoth has been extinct for nearly 4,000 years.

The footage was taken by a government-employed road surveyor last summer in the Chukotka Autonomous Okrug region of Siberia, and is strangely reminiscent of the 40-year-old Bigfoot film: brief and fuzzy.

Many believe the video is a hoax and point to several suspicious aspects. For instance, the man who posted it, Michael Cohen, is a paranormal enthusiast who has been involved with several other videos of UFOs and other phenomena of questionable authenticity, reports Benjamin Radford of MSNBC’s Life’s Little Mysteries.

Some commenters on the video believe it is not computer-generated, but that the creature is simply a bear with a large fish in its mouth. That would explain its relatively small size for a mammoth, the shape of the “trunk” on its head, and its color.

Hollywood video-effects artist Derek Serra, who has previously analyzed faked UFO videos, told Life's Little Mysteries that the video, in his opinion, appears to have been intentionally blurred to obscure the more mundane identity of the animal.

“Even low-resolution cameras can focus fairly well on something,” Serra explains. “But there's really nothing in this video in focus. The rocks in the foreground have a blur to them that doesn't seem natural.”

The film was allegedly shot last summer, so why did the Russian engineer keep such an amazing discovery quiet for so long instead of going public with the biggest science story of the century?

Mr. Cohen defends his video by saying, “If surviving woolly mammoths were found in Siberia, it could run against Russia's plans to further develop and exploit the area's considerable resources. It would be potentially one of the greatest discoveries ever.”

On the Net:

Alzheimer’s Breakthrough: Drug Appeared To Reverse Symptoms In Mouse Trial

Scientists, long searching for a cure for Alzheimer's, are reporting a dramatic breakthrough: a drug that quickly reverses the pathological, cognitive and memory deficits in mice afflicted with the disease.

The results point to the significant potential that the drug, bexarotene, could help the more than 5 million Americans suffering from the brain disease. However, they cautioned that the study was conducted only in mice, and that much more research is needed to determine whether the medication will show positive results in humans.

Current drugs on the market only slow the progression of Alzheimer's disease. But the neuroscientists at Case Western Reserve University School of Medicine hope bexarotene, or a similar variation, will someday work in humans as well.

The researchers, reporting in the US journal Science, said mice treated with the drug became rapidly smarter and the plaque in their brains that was causing Alzheimer's started to disappear within hours.

“We were shocked and amazed,” lead author Gary Landreth, a professor in the Department of Neurosciences at Case Western Reserve University School of Medicine in Ohio, told the AFP news agency. “Things like this had never, ever been seen before.”

Landreth, explaining how bexarotene works, said the drug boosts levels of the protein Apolipoprotein E (ApoE), which then helps clear amyloid plaque buildup in the brain, a major hallmark of Alzheimer's.

“Think of this as a garbage disposal,” said Landreth. “When we are young and healthy, all of us can basically get rid of this (amyloid) and degrade it and grind it into small bits and it gets cleared.” But many of us are “unable to do this efficiently as we age. And this is associated with mental decline or cognitive impairment,” he said.

Within six hours of receiving the drug, soluble amyloid levels in the mice fell 25 percent, ultimately reaching a 75 percent drop over time. The authors found that the mice, soon after taking the drug, began performing better in tests, showing they were able to remember things again, were more social and were able to smell again, a sense that is commonly lost in Alzheimer's patients.

Within 72 hours after the treatments, the mice were able to associate paper with nests and began building again — another function lost in mice with Alzheimer´s.

“This is an unprecedented finding. Previously, the best existing treatment for Alzheimer’s disease in mice required several months to reduce plaque in the brain,” said study coauthor Paige Cramer, a PhD candidate at the university's School of Medicine.

“This is a particularly exciting and rewarding study because of the new science we have discovered and the potential promise of a therapy for Alzheimer's disease,” added Landreth. “We need to be clear; the drug works quite well in mouse models of the disease. Our next objective is to ascertain if it acts similarly in humans.”

If bexarotene is to work in humans, it might be best targeted at people in the early stages of the disease; as seen in the nest-building behavior of mice with Alzheimer's, the nests are nowhere near as good as those built by healthy mice, according to the team.

The team said clinical trials for humans are currently being designed and should produce early results in the coming year.

The US Food and Drug Administration (FDA) had previously approved bexarotene for the treatment of a rare form of cancer — cutaneous T-cell lymphoma — more than a decade ago. It was initially made by US-based Ligand Pharmaceuticals under the brand name Targretin.

Eisai Pharmaceutical from Japan bought the rights for Targretin in 2006 and it is now available through Eisai in 26 countries in Europe, North America and South America.

Scott Turner, director of the Georgetown University Medical Center’s Memory Disorders Program, who was not involved in the research, told Kerry Sheridan of AFP that he was excited by the findings. “This is a brand new way to move forward in human trials of Alzheimer’s disease and it works great with mice.”

Turner, an expert in Alzheimer's disease, cautioned, however, that more research was needed to see if the same results can be seen in humans. “One obstacle is that the mice may not be a good model of Alzheimer's disease. We have so many things that work in mice and we try them in humans and they just completely fail,” he said.

The FDA considers bexarotene to have a good safety profile, although women who are pregnant or may become pregnant are urged not to use it because of possible fetal defects. Typical side effects of the drug include diarrhea, dizziness, nausea, dry skin and trouble sleeping.

Since bexarotene is prescribed for cancer patients, there are no anecdotal reports of improved memory in humans, according to Landreth. This may be because most cancer patients do not live long enough to reach the age at which Alzheimer's usually strikes.

Alzheimer's and other forms of dementia affect more than 35 million people worldwide, with cases expected to double by 2030, according to Alzheimer's Disease International, which puts the annual global cost of the disease at $604 billion.

Landreth said funding and support for the research came with help from the Blanchette Hooker Rockefeller Foundation, the Thome Foundation, and the National Institutes of Health.

On the Net:

Teen Pregnancy Rates Down

A new study, titled “U.S. Teen Pregnancies, Births and Abortions, 2008: National Trends by Age, Race and Ethnicity,” published by the Guttmacher Institute, has found teen pregnancy to be down among all racial groups.

Teen pregnancies are at their lowest rate in 40 years, according to the latest available statistics, which date to 2008.

According to the report, in 2008 teens became pregnant at a rate of 67.8 pregnancies per 1,000 women aged 15-19, meaning around 7 percent of teenage girls were pregnant that year. This is a 42 percent decline from 1990, when the rate peaked at 116.9 per 1,000 girls.

The teen birthrate also fell 35 percent between 1991 and 2008, from 61.6 to 40.2 births per 1,000 teens. The teen abortion rate followed the same trend, declining 59 percent from 43.5 abortions per 1,000 girls in 1988 to 17.8 per 1,000 girls in 2008.

The report notes that there are also racial differences among teenage girls. Even though teen pregnancies have dropped dramatically across the board, pregnancy rates in racial minority populations are two to three times as high as those of non-Hispanic whites.

The teen pregnancy rate peaked in the early 1990s, and since then the rate has dropped by 37 percent among Hispanics, 48 percent among blacks and 50 percent among non-Hispanic whites.

The abortion rate shows a similar disparity: the researchers report that the abortion rate among Hispanics was twice that of whites, while blacks experienced four times as many abortions as whites.

Kathryn Kost, the lead author of the study, says, “The recent declines in teen pregnancy rates are great news. However, the continued inequities among racial and ethnic minorities are cause for concern. It is time to redouble our efforts to ensure that all teens have access to the information and contraceptive services they need to prevent unwanted pregnancies.”

The decline in teen birthrates has come about because more teens are practicing safer sex, especially through improved use of contraceptives. Teens may also be doubling up on protection and increasingly using the most effective contraceptives, lowering pregnancy and abortion rates in the teen population.

On the Net:

Brain Stimulation Could Boost Memory In Alzheimer’s Patients

Neuroscientists at UCLA have found a way to improve human memory by stimulating a part of the brain.

The research, published in the New England Journal of Medicine, could lead to a new method for boosting memory in Alzheimer’s patients.

During their study, the team focused on a brain site known as the entorhinal cortex, considered the doorway to the region of the brain that helps form and store memories.

“The entorhinal cortex is the golden gate to the brain’s memory mainframe,” senior author Dr. Itzhak Fried, professor of neurosurgery at the David Geffen School of Medicine at UCLA, said in a press release. “Every visual and sensory experience that we eventually commit to memory funnels through that doorway to the hippocampus. Our brain cells must send signals through this hub in order to form memories that we can later consciously recall.”

The researchers followed seven epilepsy patients who had electrodes implanted in their brains to help pinpoint the origin of their seizures.

They monitored the electrodes to record neuron activity as memories were being formed in the patients’ brains.

They then tested whether deep-brain stimulation of the entorhinal cortex or the hippocampus altered recall.

In a video game featuring a taxi cab, patients played the role of cab drivers who picked up passengers and traveled across town.

“When we stimulated the nerve fibers in the patients’ entorhinal cortex during learning, they later recognized landmarks and navigated the routes more quickly,” Fried said in a press release. “They even learned to take shortcuts, reflecting improved spatial memory.

“Critically, it was the stimulation at the gateway into the hippocampus — and not the hippocampus itself — that proved effective,” he added.

He said the use of stimulation during the learning phase suggests that patients do not need to undergo continuous stimulation to boost their memory, but only when they are learning something important.

This technique may lead the way to neuro-prosthetic devices that can switch on during specific stages of information processing or daily tasks.

“Our preliminary results provide evidence supporting a possible mechanism for enhancing memory, particularly as people age or suffer from early dementia,” Fried said in a press release. “At the same time, we studied a small sample of patients, so our results should be interpreted with caution.”

Future studies will determine whether deep-brain stimulation enhances other types of recall, like verbal and autobiographical memories.

On the Net:

Physically Active People Are More Excited, Enthusiastic

People who are more physically active report greater levels of excitement and enthusiasm than people who are less physically active, according to Penn State researchers. People also are more likely to report feelings of excitement and enthusiasm on days when they are more physically active than usual.

“You don’t have to be the fittest person who is exercising every day to receive the feel-good benefits of exercise,” said David Conroy, professor of kinesiology. “It’s a matter of taking it one day at a time, of trying to get your activity in, and then there’s this feel-good reward afterwards.”

Conroy added that it often is hard for people to commit to an exercise program because they tend to set long-term rather than short-term goals.

“When people set New Year’s resolutions, they set them up to include the entire upcoming year, but that can be really overwhelming,” he said. “Taking it one day at a time and savoring that feel-good effect at the end of the day might be one step to break it down and get those daily rewards for activity. Doing this could help people be a little more encouraged to stay active and keep up the program they started.”

The researchers asked 190 university students to keep daily diaries of their lived experiences, including free-time physical activity and sleep quantity and quality, as well as their mental states, including perceived stress and feeling states. Participants were instructed to record only those episodes of physical activity that occurred for at least 15 minutes and to note whether the physical activity was mild, moderate or vigorous. Participants returned their diaries to the researchers at the end of each day for a total of eight days. The researchers published their results in the current issue of the Journal of Sport & Exercise Psychology.

According to Amanda Hyde, kinesiology graduate student, the team separated the participants’ feeling states into four categories: pleasant-activated feelings exemplified by excitement and enthusiasm, pleasant-deactivated feelings exemplified by satisfaction and relaxation, unpleasant-activated feelings exemplified by anxiety and anger, and unpleasant-deactivated feelings exemplified by depression and sadness.

“We found that people who are more physically active have more pleasant-activated feelings than people who are less active, and we also found that people have more pleasant-activated feelings on days when they are more physically active than usual,” said Hyde, who noted that the team was able to rule out alternative explanations for the pleasant-activated feelings, such as quality of sleep.

“Our results suggest that not only are there chronic benefits of physical activity, but there are discrete benefits as well. Doing more exercise than you typically do can give you a burst of pleasant-activated feelings. So today, if you want a boost, go do some moderate-to-vigorous intensity exercise.”

Conroy added that most previous studies have looked only at pleasant or unpleasant feelings and paid less attention to the notion of activation.

“Knowing that moderate and vigorous physical activity generates a pleasant-activated feeling, rather than just a pleasant feeling, might help to explain why physical activity is so much more effective for treating depression rather than anxiety,” he said. “People dealing with anxious symptoms don’t need an increase in activation. If anything, they might want to bring it down some. In the future, we plan to look more closely at the effects of physical activity on mental health symptoms.”

Other authors on the paper include Aaron Pincus, professor of psychology, and Nilam Ram, assistant professor of human development and family studies and of psychology.

National Institute on Aging and the Penn State Social Science Research Institute funded this research.

On the Net:

Will Bubble-Powered Microrockets Zoom Through The Human Stomach?

Scientists have developed a new kind of tiny motor – which they term a “microrocket” – that can propel itself through acidic environments, such as the human stomach, without any external energy source, opening the way to a variety of medical and industrial applications. Their report in the Journal of the American Chemical Society describes the microrockets traveling at what is, for such devices, virtual warp speed: a human moving at the same relative pace would have to run at a clip of 400 miles per hour.

Joseph Wang and colleagues explain that self-propelled nano- or microscale motors could have applications in targeted drug delivery or imaging in humans, or as a way to monitor industrial processes such as semiconductor processing. However, some versions of these small-scale motors are not self-propelled and require the addition of a fuel (commonly hydrogen peroxide). Other versions cannot withstand extreme environments such as the stomach, which is very acidic. That’s why the researchers developed a new, tubular microrocket that can move itself without added fuel in very acidic conditions.

They tested the new microrocket in various acids and in acidified human blood serum. In such environments, the microrocket spontaneously produces bubbles of hydrogen gas, which propel it like the gases spewing out of a rocket’s motor nozzle. The microrocket is ultrafast: it can travel more than 100 times its 0.0004-inch length in just one second. In contrast to current devices of this kind, the microrocket’s interior is lined with zinc, which is more biocompatible and “greener” than other materials, and which generates the hydrogen bubbles. Wang’s team also developed a version with a magnetic layer, which enabled them to guide the microrockets toward cargo for pick-up, transport and release.
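The “400 miles per hour” comparison follows from scaling the microrocket's relative speed (more than 100 body lengths per second) up to human size. A quick sketch, where the 6-foot human height is an assumption, not a figure from the article:

```python
# Scaling check for the "human at 400 mph" comparison above.
# Assumption (not from the article): a person roughly 6 feet tall.
rocket_length_in = 0.0004      # microrocket length, inches (from the article)
body_lengths_per_s = 100       # reported relative speed, body lengths/second

# The rocket itself covers only ~0.04 inches per second in absolute terms.
rocket_speed_in_per_s = rocket_length_in * body_lengths_per_s
print(f"{rocket_speed_in_per_s} in/s")        # 0.04 in/s

# A 6-ft human at the same relative speed covers 600 ft/s.
human_speed_ft_per_s = 6.0 * body_lengths_per_s
human_speed_mph = human_speed_ft_per_s * 3600 / 5280  # ft/s -> miles/hour
print(f"{human_speed_mph:.0f} mph")           # 409 mph, roughly 400 mph
```

The point of the comparison is that absolute speed is tiny but relative speed is enormous, which is what matters at the microscale.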

On the Net:

Morning-After Pill Available By Vending Machine On Campus

The student health center at Shippensburg University in Pennsylvania installed a vending machine where students can get the “morning-after” pill for $25, reports Daily Mail's Jill Reilly.

It doesn't appear that any other vending machine in the US dispenses the contraceptive, which can prevent pregnancy if taken soon after sexual intercourse.

Shippensburg, a secluded public institution of 8,300 students in the Cumberland Valley, provides the Plan B One Step emergency contraceptive in the vending machine along with condoms, decongestants and pregnancy tests. On average, one dose is sold every day from the machine.

“I think it's great that the school is giving us this option,” junior Chelsea Wehking told the Associated Press (AP). “I've heard some kids say they'd be too embarrassed” to go into Shippensburg, a small town with a permanent population of about 6,000, to buy Plan B.

Federal law makes the pill available without a prescription to anyone 17 or older, and the school checked records and found that all current students are that age or older, a spokesman said. The vending machine has been in place for about two years, and its existence wasn't widely known until recently.

Dr. Roger Serr, university vice president for student affairs, said the idea for the vending machine came from the university's Student Association.

“We went out and did a survey of the student body, and we got an 85 percent response rate that students would be supportive of having Plan B in the health center,” he said to Ship News Now. “The vending machine is just a way to dispense it. It's provided, it's not necessarily promoted on a large scale,” he added.

The medical vending machine is in the school's Health Center, which is accessible only by students and university employees, school spokesman Gigliotti said in a statement. In addition, “no one can walk in off the street and go into the health center,” he said; students must check in at a lobby desk.

Plan B must be taken within 72 hours of rape, condom failure or just forgetting regular contraception and can cut the chances of pregnancy by up to 89 percent. It works best if taken within 24 hours. Some religious conservatives consider the emergency contraceptive tantamount to an abortion drug.

History professor Alexandra Stern, from the University of Michigan, questions whether making it so easily available is a good idea. “Perhaps it is personalized medicine taken too far. It's part of the general trend that drugs are available for consumers without interface with a pharmacist or doctors. This trend has serious pitfalls.”

Other universities in the state system, such as Millersville University, require students to have an appointment with campus medical staff before the pill is made available.

On the Net:

NASA Pulling Out Of Joint ESA Mars Mission

NASA has decided it will be unable to continue a joint endeavor with the European Space Agency (ESA) on the ExoMars mission.

The space agency told ESA that it will not be able to continue the mission, in which they planned to send an orbiting satellite and a rover to Mars.

NASA has not yet made a formal statement about the situation, but reports say that the decision was due to budget restraints.

“The Americans have indicated that the possibility of them participating is now low – very low. It’s highly unlikely,” Alvaro Gimenez, ESA’s director of science, told BBC. “They are interested, they know it’s a very good option for them – but they have difficulties putting these missions in the budget.”

The ExoMars mission would see a satellite launched in 2016 to look for methane and other trace gases in the Martian atmosphere.

ESA was originally concerned by NASA’s budget cuts last year, which prompted the space agency to talk with the Russian space agency (Roscosmos) about entering the ExoMars project.

If Roscosmos decides not to play a role in the mission, ESA will be forced to fall back on previous designs for a smaller rover and a launch date of 2018.

This is not the first setback NASA has faced recently as a result of budget cuts under the Obama administration.

In 2010, President Obama introduced a plan to change how NASA operates by retiring the decades-old shuttle program and shifting future U.S. space travel to private industry.

NASA’s James Webb Space Telescope also faced the chopping block after last year’s budget was announced, but recent estimates show funding may still be there.

The observatory, which is the successor to the Hubble Space Telescope, is now expected to cost $8.8 billion and could launch in 2018.

Keith Cowing of NASAwatch.com wrote that the 2013 NASA budget will see the Science Mission Directorate budget drop 50 to 60 percent. He said half a billion dollars could be pulled from the Mars exploration program due to overruns on the Webb telescope project.

On the Net:

Pass The Cake Please!

A full breakfast that includes a sweet dessert contributes to weight loss success, say TAU researchers

When it comes to diets, cookies and cake are off the menu. Now, in a surprising discovery, researchers from Tel Aviv University have found that dessert, as part of a balanced 600-calorie breakfast that also includes proteins and carbohydrates, can help dieters to lose more weight – and keep it off in the long run.

The key is to indulge in the morning, when the body’s metabolism is at its most active and we are better able to work off the extra calories throughout the day, say Prof. Daniela Jakubowicz, Dr. Julio Wainstein and Dr. Mona Boaz of Tel Aviv University’s Sackler Faculty of Medicine and the Diabetes Unit at Wolfson Medical Center, and Prof. Oren Froy of the Hebrew University of Jerusalem.

Attempting to avoid sweets entirely can create a psychological addiction to those same foods in the long term, explains Prof. Jakubowicz. Adding dessert items to breakfast can control cravings for the rest of the day. Over the course of a 32-week study, detailed in the journal Steroids, participants who added dessert to their breakfast – cookies, cake, or chocolate – lost an average of 40 lbs. more than a group that avoided such foods. What’s more, they kept off the pounds longer.

The scale tells the tale

A meal in the morning provides energy for the day’s tasks, aids in brain functioning, and kick-starts the body’s metabolism, making it crucial for weight loss and maintenance. And breakfast is the meal that most successfully regulates ghrelin, the hormone that increases hunger, explains Prof. Jakubowicz. While the level of ghrelin rises before every meal, it is suppressed most effectively at breakfast time.

Basing their study on this fact, the researchers hoped to determine whether meal time and composition impacted weight loss in the short and long term, says Prof. Jakubowicz, or if it was a simple matter of calorie count.

One hundred ninety-three clinically obese, non-diabetic adults were randomly assigned to one of two diet groups with identical caloric intake – the men consumed 1,600 calories per day and the women 1,400. However, the first group was given a low-carbohydrate diet including a small 300-calorie breakfast, while the second was given a 600-calorie breakfast high in protein and carbohydrates, always including a dessert item (e.g., chocolate).

Halfway through the study, participants in both groups had lost an average of 33 lbs. per person. But in the second half of the study, results differed drastically. The participants in the low-carbohydrate group regained an average of 22 lbs. per person, while participants in the group with the larger breakfast lost another 15 lbs. each. At the end of the 32 weeks, those who had consumed the 600-calorie breakfast had lost an average of 40 lbs. more per person than their peers.

Realistic in the long run

One of the biggest challenges that people face is keeping weight off in the long term, says Prof. Jakubowicz. Ingesting a higher proportion of our daily calories at breakfast makes sense. It's not only good for body function, but it also alleviates cravings. Highly restrictive diets that forbid desserts and carbohydrates are initially effective, but often cause dieters to stray from their food plans as a result of withdrawal-like symptoms. They wind up regaining much of the weight they lost during the diet proper.

Though they consumed the same daily amount of calories, “the participants in the low carbohydrate diet group had less satisfaction, and felt that they were not full,” she says, noting that their cravings for sugars and carbohydrates were more intense and eventually caused them to cheat on the diet plan. “But the group that consumed a bigger breakfast, including dessert, experienced few if any cravings for these foods later in the day.”

Ultimately, this shows that a diet must be realistic to be adopted as part of a new lifestyle. Curbing cravings is better than deprivation for weight loss success, Prof. Jakubowicz concludes.

On the Net:

Your Odds Of Living A Long Life Just Got Worse

Research just published by a team of demographers at the social science research organization NORC at the University of Chicago contradicts a long-held belief that the mortality rate of Americans flattens out above age 80.

It also explains why there are only half as many people in the U.S. age 100 and above as the Census Bureau predicted there would be as recently as six years ago.

The research is based on a new way of accurately measuring mortality of Americans who are 80 years of age and older, an issue that has proven remarkably elusive in the past. The work will be significant in arriving at more accurate cost projections for programs such as Social Security and Medicare, which are based in part on mortality rates.

The research, done by Leonid A. Gavrilov and Natalia S. Gavrilova, and published in the current edition of the North American Actuarial Journal, is based on highly accurate information about the date of birth and the date of death of more than nine million Americans born between 1875 and 1895. The data is publicly available in the Social Security Administration Death Master File. “It is a remarkable resource that allowed us to build what is called an extinct birth cohort that corrects or explains a number of misunderstandings about the mortality rate of our oldest citizens,” said Leonid Gavrilov.

A stark example of the problem of estimating the number of people over 100 came recently when the U.S. Census Bureau revised sharply downward the number of living centenarians. Six years ago, the bureau predicted that by 2010 there would be 114,000 people age 100 or older. The actual number turned out to be 53,364. The projection was wrong by a factor of two.
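The size of that miss can be checked directly from the article's own figures:

```python
# Census Bureau centenarian projection vs. the actual 2010 count
# (both figures as reported in the article).
predicted = 114_000  # centenarians projected for 2010, six years earlier
actual = 53_364      # centenarians actually counted in 2010

ratio = predicted / actual
print(round(ratio, 2))  # ≈ 2.14, i.e. off by roughly a factor of two
```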

The newly published paper, titled “Mortality Measurement at Advanced Ages: A Study of the Social Security Administration Death Master File,” explains the discrepancy and is likely to make a difference in the way mortality projections for the very old are done in the future.

The key finding is straightforward: the rate of mortality growth with age of the oldest Americans is the same as that for those who are younger. The research reveals that mortality deceleration – the long-held belief that the mortality rate flattens out above age 80 – does not take place.

Anne Zissu, chair of the Department of Business at NYC College of Technology/CUNY, said the research provides “an essential tool” for developing models on seniors’ financial assets.

Zissu said the research “will alter our financial approach to this valuation of mortality/longevity risk. Demographers and financiers need to work on this issue together, and their models must adapt to each other.”

The mortality rate for people between the ages of 30 and 80 follows what is called the Gompertz Law, named for Benjamin Gompertz, who observed in 1825 that a person’s risk of death in a given year doubles with every eight years of age. It is a phenomenon that holds up across nations and over time and is an important part of the foundation of actuarial science.
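As a rough illustration of the doubling rule (only the eight-year doubling comes from the article; the base hazard of 0.001 at age 30 is a made-up value chosen purely for demonstration), the Gompertz Law can be sketched as:

```python
def gompertz_hazard(age, base_rate=0.001, base_age=30, doubling_years=8):
    """Yearly risk of death under the Gompertz Law: the hazard doubles
    every `doubling_years` years of age. The base rate at age 30 is an
    illustrative assumption, not a figure from the study."""
    return base_rate * 2.0 ** ((age - base_age) / doubling_years)

# Eight more years of age double the risk; fifty more years multiply it
# by 2**(50/8) = 2**6.25, roughly 76-fold.
print(gompertz_hazard(38) / gompertz_hazard(30))  # 2.0
print(gompertz_hazard(80) / gompertz_hazard(30))  # ≈ 76.1
```

The same exponential continues, per the Gavrilovs' data, at least through age 106 rather than flattening out around 80.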

For approximately 70 years, demographers have believed that above age 80 the Gompertz Law did not hold and that mortality rates flattened out. The work done by the Gavrilovs, a husband-and-wife team, reveals that the Gompertz Law holds at least through age 106, and probably higher, but the researchers say mortality data for those older than 106 is unreliable.

The Gavrilovs say the extinct birth cohort of people born between 1875 and 1895, which they built using the Social Security Administration Death Master File, reveals beyond question that the mortality rate of people in that cohort aligns with the Gompertz Law.

“It amazes me that the Gompertz model fits so well nearly 200 years after he proposed it. I like the approach of using extinct cohorts methods on SSA DMF (Social Security Administration Death Master File) data by month and the use of male-female ratios to test the quality of the data at advanced ages,” said Tom Edwalds, Assistant Vice President, Mortality Research, for the Munich American Reassurance Company.

Prior estimates of the number of centenarians in the United States were made in less direct ways that were subject to error. They depended, for example, on people self-reporting their age in the U.S. Census, which is less reliable than having actual birth and death data.

Gavrilov and Gavrilova work at the Center on the Economics and Demography of Aging, one of the Academic Research Centers of NORC. The study is supported by the National Institute on Aging.

Printable Fibromyalgia Diagnosis Questionnaire

Below you will find a downloadable and printable questionnaire you can fill out and take with you to your doctor.

[pdf-embedder url="http://www.fibromyalgiatreating.com/wp-content/uploads/2012/02/Fibro-Dx-questionnaire.pdf" title="Fibro Dx questionnaire"]

Cape Cod Dolphin Strandings At Record High For January

Dolphin strandings reported in Massachusetts have reached record levels for the Cape Cod area.

Since early January, 129 common dolphins have been found stranded on the beaches, said Katie Moore, a marine mammal rescue and research manager for the International Fund for Animal Welfare.

She said rescuers were able to release 37 of the 54 animals recovered alive, but 75 others were found dead or had to be euthanized on the spot, bringing the death toll to 92.

It is unusual for so many animals to strand themselves at one time. The dolphins have been stranding in groups as large as 10. Dolphins are known to be very social animals, and they may be following each other to their own demise.

These strong social bonds serve the animals well in the wild but when they get into trouble they stay together. Moore told Suzanne Goldenberg of The Guardian newspaper: “That bond becomes a liability when they get into shallow water, and that may be why they mass strand.”

Other theories as to why the dolphins were swimming so close to shore include being lost, confused by changing tides, or possibly diseased.

But the pattern of the strandings does not point to a clear reason why the animals are coming ashore. Moore told Goldenberg: “In the ones we are finding alive, we are not seeing any consistent diseases or anything indicating a pattern as to why they might be stranding.” Most of the live dolphins have been reported as healthy; necropsies were performed on the dead ones, but lab results are pending.

CNN reports that beached animals are susceptible to sunburn, predators and organ damage. When a dolphin is found, volunteers roll it onto its stomach to help it breathe. The volunteers also keep seagulls away from the animals to prevent the birds from pecking at them, and cool the animals with water or warm them with blankets as needed.

The volunteers at the International Fund for Animal Welfare are fitting some of the dolphins with satellite tags so they can be tracked after release. Brian Sharp, a representative of the International Fund for Animal Welfare, told ABC News: “We release them off of beaches where it gets deep quite quickly. From all these signs that we've seen from this event, the satellite tags look very good.”

Russian Scientists Reach Ancient Antarctic Lake

After years of drilling, Russian scientists have finally broken through to a unique sub-glacial lake.

The scientists drilled 12,362 feet to reach the sub-glacial Antarctic lake, Vostok, which has been sealed for the past 20 million years.

Explorers hope the lake could reveal new forms of life, and show how life evolved before the ice age.

The discovery of the hidden lakes of Antarctica in the 1990s sparked enthusiasm from scientists across the globe.

Some believe the ice cap above the lake and at its edges has created a hydrostatic seal with the surface, preventing lake water from escaping and anything else from getting inside.

Lake Vostok is the largest of Antarctica’s hidden lakes, and is also one of the largest lakes in the world.

The lake could offer a glimpse of what conditions exist for life in similar extreme conditions on Mars and Jupiter’s moon, Europa.

Scientists believe this is the first time the lake has been exposed to air in more than 20 million years.

In order to ensure the water does not become contaminated once it is exposed to air, the scientists agreed to drill only until a sensor warned them of free water.

At that point, they used kerosene and adjusted the pressure so none of the liquids would fall into the lake, but rather lake water would rise through the hole due to pressure from below.

The lake could be 14 million years old, but the water could be just tens or hundreds of thousands of years old because water may flow between different sub-glacial lakes.

The scientists will not be able to sample the lake water until late 2012, according to a report by Scientific American. Winter is starting to bring colder temperatures, and the scientists must retreat from Antarctica while aircraft can still operate.

American and British teams are also exploring the hidden Antarctic lakes. Researchers from the British Antarctic Survey are drilling down to Lake Ellsworth later this year, while U.S. scientists hope to study the subglacial Whillans Ice Stream.

Image Caption: The surface above Lake Vostok, hidden under more than a kilometer of ice, looks like most of Antarctica's landscape: flat, barren, and icy. The best way to detect a subglacial lake is through remote sensing. Credit: M. Studinger, LDEO

Astronomers Spot Fourth Potential Habitable Planet

An international team of astronomers has discovered a potentially habitable planet orbiting a nearby star 22 light years away, reports The Telegraph.

Scientists, led by Carnegie's Guillem Anglada-Escudé and Paul Butler, found the planet orbiting one member of a triple star system, clearly within the habitable zone, where it's neither too hot nor too cold for liquid water to be present on the surface.

The host star has a different makeup than our own Sun, being relatively poor in metallic elements – a discovery that demonstrates habitable planets can form in a greater variety of environments than previously believed.

Publishing their findings in the Astrophysical Journal Letters, and also available at http://arxiv.org/archive/astro-ph, the team said the planet has a mass at least 4.5 times that of Earth and orbits its host star about once every 28 days, meaning its year equals about one Earth month.

Data from the European Southern Observatory was analyzed for the star GJ 667C, an M-class dwarf star that puts out much less heat than our Sun. The team believes at least three planets orbit close to the star, one of them close enough that it could support life, with temperatures similar to those on Earth.

“This was expected to be a rather unlikely star to host planets. Yet there they are, around a very nearby, metal-poor example of the most common type of star in our galaxy,” said UC Santa Cruz astronomer Steven Vogt. “The detection of this planet, this nearby and this soon, implies that our galaxy must be teeming with billions of potentially habitable rocky planets.”

GJ 667C had previously been observed to have a super-Earth-like planet (GJ 667Cb), but the findings were never published. That planet orbits too close to the star to host liquid water. The team was collecting data on it when they found the clear signal of the new planet (GJ 667Cc). The team said the new planet receives about 90 percent of the light that Earth receives; however, because most of its incoming light is in the infrared, more of that incoming energy should be absorbed by the planet, giving it about the same amount of energy as the Earth absorbs from the Sun.

The close-orbiting rocky planet “is the new best candidate to support liquid water and, perhaps, life as we know it,” said Anglada-Escudé, now working at the University of Göttingen in Germany.

But the theory about water will not be confirmed until astronomers learn more about the planet's atmosphere.

Some experts have been skeptical that M-class dwarf stars could have planets in the habitable zone, because the stars are dim and tend to have frequent flare activity that can deliver lethal doses of radiation to nearby planets.

But the new findings could indicate those experts are wrong. Astronomers are intrigued by the possibility that these stars could in fact host planets within the habitable zone, suitable for both water and life.

“With the advent of a new generation of instruments, researchers will be able to survey many M dwarf stars for similar planets and eventually look for spectroscopic signatures of life in one of these worlds,” said Anglada-Escudé.

At least two other planets discovered within the past year may also lie within the habitable zone. In May, French astronomers confirmed that the exoplanet Gliese 581d, about 20 light years away, was the first to meet the key requirements for supporting life. And in August, Swiss astronomers discovered planet HD 85512b, about 36 light years away, which also seemed to be within the habitable zone.

Image Caption: Artist’s impression of Gliese 667 Cb with the Gliese 667 A/B binary in the background. Credit: ESO/L. Calçada

Siblings Of Addicts Wired For Addiction

Scientists have discovered that addicts and their siblings have the same disorders in the brain, meaning both are wired for addictive behavior.
However, the siblings that do not exhibit addictive tendencies give researchers hope that addiction can be cured.
Paul Keedwell, a consultant psychiatrist from Britain's Cardiff University who was not involved in the study, told Reuters: “If we could get a handle on what makes unaffected relatives of addicts so resilient we might be able to prevent a lot of addiction from taking hold.”
Collecting data on drug addicts is difficult since they typically live on the fringes of society. The World Health Organization estimates that there are 15.3 million people globally who have drug problems and 148 countries report problems with injected drug use.
Scientists know that drug abusers have differences in their brains, but they were not sure if the drugs affected the brain or if the differences were already there before drug use.
The researchers worked around this problem by studying the brains of 50 crack addicts and their non-addict siblings and then comparing the results to the brains of other healthy people.
The scientists found that the brains of the non-addicted siblings had the same abnormalities in the part of the brain that controls behavior, the fronto-striatal systems.
According to Karen Ersche, the lead researcher, “It has long been known that not everyone who takes drugs becomes addicted, and that people at risk of drug dependence typically have deficits in self-control. Our findings now shed light on why the risk of becoming addicted to drugs is increased in people with a family history ... Parts of their brains underlying self-control abilities work less efficiently.”
The results of the research are published in the journal Science.

Taco Bell Linked To 2011 Salmonella Outbreak

The Mexican fast food chain Taco Bell has now been connected to an outbreak of salmonella that occurred in October and November of last year and affected some 68 people across 10 states, according to a report published by the U.S. Centers for Disease Control and Prevention (CDC).

While the CDC had for legal reasons initially refused to positively identify the food chain under investigation–calling it instead simply “Restaurant A” in its initial reports–Food Safety News reported on Thursday that the suspect chain was in fact Taco Bell, citing official documentation obtained from the Oklahoma State Department of Health.

Drawing on information from a January 19 CDC report, Taco Bell has said it believes that the infected food most likely originated at the supplier level and was not due to any unsafe practices in the restaurants themselves.

“We take food quality and safety very seriously,” read a press release from the company Wednesday evening.

Investigators believe that the surge of salmonella infections began in mid-October and continued through November. There were illnesses related to the food-borne bacteria reported in states as geographically dispersed as Texas, Michigan, New Mexico, Ohio, Tennessee, Nebraska, Kansas, Iowa, Missouri and Oklahoma–one of several factors that led investigators to believe that the problem occurred at the supply level before the questionable food even made it into restaurants.

No deaths were reported, but roughly a third of the affected patients had to be hospitalized, according to the CDC.

The CDC estimates that approximately one in six Americans becomes ill each year from food-borne pathogens, and that roughly 3,000 of those illnesses prove fatal.

Salmonella infections typically last four to seven days and are usually accompanied by fever, abdominal cramps, diarrhea and vomiting.

The Taco Bell brand has suffered in recent years thanks to a series of bacterial outbreaks.

In 2006, some 71 people became infected with a strain of E. coli that the CDC said originated in a batch of contaminated lettuce served by Taco Bell restaurants in the northeastern United States.

And in 2010, the Mexican restaurant was also linked to two large salmonella outbreaks that sickened over 150 people across 21 states.

As if the bacterial outbreaks weren't damaging enough to the fast-food chain's image, the 2011 outbreak came hard on the heels of a flippant but reputation-damaging lawsuit over the content of the restaurant's ground beef.

Taco Bell is a subsidiary of Yum! Brands Inc., the world's largest fast food restaurant corporation. The Fortune 500 company also licenses or operates KFC, Pizza Hut and WingStreet.

Privacy Changes For Google Services Are Ruffling Feathers

Search giant Google has announced changes to its privacy policies taking effect March 1 of this year. It is expected to fold 60 of its 70 existing product-privacy policies into one blanket policy, which users cannot opt out of.

Google will also treat any user with an account who signs into its search services, YouTube, Gmail or its other services as the same individual across those services, and it is expected to share data between them, reports Nicholas Kolakowski for eWeek.

Google is promising the change in direction is to the benefit of the user. A search for “German restaurants”, for example, would yield results not just from the web, but from Google+ posts or Gmail messages.

Privacy advocates are arguing that these are profound changes in what the public has so far expected from Google and they run roughshod over user privacy rights, all in the name of allowing the company to better compete with Facebook for advertising dollars.

Pushing back, Google is arguing that its new policy is more transparent. “Our approach to privacy has not changed,” Pablo Chavez, Google's director of public policy, argued in a Jan. 30 letter to Congress. “Google users continue to have choice and control.”

Microsoft, wasting no time in striking at Google during a rare moment of weakness, is attempting to pull customers away to its own search engine and web services.

Microsoft chief spokesman Frank Shaw, in a blog entry on the Official Microsoft Blog, said: “the changes Google announced make it harder, not easier, for people to stay in control of their own information. We take a different approach–we work to keep you safe and secure online, to give you control over your data, and to offer you the choice of saving your information on your hard drive, in the cloud, or on both.”

“If the news about Google has you feeling frustrated, or concerned, or both, we have some great, award-winning alternatives.”

Some of those alternatives include its email service Hotmail, its search engine Bing, the cloud-based Office 365 and the Internet Explorer browser. In addition, Microsoft will run ads in major newspapers this week saying the company is “Putting People First,” reports Paul McDougall for InformationWeek.

Microsoft would be remiss if it did not take this opportunity to launch opportunistic attacks against any of Google's policies or products. And like any good battle between superpowers, it then becomes a question of how Google will respond.

Same Genes Linked To Early- And Late-Onset Alzheimer’s

The same gene mutations linked to inherited, early-onset Alzheimer’s disease have been found in people with the more common late-onset form of the illness.

The discovery by researchers at Washington University School of Medicine in St. Louis may lead doctors and researchers to change the way Alzheimer’s disease is classified.

They report their findings Feb. 1 in the online journal PLoS One (Public Library of Science).

“We probably shouldn’t think of early-onset disease as inherited and late-onset as sporadic because sporadic cases and familial clustering occur in both age groups,” says senior investigator Alison M. Goate, DPhil. “I think it’s reasonable to assume that at least some cases among both early- and late-onset disease have the same causes. Our findings suggest the disease mechanism can be the same, regardless of the age at which Alzheimer’s strikes. People who get the disease at younger ages probably have more risk factors and fewer protective ones, while those who develop the disease later in life may have more protective factors, but it appears the mechanism may be the same for both.”

The researchers used next-generation DNA sequencing to analyze genes linked to dementia. They sequenced the APP (amyloid precursor protein) gene, and the PSEN1 and PSEN2 (presenilin) genes. Mutations in those genes have been identified as causes of early-onset Alzheimer’s disease. They also sequenced the MAPT (microtubule associated protein tau) gene and GRN (progranulin) gene, which have been associated with inherited forms of another illness involving memory loss called frontotemporal dementia.

“We found an increase in rare variants in the Alzheimer’s genes in families where four or more members were affected with late-onset disease,” says Goate, the Samuel and Mae S. Ludwig Professor of Genetics in Psychiatry, professor of neurology, of genetics and co-director of the Hope Center Program on Protein Aggregation and Neurodegeneration. “Changes in these genes were more common in Alzheimer’s cases with a family history of dementia, compared to normal individuals. This suggests that some of these gene variants are likely contributing to Alzheimer’s disease risk.”

The study also found mutations in the MAPT and GRN genes in some Alzheimer’s patients, suggesting they had been incorrectly diagnosed as having Alzheimer’s disease when they instead had frontotemporal dementia.

Goate and her colleagues studied the five genes in members of 440 families in which at least four individuals per family had been diagnosed with Alzheimer’s disease. They found rare variants in key Alzheimer’s-related genes in 13 percent of the samples they analyzed.

“Of those rare gene variants, we think about 5 percent likely contribute to Alzheimer’s disease,” says first author Carlos Cruchaga, PhD, assistant professor of psychiatry. “That may not seem like a lot, but so many people have the late-onset form of Alzheimer’s that even a very small percentage of patients with changes in these genes could represent very large numbers of affected individuals.”

Goate, who in 1991 was the first scientist to identify a mutation in the APP gene linked to inherited, early-onset Alzheimer’s disease, now wants to look closely at families with multiple cases of Alzheimer’s but no mutations in previously identified Alzheimer’s genes. She says it’s likely they carry mutations in genes that scientists don’t yet know about. And she believes that new sequencing techniques could speed the discovery of these genes. In fact, the researchers say a study like this would have been impossible only a few years ago.

“With next-generation sequencing technology, it’s now possible to sequence all of these genes at the same time,” Cruchaga says. “One reason we didn’t do this study until now is that 15 to 20 years ago when these genes were first identified, it would have taken years to sequence each gene individually.”

Cruchaga and Goate say the new technology and their new findings suggest that it may be worthwhile to sequence these genes in people with a strong family history of Alzheimer’s disease.

“We would like to see physicians who treat patients with late-onset disease ask detailed questions about family history,” Goate says. “I’m sure many probably do that already, but in those families with very strong histories, it’s not unreasonable to think about screening for genetic mutations.”

She says such screenings also may weed out people thought to have Alzheimer’s disease who actually have changes in genes related to frontotemporal dementia.

Both Goate and Cruchaga agree that one result of their discovery that the same genes can be connected with both early- and late-onset forms of Alzheimer’s disease may be changes in the way the disease is classified.

“It’s always been somewhat arbitrary, figuring out where early-onset ends and late-onset begins,” Goate says. “So I no longer look at early- and late-onset disease as being different illnesses. I think of them as stages along a continuum.”

Brain Capacity Limits Exponential Online Data Growth

Study of internet file sizes shows that information growth is self-limited by the human mind

Scientists have found that the capacity of the human brain to process and record information – and not economic constraints – may constitute the dominant limiting factor for the overall growth of globally stored information. These findings have just been published in an article in EPJ B by Claudius Gros and colleagues from the Institute for Theoretical Physics at Goethe University Frankfurt in Germany.

The authors first looked at the distribution of 633 public internet files by plotting the number of video, audio and image files against the size of the files. They gathered files that were produced by humans or intended for human use with the spider file search engine Findfiles.net, focusing on files hosted on domains linked from the online encyclopedia Wikipedia and the open web directory dmoz.

Assuming that economic costs for data production are proportional to the amount of data produced, these costs should drive the generation of information to grow exponentially. However, the authors found that economic costs were not, in fact, the limiting factor for data production: the absence of exponential tails in the file-size distributions supports this conclusion.

They found that underlying neurophysiological processes influence the brain's ability to handle information. For example, when a person produces an image and attributes a subjective value to it, such as a given resolution, he or she is influenced by the perceived quality of that image. The perceived amount of information gained when increasing the resolution of a low-quality image is substantially higher than when increasing the resolution of a high-quality photo by the same degree. This relation is known as the Weber-Fechner law.

The authors observed that file-size distributions obey this Weber-Fechner law. This means that the total amount of information cannot grow faster than our ability to digest or handle it.
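A minimal sketch of the Weber-Fechner idea, with hypothetical image-quality values chosen only for illustration (the scale factor and units are assumptions, not parameters from the paper):

```python
import math

def perceived_gain(old_intensity, new_intensity, k=1.0):
    """Weber-Fechner-style perceived gain: proportional to the logarithm
    of the stimulus-intensity ratio. The scale factor k is arbitrary."""
    return k * math.log(new_intensity / old_intensity)

# The same absolute increase in resolution feels much larger on a
# low-quality image than on a high-quality one.
low_quality_gain = perceived_gain(1, 2)     # log 2 ≈ 0.69
high_quality_gain = perceived_gain(10, 11)  # log 1.1 ≈ 0.10
print(low_quality_gain > high_quality_gain)  # True
```

Because perceived gains shrink logarithmically, producing ever-larger files buys less and less subjective value, which is consistent with the bounded file-size distributions the authors report.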

References

1. Gros C., Kaczor G., Marković D., (2012) Neuropsychological constraints to human data production on a global scale, European Physical Journal B (EPJ B) 85: 28, DOI 10.1140/epjb/e2011-20581-3 (http://www.springerlink.com/content/00227p270r74943m/)

Child Abuse And Neglect Cost The United States $124 Billion

Rivals the cost of other high-profile public health problems

The total estimated lifetime financial cost associated with just one year of confirmed cases of child maltreatment (physical abuse, sexual abuse, psychological abuse and neglect) is approximately $124 billion, according to a report released by the Centers for Disease Control and Prevention, published in Child Abuse and Neglect, The International Journal.

This study looked at confirmed child maltreatment cases, 1,740 fatal and 579,000 non-fatal, over a 12-month period. The lifetime cost for each victim of child maltreatment who lived was $210,012, which is comparable to other costly health conditions, such as stroke, with a lifetime cost per person estimated at $159,846, or type 2 diabetes, estimated at between $181,000 and $253,000. The costs of each death due to child maltreatment are even higher.

“No child should ever be the victim of abuse or neglect, nor do they have to be. The human and financial costs can be prevented through prevention of child maltreatment,” said Linda C. Degutis, Dr.P.H., M.S.N., director of CDC's National Center for Injury Prevention and Control.

Child maltreatment has been shown to have many negative effects on survivors, including poorer health, social and emotional difficulties, and decreased economic productivity. This CDC study found that these negative effects generate, over a survivor's lifetime, many costs that impact the nation's health care, education, criminal justice and welfare systems.
Key findings:
    The estimated average lifetime cost per victim of nonfatal child maltreatment includes:
        $32,648 in childhood health care costs
        $10,530 in adult medical costs
        $144,360 in productivity losses
        $7,728 in child welfare costs
        $6,747 in criminal justice costs
        $7,999 in special education costs
    The estimated average lifetime cost per death includes:
        $14,100 in medical costs
        $1,258,800 in productivity losses
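The nonfatal components listed above do in fact sum to the $210,012 lifetime figure cited earlier, which a quick check confirms:

```python
# Per-victim lifetime cost components for nonfatal child maltreatment,
# as listed in the CDC report above (all figures in dollars).
nonfatal_costs = {
    "childhood health care": 32_648,
    "adult medical": 10_530,
    "productivity losses": 144_360,
    "child welfare": 7_728,
    "criminal justice": 6_747,
    "special education": 7_999,
}

total = sum(nonfatal_costs.values())
print(total)  # 210012 -- the $210,012 lifetime cost per surviving victim
```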
Child maltreatment can also be linked to many emotional, behavioral, and physical health problems. Associated emotional and behavioral problems include aggression, conduct disorder, delinquency, antisocial behavior, substance abuse, intimate partner violence, teenage pregnancy, anxiety, depression, and suicide.
Past research suggests that child maltreatment is a complicated problem, and so its solutions cannot be simple. An individual parent or caregiver“²s behavior is influenced by a range inter—related factors such as how they were raised, their parenting skills, the level of stress in their life, and the living conditions in their community.  Because of this complexity, it is critical to invest in effective strategies that touch on all sectors of society. 
“Federal, state, and local public health agencies as well as policymakers must advance the awareness of the lifetime economic impact of child maltreatment and take immediate action with the same momentum and intensity dedicated to other high profile public health problems — in order to save lives, protect the public's health, and save money,” said Dr. Degutis.
Several programs have demonstrated reductions in child maltreatment and have great potential to reduce the human and economic toll on our society. Examples of effective programs include:
    Nurse-Family Partnership, an evidence-based community health program that partners a registered nurse with a first-time mother during pregnancy and continues through the child's second birthday: http://www.nursefamilypartnership.org/
    Early Start, California's response to federal legislation providing early intervention services to infants and toddlers with disabilities and their families, provides a coordinated, family-centered system of services: http://www.dds.ca.gov/earlystart/
    Triple P, a multilevel parenting and family support system that aims to prevent severe emotional and behavioral disturbances in children by promoting positive and nurturing relationships between parent and child: http://www.triplep-america.com/
The article, “The economic burden of child maltreatment in the United States and implications for prevention,” is available at http://www.sciencedirect.com/science/journal/aip/01452134.
CDC's Injury Center works to prevent injuries and violence and their adverse health consequences. For more information on public health child maltreatment prevention activities and research, please visit http://www.cdc.gov/ViolencePrevention/childmaltreatment.

On the Net:

Doctors Say Sugar Should Be Regulated Like Alcohol, Tobacco

A group of doctors has published a report claiming sugar should be regulated and taxed by the government in much the same way as tobacco and alcohol.
They claim that sugar, at the rate most Americans consume it, is more than empty calories. Sugar changes metabolism, raises blood pressure, alters hormone signaling and causes damage to the liver. These hazards resemble those of excessive alcohol consumption; alcohol, after all, is made from fermented and distilled sugar.
The researchers claim that sugar consumption worldwide has tripled over the last 50 years and is a major cause of the worldwide obesity epidemic. They argue, however, that obesity may just be a marker for the damage caused by sugar's toxic effects.
According to Claire Brindis of the University of California, San Francisco (UCSF), one of the contributors to the report, people are not going to change by themselves. In order to push people away from sugar, there has to be community-wide and environmental intervention, much like what has happened with tobacco.
In other words, the researchers agree that sugar should be taxed and regulated by the government, either local or federal, thereby increasing the price and pushing people away from sugar consumption.
Some of the regulations that would be placed on sugar would be limiting its availability in schools and workplaces, especially vending machines.
Dr. Laura Schmidt (also from UCSF), who contributed to the report, said “What we want is actually to increase people's choices by making foods that aren't loaded with sugar comparatively easier and cheaper to get.”
But some critics disagree with the report. The Sugar Association told CBS News that it disputes the science presented, namely the claim of a tripled sugar consumption rate, which it says is based on incomplete science.
Another critic, Barbara Gallani of the Food and Drink Federation in the UK, says sugar alone is not the sole cause of heart disease and other sugar-related conditions. She told the Daily Mail, “The causes of these diseases are multi-factorial and demonizing food components does not help consumers to build a realistic approach to the diet.”
The researchers published their study and commentary in the Feb. 2 issue of Nature.


Pfizer Recalls Ineffective Birth Control Pills

The pharmaceutical giant Pfizer has had to recall some one million packages of birth control pills after post-production investigations indicated that some of the packages may not contain enough contraceptive to prevent pregnancy.
Pfizer Inc. has voluntarily recalled 14 batches of their popular Lo/Ovral-28 tablets and 14 batches of Norgestrel and Ethinyl Estradiol tablets. The company reported that, due to production irregularities, some of the packages may not contain the clinically proven amount of the contraceptive required to prevent fertilization.
In an official statement made Wednesday on their website, Pfizer explained that a number of the foil blister-packets in which the medications are packaged were found to contain too many tablets while others had too few.
A Pfizer spokesman said the problem was a result of a combination of visual and mechanical inspection errors on their packaging line.
In the statement, the company also made clear that the affected pills were in no way dangerous to women's health but that there was a possibility that they might not be effective as a contraceptive.
Pfizer also advised customers who use the pills in question to immediately “begin using a non-hormonal form of contraception.”
The affected pills are marketed by Akrimax Pharmaceuticals, and by the time Pfizer noticed the potential production error, the pills had already been distributed to warehouses and pharmacies nationwide.
“As a result of this packaging error, the daily regimen for these oral contraceptives may be incorrect and could leave women without adequate contraception, and at risk for unintended pregnancy,” read the website.
Packages of hormone-regulating oral contraceptives typically contain 21 tablets with medication and seven inactive sugar tablets to regulate the menstrual period and prevent pregnancy.
The company says that the affected packets have expiration dates between July 31, 2013, and March 31, 2014.


Using Indigestion Drugs Could Bump Up Risk Of A Hip Fracture

According to a new study, post-menopausal women are 35 percent more likely to suffer a hip fracture if they take indigestion drugs, or “proton pump inhibitors” (PPIs).

These drugs are among the most commonly used medicines in the world, prescribed to treat heartburn and acid reflux.

However, PPIs can inhibit the absorption of calcium, which may explain the increased risk of fractures.

Researchers looked at the association between PPIs and hip fractures in just under 80,000 post-menopausal women over an eight-year period, from 2000 to 2008.

The team found that women who used these drugs for prolonged periods and who smoked could be up to 50 percent more likely to suffer hip fractures than women who do not smoke.

The Food and Drug Administration (FDA) issued a warning in May 2010 about the relationship between hip fractures and using indigestion drugs.

The team found that out of the 79,899 post-menopausal women in the study, 893 suffered hip fractures in total over the eight-year period. A correlation was found between the length of time PPIs were taken and the risk of fracture.

The researchers also found increased use of indigestion drugs among women: 6.7 percent of women were regularly using a PPI in 2000, jumping to 18.9 percent in 2008.

The authors wrote in a study published on bmj.com that the risk of hip fracture returned to a normal level two years after patients stopped taking the indigestion drugs.

The FDA says it hopes to revise labeling on these drugs, and the researchers stress the importance of evaluating the need for long-term use of PPIs among those with a history of smoking.


Re-Blockage Rates Low In Both Stented And Surgically-Opened Arteries

Opening blocked neck arteries with a metal stent proved as durable as opening them surgically, according to research presented at the American Stroke Association’s International Stroke Conference 2012.

Two years after the procedures, less than 7 percent of patients had developed repeat blockages (restenosis), researchers said.

“Unlike bare metal stents placed in coronary arteries, where re-blockage occurs about 20 percent of the time, we found the re-blockage rates in the carotid artery were quite small,” said Brajesh K. Lal, M.D., lead author of this analysis and associate professor of vascular surgery at the University of Maryland School of Medicine in Baltimore. “Patients and physicians can be reassured that both procedures are durable and that re-blockage rates are equivalent, so they can use different criteria to determine which procedure is right for a patient.”

The study is the largest to look at restenosis rates after either procedure.

The study participants – part of the Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) – had partial blockages in a neck artery. Symptomatic patients had experienced a non-disabling stroke or transient ischemic attack (mini-stroke) because of the blockage, while asymptomatic patients had not.

Previously, this head-to-head comparison of the two procedures showed no difference in the combined rates of stroke, heart attack or death between patients undergoing surgical removal of a blockage (carotid endarterectomy) or stenting.

About 10 percent of strokes are caused by blockages in the neck arteries, which supply blood to the brain. Revascularization procedures preserve blood flow and lower the risk of stroke.

In the current study, 1,086 patients received stenting and 1,105 endarterectomy. All were assessed at one, six, 12 and 24 months after the procedure with an ultrasound to identify those who had developed a 70 percent or greater blockage in the treated section.

After two years, the researchers found:

    Identical rates of restenosis (5.8 percent) after stenting and endarterectomy.
    Complete blockage (occlusion) in 0.3 percent after stenting and 0.5 percent after endarterectomy.
    Combined restenosis/occlusion in 6 percent after stenting and 6.3 percent after endarterectomy.
    Twenty stent patients and 23 endarterectomy patients had undergone a second procedure to open a re-blocked carotid.
    Rates of restenosis were about double in women and patients with diabetes and abnormal lipid levels.
    Stroke rates were four times higher in patients who developed restenosis than in those who did not develop restenosis during follow-up.

“These may be groups we need to focus more on by monitoring them closely and aggressively controlling risk factors after the procedures,” said Lal, who is also chief of vascular surgery at the Baltimore VA Medical Center in Maryland.

Physicians from different specialties perform revascularization procedures. In the study, results didn’t differ by specialty.

“CREST was unique in having a built-in training and credentialing process that mandated participating physicians perform 1,500 revascularization procedures before randomizing any patients,” Lal said. “These results provide hard data for the FDA and professional societies to use as they recommend a particular type or extent of training for performing these procedures.” Monitoring of CREST participants will continue through 10 years.


Device Could One Day Read Your Mind By Decoding Brain Waves

Have you ever imagined taking on the role of Spock in the popular Star Trek shows and films, using your mind-melding abilities to read the thoughts of others? Well, that could one day become a reality, in a roundabout way.
Researchers at the University of California, Berkeley and University of California, San Francisco, have demonstrated a new method to reconstruct words, based on signals from the brains of patients thinking of those words. While it is a far cry from a person being able to read another's mind, it does give plausibility to the notion that one could read someone's mind with the aid of a machine.
Reporting in the journal PLoS Biology, the UCB researchers said the method could one day help comatose and stroke patients communicate with the outside world. There have been several approaches in recent years that have suggested scientists were on track to tap into the minds of their fellow humans.
Robert Knight of UCB and Edward Chang of UCSF, senior authors of the study, said the process gives great insight into how the brain processes language. The brain breaks down words into complex patterns of electrical activity, which can be detected and translated back into an approximate version of the original sound.
Because the brain is believed to process thought similarly to how it processes sound, scientists hope the breakthrough can lead to an implant that would interpret thought into speech in people who cannot talk.
Spock could read people's minds just by placing his hand on a subject's face in a particular fashion; the technology behind this new breakthrough is far less elegant, and precarious at best.
Any device capable of reading one´s mind is a long way off because researchers would have to make the technology much more accurate than it is now and also find a way to apply sounds which the patient merely thinks of, rather than hears.
It would also require electrodes to be placed beneath the skull onto the brain itself, because no sensors exist which could detect the tiny patterns of electrical activity non-invasively. But this doesn´t mean it could not one day be feasible.
“This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig's disease and can't speak,” said Knight. “If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit.”
For the research, Knight and colleagues studied 15 epilepsy patients who were undergoing exploratory surgery to find the cause of their seizures, a process in which a series of electrodes are connected to the brain through a hole in the skull.
While the electrodes were attached, the team monitored activity in the temporal lobe — the brain's speech processing area — as patients listened to about ten minutes of conversation. By breaking down the conversation into component sounds, the team was able to create two computer models which matched distinct signals in the brain to individual sounds.
They then tested the models by playing a recording of a single word to the patients and predicting from the brain activity what word they had heard.
One of the models produced an approximation close enough that scientists could guess the word correctly 90 percent of the time.
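The paper's actual decoding models are more sophisticated, but the core idea, fitting a mapping from neural activity to sound features and then applying it to new recordings (linear stimulus reconstruction), can be sketched with synthetic data. Every array, dimension, and the linear form itself are illustrative assumptions here, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 time samples of 64 electrode signals (X)
# and 16 spectrogram bins of the heard audio (Y). A hidden linear
# mapping plus noise generates the toy data.
n_samples, n_electrodes, n_bins = 200, 64, 16
true_W = rng.normal(size=(n_electrodes, n_bins))
X = rng.normal(size=(n_samples, n_electrodes))
Y = X @ true_W + 0.1 * rng.normal(size=(n_samples, n_bins))

# Fit a linear decoder mapping neural activity to sound features.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Reconstruct" the spectrogram for held-out neural activity.
X_new = rng.normal(size=(10, n_electrodes))
Y_hat = X_new @ W
print(Y_hat.shape)  # (10, 16)
```

In the real study, the reconstructed sound features would then be compared against candidate words to pick the best match.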
“This is exciting in terms of the basic science of how the brain decodes what we hear,” Knight, director of the Helen Wills Neuroscience Institute at UCB, told The Guardian’s Ian Sample.
“The next step is to test whether we can decode a word when a person imagines it. That might sound spooky, but this could really help patients. Perhaps in 10 years it will be as common as grandmother getting a new hip,” Knight added.
Dr. Brian Pasley, the study's lead author, compared the method to a pianist who could watch a piano being played in a soundproof room and “hear” the music just by watching the movement of the keys.
“This research is based on sounds a person actually hears, but to use this for a prosthetic device, these principles would have to apply to someone who is imagining speech,” cautioned Pasley in a recent statement. “There is some evidence that perception and imagery may be pretty similar in the brain. If you can understand the relationship well enough between the brain recordings and sound, you could either synthesize the actual sound a person is thinking, or just write out the words with a type of interface device.”
Knight had his doubts that the method would actually work, but was impressed with the results. “His computational model can reproduce the sound the patient heard and you can actually recognize the word, although not at a perfect level,” Knight said of Pasley.
The ultimate goal of the study was to explore how the human brain encodes speech and determine which aspects of speech are most important for understanding.
“At some point, the brain has to extract away all that auditory information and just map it onto a word, since we can understand speech and words regardless of how they sound,” said Pasley. “The big question is, what is the most meaningful unit of speech? A syllable, a phone, a phoneme? We can test these hypotheses using the data we get from these recordings.”
Being able to read minds is a controversial subject. Ethical concerns have arisen that such technology could be used covertly or to interrogate criminals and terrorists.
But Knight said that only exists in the realm of science fiction. “To reproduce what we did, you would have to open up someone's skull and they would have to cooperate.” Making a device to help people speak will not be easy. Brain signals that encode imagined words could be harder to decipher and the device must be small and operate wirelessly. It would also prove difficult distinguishing between words a person wants to speak and thoughts they wish to keep secret.
Jan Schnupp, professor of neuroscience at Oxford University, who thought the breakthrough was “remarkable,” said that fears of mind-reading devices being used to eavesdrop on the privacy of our minds are unjustified.
The scientists could only get their technique to work because of cooperation from patients who were willing to participate. You aren´t going to get willing parties in an interrogation setting, he noted.
“We can rest assured that our skulls will remain an impenetrable barrier for any would-be technological mind hacker for any foreseeable future,” Schnupp told Sample of The Guardian.
But the benefits of such devices could be transformative, said Mindy McCumber, a speech-language pathologist at Florida Hospital in Orlando.
“As a therapist, I can see potential implications for the restoration of communication for a wide range of disorders,” she told Jason Palmer of BBC News. “The development of direct neuro-control over virtual or physical devices would revolutionize ‘augmentative and alternative communication’, and improve quality of life immensely for those who suffer from impaired communication skills or means.”
The report is accompanied by an interview with the authors for the PLoS Biology Podcast.

Image 2: An X-ray CT scan of the head of one of the volunteers, showing electrodes distributed over the brain's temporal lobe, where sounds are processed. Credit: Adeen Flinker, UC Berkeley


Critical Habitat And Corridors For World’s Rarest Gorilla

Protection of forest habitat could support larger population of Cross River gorillas

Conservationists working in Central Africa to save the world’s rarest gorilla have good news: the Cross River gorilla has more suitable habitat than previously thought, including vital corridors that, if protected, can help the great apes move between sites in search of mates, according to the North Carolina Zoo, the Wildlife Conservation Society, and other groups.

The newly published habitat analysis, which used a combination of satellite imagery and on-the-ground survey work, will help guide future management decisions for Cross River gorillas living in the mountainous border region between Nigeria and Cameroon.

The study appears in the online edition of the journal Oryx. The authors include: Richard A. Bergl of the North Carolina Zoo; Ymke Warren (deceased), Aaron Nicholas, Andrew Dunn, Inaoyom Imong, and Jacqueline L. Sunderland-Groves of the Wildlife Conservation Society; and John F. Oates of Hunter College, CUNY.

“We’re pleased with our results, which have helped us to identify both new locations where the gorillas live and apparently unoccupied areas of potential gorilla habitat,” said Dr. Bergl of the North Carolina Zoo, lead author of the study. “The study is a great example of how scientific research can be directly applied to great ape conservation.”

WCS conservationist and co-author Andrew Dunn said: “The good news for Cross River gorillas is that they still have plenty of habitat in which to expand, provided that steps are taken to minimize threats to the population.”

Using high-resolution satellite images, the research team mapped the distribution of forest and other land-cover types in the Cross River region. In order to ground-truth the land-cover map, field researchers traveled to more than 400 control points to confirm its accuracy. They found that the land-cover rating system had an accuracy rate of 90 percent or higher. The land-cover map was combined with other environmental data to determine the extent of the Cross River gorilla’s habitat. The entire Cross River region was divided into 30 x 30 meter pixels, and each pixel was rated in terms of its suitability as gorilla habitat (with steep, forested areas of low human activity receiving a high rating, and lowland areas more significantly impacted by people receiving a low rating). These ratings were translated into a habitat suitability map for the area.
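The per-pixel rating step described above can be pictured as a simple scoring rule over environmental layers. The weights, thresholds, and variable names below are invented for illustration and are not the study's actual model:

```python
import numpy as np

# Toy per-pixel habitat suitability: steep, forested pixels with low
# human activity score high; flat, deforested, heavily used pixels
# score low. Weights and the 45-degree slope cap are assumptions.
def suitability(slope_deg, forest_cover, human_activity):
    return (
        0.4 * np.clip(slope_deg / 45.0, 0.0, 1.0)  # steeper is better
        + 0.4 * forest_cover                        # fraction 0..1
        + 0.2 * (1.0 - human_activity)              # fraction 0..1
    )

# A tiny 2x2 "map" of 30 m pixels (left column: rugged forest;
# right column: farmed lowland).
slope = np.array([[40.0, 5.0], [30.0, 2.0]])
forest = np.array([[0.9, 0.2], [0.8, 0.1]])
human = np.array([[0.1, 0.9], [0.2, 0.95]])

scores = suitability(slope, forest, human)
print(scores.round(2))
```

In the study, pixels scored this way across the whole region formed the suitability map that guided the choice of the 12 field-survey sites.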

With the new habitat suitability map to guide them, the team then selected 12 locations possessing all the characteristics of gorilla habitat (mainly forested landscapes far from human settlements) for field surveys. Most of these areas had no previous record of gorillas, but to their surprise, the team found signs of gorilla presence (in the form of gorilla dung and nests) in 10 of the 12 sites, thereby confirming the value of using satellite image analysis to predict suitable habitat and to prioritize areas in which to conduct further surveys.

Overall, the findings of the study represent a significant expansion of known Cross River gorilla range. The area now known to be occupied by gorillas is more than 50 percent larger than had previously been documented. The findings also support recent genetic analyses that suggest a high degree of connectivity between the 11 known locations where gorillas occur.

The study also identified parts of the population threatened with isolation through habitat fragmentation. For example, Afi Mountain Wildlife Sanctuary in Nigeria, which contains a significant portion of the Cross River gorilla population, is only tenuously connected to the nearest sub-population of gorillas, cut off in places by farmland and other forms of habitat degradation.

“For small populations such as this one, the maintenance of connective corridors is crucial for their long term survival,” said WCS researcher Inaoyom Imong. “The analysis is the first step in devising ways to rehabilitate degraded pathways.”

Authors of the study will use their findings at the upcoming Cross River gorilla workshop (scheduled for February in Limbe, Cameroon) to help formulate a new 5-year regional plan for the subspecies. “This latest research has greatly expanded our knowledge on Cross River gorilla distribution, which will lead to more effective management decisions,” said WCS conservationist and co-author Aaron Nicholas.

Dr. James Deutsch, Executive Director for WCS’s Africa Program, said: “Accurately assessing the state of available habitat is a vital foundation for future conservation efforts for the Cross River gorilla. A new action plan for the subspecies will build on the collaborative partnership already underway between Nigeria and Cameroon and ensure a future for this unique primate.”

The Cross River gorilla is the rarest of the four subspecies of gorilla, numbering fewer than 300 individuals across its entire range, limited to the forested mountainous terrain on the border region of Nigeria and Cameroon. The subspecies is listed as “Critically Endangered” and is threatened by both habitat disturbance and hunting, as the entire population lives in a region of high human population density and heavy natural resource exploitation.

Conservation work on Cross River gorillas in this region is a priority for several U.S. government agencies, including the U.S. Agency for International Development, U.S. Fish and Wildlife Service, and U.S. Forest Service.

The study was made possible through the generous support of: the Arcus Foundation; Great Ape Conservation Fund; KfW (German Development Bank); Lincoln Park Zoo; National Geographic Conservation Trust; Primate Conservation Inc.; and U.S. Fish and Wildlife Service.

Image Caption: The Cross River gorilla, the most endangered great ape in Africa, is seen here in Cameroon’s Limbe Wildlife Center. Images of wild Cross River gorillas are rare, due to the rugged terrain in which they exist and the great ape’s elusive behavior. Credit: Nicky Lankester


Human Activity, Not Solar Activity, Drives Global Warming

A new NASA study underscores the fact that greenhouse gases generated by human activity — not changes in solar activity — are the primary force driving global warming.

The study offers an updated calculation of the Earth’s energy imbalance, the difference between the amount of solar energy absorbed by Earth’s surface and the amount returned to space as heat. The researchers’ calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.

James Hansen, director of NASA’s Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.

Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of the Earth’s atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun’s magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.

Pinpointing the magnitude of Earth’s energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen’s team concluded that Earth has absorbed more than half a watt more solar energy per square meter than it let off throughout the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as much as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).

“The fact that we still see a positive imbalance despite the prolonged solar minimum isn’t a surprise given what we’ve learned about the climate system, but it’s worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming,” Hansen said.

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.

Climate scientists have been refining calculations of the Earth’s energy imbalance for many years, but this newest estimate is an improvement over previous attempts because the scientists had access to better measurements of ocean temperature than researchers have had in the past.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

Hansen’s analysis of the information collected by Argo, along with other ground-based and satellite data, show the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.
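The figures quoted in the last few paragraphs are internally consistent and easy to check: the imbalance is indeed "more than twice" the solar dip, and the stated shares of where the excess energy went sum to 100 percent.

```python
imbalance = 0.58   # W/m^2, excess energy absorbed by Earth
solar_dip = 0.25   # W/m^2, max-to-min change in solar irradiance

# Ratio of imbalance to the solar reduction: more than 2x.
print(imbalance / solar_dip)  # 2.32

# Where the excess energy went (percent of total, from the study).
partition = {
    "upper ocean": 71,
    "Southern Ocean": 12,
    "abyssal ocean": 5,
    "ice": 8,
    "land": 4,
}
print(sum(partition.values()))  # 100
```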

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which, along with greenhouse gases and solar irradiance, are critical factors in energy imbalance calculations.

“Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect,” Hansen said.

Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.

“Unfortunately, aerosols remain poorly measured from space,” said Michael Mishchenko, a scientist also based at GISS and the project scientist for Glory, a satellite mission designed to measure aerosols in unprecedented detail that was lost after a launch failure in early 2011. “We must have a much better understanding of the global distribution of detailed aerosol properties in order to perfect calculations of Earth’s energy imbalance,” said Mishchenko.

Image 1: A prolonged solar minimum left the sun’s surface nearly free of sunspots and accompanying bright areas called faculae between 2005 and 2010. Total solar irradiance declined slightly as a result, but the Earth continued to absorb more energy than it emitted throughout the minimum. Credit: NASA Goddard’s Scientific Visualization Studio

Image 2: A graph of the sun’s total solar irradiance shows that in recent years irradiance dipped to the lowest levels recorded during the satellite era. The resulting reduction in the amount of solar energy available to affect Earth’s climate was about 0.25 watts per square meter, less than half of Earth’s total energy imbalance. (Credit: NASA/James Hansen)

Image 3: Data collected by Argo floats, such as this one, helped Hansen’s team improve the calculation of Earth’s energy imbalance. Credit: Argo Project Office

Image 4: A chart shows the global reach of the network of Argo floats. (Credit: Argo Project Office)


Normal Weight Doctors More Effective in Helping Overweight Patients

According to a recent national survey by a leading medical research center, a doctor's own weight may significantly influence how effectively he or she is able to care for and diagnose patients suffering from obesity.

The study was carried out by researchers at the world-renowned Johns Hopkins Bloomberg School of Public Health and Medical Institutions in Baltimore and published online this week in the medical journal Obesity.

They found that physicians with a high body mass index (BMI) were almost half as likely as doctors with normal BMIs to discuss weight-management issues with overweight and obese patients (18% versus 30%, respectively).

The study's lead researcher, Dr. Sara Bleich, says they also found that doctors with normal BMIs felt more confident in their ability to provide weight-loss advice to patients than their overweight colleagues did (53% versus 37%, respectively).

Because primary care doctors in particular can be highly influential in intervening and counseling patients struggling with their weight, the study's authors believe that a physician's BMI may be a critical factor in a patient's obesity care.

Bleich's team looked at a cross section of 500 general internists, general practitioners and family practice doctors, recruiting physicians from a pool of volunteers through the Epocrates Honors panel, which has a membership of some 145,000 AMA-certified doctors.

Each participating physician was asked to complete a questionnaire on diagnosing obesity and weight-management strategies. They were also asked to score their own effectiveness in helping patients with weight loss, including initiating counseling and prescribing weight-loss medications.

The study also examined the doctors' perspectives on the importance of modeling healthy behavior and whether they believed that a patient's trust in the soundness of their doctor's advice was connected with the doctor's weight.

The results indicated that doctors with normal BMIs were more likely than their overweight counterparts to believe that physicians have a responsibility to serve as role models for their patients in terms of maintaining healthy weight (73% versus 57%, respectively).

A similar correlation was also observed regarding how the participants thought their patients perceived the relationship between a doctor's weight and his or her competence in advising them. Some 79% of the normal-weight responders believed that patients were more likely to trust the advice of a non-overweight doctor, compared with 69% of the overweight or obese doctors.

The largest discrepancy came in terms of how the MDs' perception of their own weight affected their diagnoses and treatment recommendations for overweight patients. Of the surveyed doctors, 93% reported that they were more likely to diagnose a patient as obese if they perceived that patient's body weight as being greater than their own, compared to 7% who said they wouldn't. Approximately the same ratio held true for whether or not they would start a conversation about weight loss with a patient they thought heavier than themselves.

While the study provides researchers with some initial orientation and insight into a largely unexplored field of weight-loss studies, the authors were quick to point out their study has a number of critical weaknesses and cannot yet be treated as medical gospel.

For one thing, the study was set up to show possible correlation only and cannot be held up as causal evidence that skinnier doctors are more effective in helping patients lose weight. Moreover, because a number of the questions relied not on the physicians' actual BMI but rather on their self-image–the accuracy of which can vary greatly from one responder to the next–exact relationships cannot be objectively determined.

Nonetheless, the team stated, some general advice can still be gleaned from the survey.

“Physician self-efficacy to care for obese patients–regardless of their BMI–may be improved by targeting physician well being and enhancing the quality of obesity-related training in medical school, residency or continuing medical education,” their report concluded.

Treasure Hunters Make Bizarre Find At Bottom Of Baltic Sea

A team of professional Swedish treasure hunters has made a remarkable find at the bottom of the Baltic Sea. What exactly it is, however, will remain a mystery for the immediate future.

At an undisclosed location some 250 feet below the brackish waters between Sweden and Finland, the deep-sea salvage company Ocean Explorer has discovered a large, bizarrely shaped object on the seabed.

“I have been doing this for nearly 20 years, so I have seen a few objects on the bottom, but nothing like this,” the crew's team leader Peter Lindberg told Brooke Bowman of CNN.

“We had been out for nine days and we were quite tired and we were on our way home, but we made a final run with a sonar fish and suddenly this thing turned up.”

At first glance, said Lindberg, the team joked that they had found a UFO. However, upon closer inspection using a device known as a side-scan sonar, the joke no longer seemed quite so far-fetched.

What they found appears to be a disc-shaped object roughly 180 feet in diameter with a rigid tail that extends another 1,200 or so feet.

And what's more, when the team turned back to make another pass and get a closer look, they found another similarly shaped object some 600 feet away.

Lindberg's crew says the object is too large to be part of a shipwreck and admits to being utterly stumped as to what the mysterious object could be.

Not surprisingly, this has led to wild speculation and a number of theories that border on the absurd.

“We've heard lots of different kinds of explanations, from George Lucas's spaceship–the Millennium Falcon–to 'it's some kind of plug to the inner world,' like it should be hell down there or something,” Lindberg told CNN and the Daily Mail Online.

And there´s only one way to find out for sure, he added.

Unfortunately, the team will have to wait for tamer waters before they're able to make an exploratory dive to clear up the mystery once and for all. That could mean weeks or even months of nerve-racking waiting.

While a handful of adventurous salvage crews have found troves of treasure–both historical and literal–at the bottom of the world's oceans, more often than not such searches result in disappointment and bankruptcy.

Andreas Olsson, chief of the archeological division at Sweden’s Maritime Museums, says that he isn't getting his hopes up too high about Lindberg's find. While he hopes for the best, maritime treasure-hunting simply doesn't have much of a record of success.

“If you want to stand in a cold shower tearing up £50 notes, go shipwreck hunting,” he told CNN.

“Most shipwrecks are rotting away, or carrying dull things–all the romance has been taken out of it.”

And Lindberg himself doesn't entirely disagree. He explained that to have success in the business, you need patience, luck and a lot of financing–and even that doesn't always work.

“It's a very difficult industry to be in–it's money all the time,” he confessed.

And while he does appreciate the historical element of his profession, Lindberg says, make no mistake: he's in it for the money.

“The best thing [our find] could be, would be 60 meters of gold–then I would be very happy.”

Yet even if it doesn't turn out to be a gold-laden Millennium Falcon from Star Wars, Lindberg's team is already working on a plan to earn back some of the time and money they've invested in the search.

They say that if it's something really interesting–whether an archeological object or some kind of natural anomaly–they'll try to make a sort of underwater tourist attraction out of it, taking investors and curious tourists for an up-close and personal look at the marine oddity for a hefty fee.

Milk Is Good For Your Brain

New research finds milk drinkers scored better on memory and brain function tests

Pouring at least one glass of milk each day could not only boost your intake of much-needed key nutrients, but it could also positively impact your brain and mental performance, according to a recent study in the International Dairy Journal. Researchers found that adults with higher intakes of milk and milk products scored significantly higher on memory and other brain function tests than those who drank little to no milk. Milk drinkers were five times less likely to “fail” the tests than non-milk drinkers.

Researchers at the University of Maine put more than 900 men and women ages 23 to 98 through a series of brain tests — including visual-spatial, verbal and working memory tests — and tracked the milk consumption habits of the participants. In the series of eight different measures of mental performance, regardless of age and through all tests, those who drank at least one glass of milk each day had an advantage. The highest scores for all eight outcomes were observed for those with the highest intakes of milk and milk products compared to those with low and infrequent milk intakes. The benefits persisted even after controlling for other factors that can affect brain health, including cardiovascular health and other lifestyle and diet factors. In fact, milk drinkers tended to have healthier diets overall, but there was something about milk intake specifically that offered the brain health advantage, according to the researchers.

In addition to the many established health benefits of milk from bone health to cardiovascular health, the potential to stave off mental decline may represent a novel benefit with great potential to impact the aging population. While more research is needed, the scientists suggest some of milk’s nutrients may have a direct effect on brain function and that “easily implemented lifestyle changes that individuals can make present an opportunity to slow or prevent neuropsychological dysfunction.”

New and emerging brain health benefits are just one more reason to start each day with lowfat or fat free milk. Whether in a latte, in a smoothie, on your favorite cereal, or straight from the glass, milk can be a key part of a healthy breakfast that helps set you up for a successful day. The 2010 Dietary Guidelines for Americans recommend three glasses of lowfat or fat free milk daily for adults, and each 8-ounce glass contains nine essential nutrients Americans need, including calcium and vitamin D.

References: Crichton GE, Elias MF, Dore GA, Robbins MA. Relation between dairy food intake and cognitive function: The Maine-Syracuse Longitudinal Study. International Dairy Journal. 2012; 22:15-23.

Electronic Tattoo Monitors Brain, Heart and Muscles

Elastic electronics offer less invasive, more convenient medical treatment

Imagine if there were electronics able to prevent epileptic seizures before they happen. Or electronics that could be placed on the surface of a beating heart to monitor its functions. The problem is that such devices are a tough fit. Body tissue is soft and pliable while conventional circuits can be hard and brittle–at least until now.

“We’re trying to bridge that gap, from silicon, wafer-based electronics to biological, ‘tissue-like’ electronics, to really blur the distinction between electronics and the body,” says materials scientist John Rogers at the University of Illinois Urbana-Champaign.

With support from the National Science Foundation (NSF), he’s developing elastic electronics. The innovation builds upon years of collaboration between Rogers and Northwestern University engineer Yonggang Huang, who had earlier partnered with Rogers to develop flexible electronics for hemispherical camera sensors and other devices that conform to complex shapes.

This is circuitry with a real twist that’s able to monitor and deliver electrical impulses into living tissue. Elastic electronics are made of tiny, wavy silicon structures containing circuits that are thinner than a human hair, and bend and stretch with the body. “As the skin moves and deforms, the circuit can follow those deformations in a completely noninvasive way,” says Rogers. He hopes elastic electronics will open a door to a whole range of what he calls “bio-integrated” medical devices.

One example is what Rogers calls an “electronic sock”–in this case, elastic electronics are wrapped around a model of a rabbit heart like a stocking. “It’s designed to accommodate the motion of the heart but at the same time keep active electronics in contact with the tissue,” explains Rogers.

Using animal models, Rogers has developed a version of the sock that can inject current into the heart tissue to detect and stop certain forms of arrhythmia.

Rogers also demonstrates prototypes of a catheter that can be inserted through the arteries and into the chambers of the heart to map electrical activity and provide similar types of therapies.

He believes that one day this technology will lead to devices like an implantable circuit that diagnoses and perhaps even treats seizures by injecting current into the brain.

The device might detect differences in brainwave activity that occur just before a seizure sets in, and could automatically counteract any electrical abnormalities. Prototypes of the circuits are being tested that can detect muscle movement, heart activity and brain waves just by being placed on the surface of the skin like temporary tattoos. The prototypes can detect the body’s electrical activity nearly as well as the conventional, rigid electrode devices currently in use.
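The seizure-warning idea described above boils down to flagging electrical activity that deviates sharply from a recent baseline. As a purely illustrative sketch–not the team's actual method, and with made-up numbers and a hypothetical `detect_anomalies` helper–a simple rolling z-score detector over a sampled signal might look like this:

```python
import statistics

def detect_anomalies(signal, window=8, z_thresh=3.0):
    """Flag samples whose amplitude deviates sharply from the
    recent rolling baseline -- a crude stand-in for spotting
    abnormal electrical activity in a monitored signal."""
    flags = []
    for i in range(window, len(signal)):
        baseline = signal[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline)
        # Flag the sample if it sits more than z_thresh standard
        # deviations away from the rolling baseline.
        if stdev > 0 and abs(signal[i] - mean) > z_thresh * stdev:
            flags.append(i)
    return flags

# A steady trace with one sudden spike at index 12.
trace = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 9.0]
print(detect_anomalies(trace))  # flags the spike: [12]
```

A real device would of course work on continuous multi-channel data with clinically validated criteria; the point here is only the shape of the logic–compare each new sample against what the signal has been doing recently, and act when the deviation crosses a threshold.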
