Melting Mantle Linked To Great Oxygenation Event 2.5 Billion Years Ago

Brett Smith for RedOrbit.com

Oxygen-based life evolved on Earth because of geological events that occurred over 2.5 billion years ago, according to Princeton University researchers who published a report this week in the online journal Nature.

Based on geological evidence, scientists know that roughly 2.5 billion years ago, oxygen levels in the atmosphere exploded and eventually gave birth to our present atmosphere. This time period, dubbed the Great Oxygenation Event (GOE), appears to have coincided with a sharp drop in the melting of the Earth's mantle, the Princeton research team said.

The study's authors, C. Brenhin Keller and Blair Schoene, applied statistical analysis to a database of 70,000 geological samples and reconstructed 4 billion years of geochemical evolution.

“Research in this area has been largely qualitative, but with this much data, we can pick up finer features in the geologic record, particularly a level of detail related to this sudden change 2.5 billion years ago that people had not seen with such clarity before,” said Keller, the lead author and a doctoral student at the university.

Analysis of the geological data showed a dramatic change in the makeup of Earth’s magma at the end of the Archean eon, which lasted from 4 to 2.5 billion years ago. Keller and Schoene focused on changes in the chemical composition of basalt, a byproduct of melting magma in the Earth’s mantle, because when melting in the mantle is high, basalt contains greater concentrations of elements that are ordinarily found deep in the mantle, according to Keller. Less intense melting results in basalt with a higher content of incompatible elements that are found closer to the Earth’s surface.
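The large-scale analysis described above, averaging element concentrations in dated basalt samples to expose trends through geologic time, can be illustrated with a short sketch. The sample values, bin width, and element ratio below are invented for illustration; the actual study used a far larger database and a more sophisticated weighted resampling scheme:

```python
import random
import statistics

# Hypothetical (age in Ma, element-concentration ratio) pairs for basalt
# samples; real geochemical databases hold tens of thousands of analyses.
random.seed(0)
samples = [(random.uniform(0, 4000), random.uniform(0.5, 2.0)) for _ in range(5000)]

def binned_means(samples, bin_width_ma=500):
    """Average the ratio within fixed age bins to expose secular trends."""
    bins = {}
    for age, ratio in samples:
        bins.setdefault(int(age // bin_width_ma), []).append(ratio)
    return {b * bin_width_ma: statistics.mean(v) for b, v in sorted(bins.items())}

record = binned_means(samples)
for age, mean_ratio in record.items():
    print(f"{age}-{age + 500} Ma: mean ratio = {mean_ratio:.2f}")
```

A sharp step between adjacent bins, like the one the researchers report at 2.5 billion years, would show up as a jump in these binned means.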

Based on these observations, they determined that magma was being formed at greater depths during this time and that diminished melting in the mantle decreased the depth of melting in the Earth’s crust, thereby reducing the output of volcanic gases that react with and remove oxygen from the atmosphere.

“The perspective behind past efforts to connect geologic processes to the Great Oxygenation Event has been hypothetical, saying that ‘If the Earth had been X, there would have been reaction Y,'” Schoene said.

“But these ideas cannot be tested experimentally because they are largely notional. In our paper, we have the evidence to say, ‘The Earth was like this,’ and then propose a hypothesis that can be tested by examining the same rich database of mantle and deep-crust changes we used in our work.”

A decrease in the production of oxygen-reactive gases would allow oxygen to accumulate in the atmosphere through the photosynthetic activity of organisms that were already alive before the GOE. Exactly what mechanism drove up atmospheric oxygen levels is difficult to know – even recent fluctuations are difficult to gauge, according to climate scientists. However, the scenario painted by the Princeton study shows that a drop in volcanic activity was a precursor to the GOE.

“In science, it is becoming increasingly obvious that seemingly different systems act together and the question is how,” Schoene said. “Overall, this analysis strengthens emerging arguments that interaction between the solid Earth and biosphere are very intimate and important.”

“This is strong evidence of how biological and geological systems might work together, and it suggests that important planetary change is not simply the result of life dragging the rest of the planet along.”

Report Forecasts End To The ‘One-Size-Fits-All’ Cloud

Enid Burns for RedOrbit.com

There are several different cloud types, such as cumulus, stratus, cirrus and altostratus; so too there should be different shapes and flavors of cloud computing. Those are the findings of a report, “Negotiating Cloud Contracts – Looking at Clouds from Both Sides,” released by researchers of the Cloud Legal Project at the Centre for Commercial Law Studies at Queen Mary, University of London.

Researchers identify the six most frequently negotiated types of cloud contract terms: provider liability; service level agreements; data protection and security; termination rights and lock-in/exit; unilateral amendments to service features; and intellectual property rights.

“These are the key contractual issues of concern to users in the cloud market at this relatively immature stage of cloud adoption,” explains Professor Christopher Millard, lead academic on the Cloud Legal Project (CLP), in a statement.

Millard and his colleagues identify a need for service packages when it comes to negotiating contracts for cloud services. The report posits that many one-size-fits-all terms are actually non-compliant, invalid or unenforceable in different countries.

While the team at the Cloud Legal Project looks at global and UK-based providers and users, the report points out issues close to home. In Europe, regulatory and other legal obligations may dictate specific terms, particularly on privacy issues.

“To remain competitive, providers may have to be more aware of user concerns, more flexible in negotiations, and more willing to demonstrate the security and robustness of their services,” Millard says. “In the middle or low value markets, choice is still limited, and many contract terms are still inadequate or inappropriate for SME users’ needs, as they may lack the bargaining power to force contract changes.”

As Millard commented, cloud services and usage are still in their infancy — and contract terms, usage needs and service plans will develop over time.

Developing varied cloud packages may actually make some users dissatisfied if services are stripped from new plans. Many customers, the report finds, look for the cheapest services yet request the highest levels of contract terms and conditions.

“Forcing providers to accept more liability and incur the expense of upgrading their infrastructure, while keeping prices low, may undermine market development,” says Professor Ian Walden from the Cloud Legal Project.

It’s as important for the users of cloud services to understand the issues as it is for providers. “Users may need to consider what functions should be migrated to cloud and on what basis, such as starting with pilots only, conducting risk assessments, and implementing internal controls,” the report states. For small businesses and enterprise companies, this means assigning these duties to an IT person or someone who can work with the company’s IT providers to implement procedures and best practices.

Several providers including Microsoft, Dell and Rackspace currently offer cloud computing services. As cloud computing develops over time, it is important for these and other companies to address local regulatory issues in order to cater services to a global user base.

The report suggests multiple approaches, which are already emerging with fragmentation of the market. “Market participants may be developing a range of cloud services with different contractual terms, priced at different levels and embracing standards and certifications that aid legal certainty and compliance, particularly for SME users,” the report concludes.

Do Violent Video Games Really Train Deadly Shooters?

Enid Burns for RedOrbit.com

A new study suggests that playing violent video games that reward “head shots” makes gamers more likely to choose the head as a target with a real gun.

Faculty of the Ohio State University School of Communication and VU University in Amsterdam, the Netherlands, conducted the study.

“Boom, Headshot: Effect of Video Game Play and Controller Type on Firing Aim and Accuracy” is the study published by Sage Journals. Lead authors are Jodi L. Whitaker of VU University in Amsterdam, the Netherlands, and Brad J. Bushman of the School of Communication, The Ohio State University.

The study illustrates that “viewers of television programs and films passively watch other characters behave aggressively, whereas players of videogames ‘become’ the aggressive characters.”

To conduct their research, study authors tested 151 college students in their behavior during and after playing videogames. Participants played different types of violent and non-violent videogames, including games with human targets where players were rewarded for hitting targets’ heads. After playing the game for 20 minutes, researchers had the participants shoot 16 bullets from a realistic gun at a life-size, human-shaped mannequin. The researchers found that participants who played a violent shooting game using a pistol-shaped controller (often called a light-gun) hit the mannequin 33 percent more often than other participants. The same group hit the mannequin’s head 99 percent more often.

The sample size of 151 participants is relatively small, claims analyst Billy Pidgeon from M2 Research. “Trying to ascertain some correlation to video games these days is difficult because of the number of people who play them,” Pidgeon told RedOrbit. “It’s like saying people who like ham sandwiches are more likely to do this type of behavior.”

Pidgeon also notes that only a small percentage of players use light-gun or “gun-shaped” controllers. “That’s pretty rare,” he says. “Typically people use a controller or a keyboard.”

Even with the use of a light-gun, the amount of training a player receives from a game is likely limited. “If my life depended on someone’s shooting ability, I’d prefer they train on the shooting range,” Pidgeon says.

Several groups have argued that videogames make players more likely to commit gun-related or other violent crimes. Those theories have largely been disproved, and the sheer number of people who play such video games helps discount any concerns.

Just a few weeks after the tragic shootings in Columbine, Colorado in 1999, the video game industry held its annual trade show, E3. Doug Lowenstein, then head of the Entertainment Software Association, addressed the event in a keynote speech. To dispel any notion that video games had a direct correlation to the actions or skills carried out by the two teens, Lowenstein held up a game controller and stated that the device could no more make someone an expert marksman than it could teach a player to drive in NASCAR.

It’s likely that participants took headshots with a real gun and target because they followed the behavior from gameplay. However, the report doesn’t clearly suggest that these participants are any more likely to pick up a gun and commit a crime.

Risk Of Eye Infections With Some Acne Medications

Connie K. Ho for RedOrbit.com

May is Healthy Vision Month and, as such, the publication of new research regarding eye health and popular prescription acne medication comes at the perfect time.

Researchers from Tel Aviv University (TAU) found that popular prescription oral acne medications like Accutane or Roaccutane can lead to eye infections, ranging from conjunctivitis (pink eye) to sties. They give patients and doctors a few recommendations on how to prevent eye infections from developing as a result of using these medications.

Previously, clinicians theorized that there was a relationship between acne and eye infections; however, there was no evidence to prove this claim.

“Acne itself can increase the risk of ocular diseases,” remarked Dr. Gabriel Chodick of TAU’s School of Public Health at the Sackler Faculty of Medicine in a prepared statement. “There is a greater tendency towards inflammation, and sometimes this leads to irritation.”

The research, published in the Archives of Dermatology, was a collaborative effort between Chodick as well as Dr. Meira Neudorfer, Dr. Orna Shamai-Lubovitz, and Dr. Varda Shalev from the Sackler Faculty of Medicine and Inbal Goldshtein from Maccabi Health Care Services. They looked at the records of 15,000 children enrolled in the Maccabi Health Care Services database. The children were then divided into three groups: adolescents who did not have acne, adolescents who did have acne but were not taking oral medicine, and adolescents who had acne but were taking Accutane or Roaccutane.

In the project, the researchers found that, out of the 15,000 subjects, 1,791 developed inflammatory ocular diseases: 991 in the medicated group, 446 in the acne group, and 354 in the group that didn't have acne. Conjunctivitis was the most common infection, affecting four percent of patients in the medicated group compared to two percent of the normal population.

The scientists note that tears are important because they lubricate the surface of the eye and clear out any debris, bacteria, or viruses found near the eye or eyelid. Infection of a gland can cause sties, and more severe bacterial infections can cause swelling of the entire eyelid.

“A very common side effect of Accutane and Roaccutane is dryness of skin and lips, so it’s only natural that these medications would also affect the lubrication of the eyelids – specifically the oil glands along the rim of the eyelid,” commented Chodick in a prepared statement.

Researchers recommend that dermatologists and patients be conscious of the side effects and long-term damage that can come from treating acne with particular oral medications. Patients can talk to their doctors to find out how to minimize eye damage. One option is to use artificial tears or eye drops to keep the eye lubricated; local pharmacies offer inexpensive over-the-counter products that address this issue.

Besides the research done by Chodick and his colleagues that examined the connection between eye irritation and acne medication, another study published in Clinical and Experimental Optometry stated that long-term eye rubbing and irritation could cause structural eye problems like keratoconus, which is a degeneration of the cornea.

Newly Discovered Organ Could Explain Size, Eating Habits Of Some Whales

Scientists with the Smithsonian Institution and the University of British Columbia (UBC) have discovered a new sensory organ in the rorqual family of whales — a discovery which sheds new light on their unique feeding behavior and explains why they grow to such massive sizes.

The US and Canadian biologists involved in the study located the organ at the tip of the chin of blue, humpback, minke and fin whales, contained within a batch of ligaments connecting the lower jaw bones, according to AFP reports published on Wednesday.

The previously hidden organ is made up of nerves and “orchestrates dramatic changes in jaw position” required for rorqual whales to feed by opening their mouths wide and gulping massive amounts of water in order to consume large amounts of fish or krill in a single bite, LiveScience Staff Writer Jennifer Welsh explained.

Next, the water is filtered back into the ocean through baleen at the front of the creature’s mouth, allowing the mouth to return to normal size while retaining the food to be swallowed, she added. The whole process is completed in about six seconds and is made possible because rorqual whales have a pair of large connected jawbones that are loosely attached to the rest of their skull.

“The odd arrangement of tissues didn’t make much sense to us at first, but then we realized that this organ was perfectly placed, anatomically, to coordinate a lunge because that soft structure is pinched by the tips of the jaws, and deforms through the course of a lunge,” said Nicholas Pyenson, a paleobiologist at the Smithsonian’s National Museum of Natural History and lead author of the research published in the journal Nature.

“This deformation is registered by the nerves inside the organ, informing the gulping whale about its gigantic jaws, which must close before prey escape,” he added. “This finding answers several outstanding theoretical questions and puzzling field data that suggest rorquals actively control their lunge, rather than letting their mouths passively inflate like a parachute.”

In the study, Pyenson called the rorquals’ feeding methods one of the most extreme amongst all aquatic vertebrates, according to BBC Nature Writer Victoria Gill. He also said that the structure resembled a “gelatinous mess,” which could explain why it had previously been dismissed by scientists as simply a “fluid-filled joint” located between the two bottom jaw bones. It wasn’t until they dissected the organ that they discovered its true complexity, she added.

“We think this sensory organ sends information to the brain in order to coordinate the complex mechanism of lunge-feeding, which involves rotating the jaws, inverting the tongue and expanding the throat pleats and blubber layer,” Pyenson said. “It probably helps rorquals feel prey density when initiating a lunge.”

“In terms of evolution, the innovation of this sensory organ has a fundamental role in one of the most extreme feeding methods of aquatic creatures,” added UBC Zoology Professor and study co-author Bob Shadwick. “Because the physical features required to carry out lunge-feeding evolved before the extremely large body sizes observed in today’s rorquals, it’s likely that this sensory organ — and its role in coordinating successful lunging — is responsible for rorquals claiming the largest-animals-on-earth status… This also demonstrates how poorly we understand the basic functions of these top predators of the ocean and underlines the importance for biodiversity conservation.”

Image 2 (below): A new sensory organ, found within the chin of rorqual whales, is responsible for coordinating the biomechanics of their extreme lunge-feeding strategy. Left, a fin whale after lunging; right, anatomy of the new sensory organ. Art by Carl Buell, arranged by Nicholas D. Pyenson / Smithsonian Institution.

Obesity Genes Linked To Increased Appetite, Poor Dietary Choices

Connie K. Ho for RedOrbit.com

Researchers from The Miriam Hospital's Weight Control and Diabetes Research Center recently reported that people who have particular “obesity genes” are more likely to eat more meals and snacks, as well as consume more calories and more foods high in fat and sugar.

The research, published in the June 2012 issue of the American Journal of Clinical Nutrition, focuses on variants of the fat mass and obesity-associated (FTO) gene and the brain-derived neurotrophic factor (BDNF) gene. Researchers believe that these genes could influence eating habits that cause obesity. FTO and BDNF are expressed in the part of the brain that influences eating and appetite, and both have been linked to overeating in children; the researchers extended this finding to adults as well.

The results of the project show that it may be possible for people to lower their genetic risk by adopting specific eating patterns, practicing more vigilance in regard to food choices, and adding physical fitness to a daily routine.

“Understanding how our genes influence obesity is critical in trying to understand the current obesity epidemic, yet it's important to remember that genetic traits alone do not mean obesity is inevitable,” noted lead author Dr. Jeanne M. McCaffery of The Miriam Hospital's Weight Control and Diabetes Research Center. “Our lifestyle choices are critical when it comes to determining how thin or heavy we are, regardless of your genetic traits… However, uncovering genetic markers can possibly pinpoint future interventions to control obesity in those who are genetically predisposed.”

The project lasted over six months, with over 2,000 subjects completing a questionnaire about their eating habits and undergoing genotyping. The participants were part of the Look AHEAD (Action for Health in Diabetes) program, which allowed researchers to look at a number of genes related to obesity. Look AHEAD, funded by the National Institute of Diabetes and Digestive and Kidney Diseases, is a multi-site clinical trial that examines how lifestyle changes can affect weight loss as well as the risk for cardiovascular disease. The team looked at how the genetic markers affected the pattern and content of the subjects' diets.

The findings are in line with previous research done with children. The results show that variations in the FTO gene were related to the consumption of more meals and snacks per day, a greater percentage of energy from fat, and more servings of fats, oils, and sweets. Researchers also found that people who had variations of BDNF tended to eat more calories per day and consume more servings from the dairy and the meat, eggs, nuts, and beans food groups.

“We show that at least some of the genetic influence on obesity may occur through patterns of dietary intake,” explained McCaffery. “The good news is that eating habits can be modified, so we may be able to reduce one's genetic risk for obesity by changing these eating patterns.”

To continue the project, researchers must first replicate the findings before the results can be applied in possible clinical measures.

Pesticide Turns Bees Into Picky Eaters

Lee Rannals for RedOrbit.com

New research shows that a common pesticide can alter the appetite of honey bees and turn them into “picky eaters.”

Biologists at the University of California, San Diego (UCSD) found that a single dose of imidacloprid given to bees made the insects crave sweeter foods and reject foods that may not be as tasty.

According to the researchers, honey bees that prefer sweeter foods limit the amount of resources they contribute to the colony.

During the study, the scientists individually harnessed the bees so only their heads could move, enabling them to get a more detailed perspective of a bee’s behavior.

They stimulated the bees’ antennae with sugar water, touching the antennae of each bee to see if it extended its mouthparts, and were able to determine at what concentrations the sugar water was rewarding enough to feed on.

Bees treated with imidacloprid were less willing to feed on lower concentrations of sugar water than untreated bees.
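The measurement described above amounts to finding each bee's response threshold: walk up an ascending series of sugar-water concentrations and record the lowest one that makes the bee extend its mouthparts. A minimal sketch follows; the concentration series and the two example bees are invented for illustration, not data from the study:

```python
# Ascending sugar-water concentrations (percent) offered to each bee's antennae.
CONCENTRATIONS = [0, 0.1, 0.3, 1, 3, 10, 30, 50]

def response_threshold(responses):
    """Return the lowest concentration that elicited a mouthpart extension,
    or None if the bee never responded. `responses` is a list of booleans,
    one per concentration in CONCENTRATIONS."""
    for conc, responded in zip(CONCENTRATIONS, responses):
        if responded:
            return conc
    return None

# Hypothetical bees: a pesticide-treated bee responds only to sweeter solutions.
untreated = [False, False, True, True, True, True, True, True]
treated = [False, False, False, False, False, True, True, True]

print(response_threshold(untreated))  # 0.3
print(response_threshold(treated))    # 10
```

A higher threshold for the treated bee corresponds to the "picky eater" behavior the researchers describe: it rejects dilute nectar that an untreated bee would accept.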

“Our results show that shortly after bees are exposed to pesticides, they respond less often to sweeter nectar sources that they would normally feed on,” Daren Eiri, lead author of the paper published in the Journal of Experimental Biology, told RedOrbit in an email.

“In addition, bees typically recruit their nestmates to good food with waggle dances, and we discovered that the treated bees also danced less to food sources that did not contain any pesticides.”

Imidacloprid, which is part of the group of crop pesticides known as neonicotinoids, is a popular active ingredient in many consumer-used products for home gardening, according to Eiri.

Honey bee populations have dropped in both North America and Europe over the years due to “colony collapse disorder,” and recent published studies point to neonicotinoids as a culprit.

Eiri said the biologists began this study because an earlier study showed that bees feeding at a source contaminated with imidacloprid resulted in reduced foraging activity.

“It was unclear whether this was due to the anti-feedant character of the pesticide, or if there was activity occurring inside the colony to reduce foraging activity,” he told RedOrbit. “Our research will contribute to the number of studies recently published that have also found negative effects on honey bee behavior.”

Imidacloprid is banned for use on certain crops in some European countries, but it can still be used in the U.S. Eiri said he hopes the study may influence how pesticides are registered by the EPA.

“The EPA currently does not have a formal review process that looks at the sublethal, or behavioral, effects of pesticides to beneficial insects like honey bees,” he said.

Because imidacloprid is found in consumer products, Eiri said the findings may also influence homeowners' choices about which products to use for gardening.

“It may influence home owners to use alternative products that use active ingredients that are not considered harmful to beneficial pollinators,” Eiri told RedOrbit.

The researchers said their discoveries not only have implications for how pesticides are used in crops, but also help to produce an additional chemical tool that can be used by other researchers who study the neural control of honey bee behavior, according to a press release.

Image 2 (below): Using an ascending range of sugar water from 0 to 50 percent, the researchers touched the antennae of each bee to see if it extended its mouthparts. Credit: Daren Eiri

Hacking Code Of Leaf Vein Architecture Allows Predictions Of Past Climate

UCLA life scientists have discovered new laws that determine the construction of leaf vein systems as leaves grow and evolve. These easy-to-apply mathematical rules can now be used to better predict the climates of the past using the fossil record.

The research, published May 15 in the journal Nature Communications, has a range of fundamental implications for global ecology and allows researchers to estimate original leaf sizes from just a fragment of a leaf. This will improve scientists’ prediction and interpretation of climate in the deep past from leaf fossils.

Leaf veins are of tremendous importance in a plant’s life, providing the nutrients and water that leaves need to conduct photosynthesis and supporting them in capturing sunlight. Leaf size is also of great importance for plants’ adaptation to their environment, with smaller leaves being found in drier, sunnier places.

However, little has been known about what determines the architecture of leaf veins. Mathematical linkages between leaf vein systems and leaf size have the potential to explain important natural patterns. The new UCLA research focused on these linkages for plant species distributed around the globe.

“We found extremely strong, developmentally based scaling of leaf size and venation that has remained unnoticed until now,” said Lawren Sack, a UCLA professor of ecology and evolutionary biology and lead author of the research.

How does the structure of leaf vein systems depend on leaf size? Sack and members of his laboratory observed striking patterns in several studies of just a few species. Leaf vein systems are made up of major veins (the first three branching “orders,” which are large and visible to the naked eye) and minor veins (the mesh embedded within the leaf, which makes up most of the vein length).

Federally funded by the National Science Foundation, the team of Sack, UCLA graduate student Christine Scoffoni, three UCLA undergraduate researchers and colleagues at other U.S. institutions measured hundreds of plant species worldwide using computer tools to focus on high-resolution images of leaves that were chemically treated and stained to allow sharp visualization of the veins.

The team discovered predictable relationships that hold across different leaves throughout the globe. Larger leaves had their major veins spaced further apart according to a clear mathematical equation, regardless of other variations in their structure (like cell size and surface hairiness) or physiological activities (like photosynthesis and respiration), Sack said.

“This scaling of leaf size and major veins has strong implications and can potentially explain many observed patterns, such as why leaves tend to be smaller in drier habitats, why flowering plants have evolved to dominate the world today, and how to best predict climates of the past,” he said.

These leaf vein relationships can explain, at a global scale, the most famous biogeographical trend in plant form: the predomination of small leaves in drier and more exposed habitats. This global pattern was noted as far back as the ancient Greeks (by Theophrastus of Lesbos) and by explorers and scientists ever since. The classical explanation for why small leaves are more common in dry areas was that smaller leaves are coated by a thinner layer of still air and can therefore cool faster and prevent overheating. This would certainly be an advantage when leaves are in hot, dry environments, but it doesn’t explain why smaller leaves are found in cool, dry places too, Sack noted.

Last year, Scoffoni and Sack proposed that small leaves tend to have their major veins packed closely together, providing drought tolerance. That research, published in the journal Plant Physiology, pointed to an advantage for improving water transport during drought. To survive, leaves must open the stomatal pores on their surfaces to capture carbon dioxide, but this causes water to evaporate out of the leaves. The water must be replaced through the leaf veins, which pull up water through the stem and root from the soil. This drives a tension in the leaf vein “xylem pipes,” and if the soil becomes too dry, air can be sucked into the pipes, causing blockage.

The team had found, using computer simulations and detailed experiments on a range of plant species, that because smaller leaves have major veins that are packed closer together – a higher major vein length per leaf area – they had more “superhighways” for water transport. The greater number of major veins in smaller leaves provides drought tolerance by routing water around blockages during drought.

This explanation is strongly supported by the team’s new discovery of a striking global trend: higher major vein length per leaf area in smaller leaves.

The Nature Communications research provides a new ability to estimate leaf size from a leaf fragment and to better estimate past climate from fossil deposits that are rich in leaf fragments. Because of the very strong tendency for smaller leaves to have higher major vein length per leaf area, one can use a simple equation to estimate leaf size from fragments.
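The fragment-to-leaf-size estimate described above can be sketched as a simple inverse power law: measure the major vein length per unit area on the fragment, then invert the scaling relationship. The functional form below matches the inverse trend the researchers describe, but the coefficients are invented placeholders, not the fitted values from the Nature Communications paper:

```python
def estimate_leaf_area(major_vein_length_per_area, a=2.0, b=-1.5):
    """Estimate whole-leaf area (cm^2) from major vein length per area
    (mm of major vein per mm^2 of leaf fragment), assuming a power law
    area = a * VLA**b. The coefficients a and b are illustrative
    placeholders, not the published fit."""
    return a * major_vein_length_per_area ** b

# Sparse major veins on a fragment imply a large original leaf; dense veins
# imply a small one.
print(f"{estimate_leaf_area(0.2):.1f} cm^2")  # sparse veins -> larger estimate
print(f"{estimate_leaf_area(1.0):.1f} cm^2")  # dense veins  -> smaller estimate
```

Because the exponent is negative, halving the measured vein density more than doubles the estimated leaf area, which is why even small fragments in a fossil deposit can constrain original leaf size.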

Major vein length per leaf area can be measured by anyone willing to look closely at the large and small leaves around them.

“We encourage anyone to grab a big and a small leaf from trees on the street and see for yourself that the major veins are larger and spaced further apart in the larger leaf,” Scoffoni said.

Because leaf size is used by paleobiologists to “hindcast” the rainfall experienced when those fossil plants were alive and to determine the type of ecosystem in which they existed, the ability to estimate intact leaf size from fragmentary remains will be very useful for estimates of climate and biodiversity in the fossil record, Sack said.

The research also points to a new explanation for why leaf vein evolution allowed flowering plants to take over tens of millions of years ago from earlier evolved groups, such as cycads, conifers and ferns. Because, with few exceptions, only flowering plants have densely packed minor veins, and these allow a high photosynthetic rate – providing water to keep the leaf cells hydrated and nutrients to fuel photosynthesis – flowering plants can achieve much higher photosynthetic rates than earlier evolved groups, Sack said.

The UCLA team’s new research also showed that the major and minor vein systems in the leaf evolve independently and that the relationship between these systems differs depending on leaf size.

“While the major veins show close relationships with leaf size, becoming more spaced apart and larger in diameter in larger leaves, the minor veins are independent of leaf size and their numbers can be high in small leaves or large leaves,” Sack said. “This uniquely gives flowering plants the ability to make large or small leaves with a wide range of photosynthetic rates. The ability of the flowering plants to achieve high minor-vein length per area across a wide range of leaf sizes allows them to adapt to a much wider range of habitats – from shade to sun, from wet to dry, from warm to cold – than any other plant group, helping them to become the dominant plants today.”

The strength of the mathematical linkage of leaf veins with leaf size across diverse species raises the question of cause.

The UCLA team explains that these patterns arise from a shared script, or “program,” for leaf expansion and the formation of leaf veins. The team reviewed the past 50 years of studies of isolated plant species and found striking commonalities across species in their leaf development.

“Leaves develop in two stages,” Sack said. “First, the tiny budding leaf expands slightly and slowly, and then it starts a distinct, rapid growth stage and expands to its final size.”

The major veins form during the first, slow phase of leaf growth, and their numbers are complete before the rapid expansion phase, he said. During the rapid expansion phase, those major veins are pushed apart, and can simply extend and thicken to match the leaf expansion. Minor veins can continue to be initiated in between the major veins during the rapid phase, as the growing leaf can continue to lay down new branching strands of minor veins.

In the final, mature leaf, it is possible for minor veins to be spaced closely, even in a large leaf where the major veins would be spaced apart.

“The generality of the development program is striking,” Sack said. “It’s consistent with the fact that different plant species share important vein development genes – and the global scaling patterns of leaf vein structure with leaf size emerge in consequence.”

These vein trends, confirmed with high-resolution measurements, are “obvious everywhere under our noses,” Sack and Scoffoni said.

Why had these trends escaped notice until now?

“This is the time for plants,” Sack said. “It’s amazing what is waiting to be discovered in plant biology. It seems limitless right now. The previous century is known for exciting discoveries in physics and molecular biology, but this century belongs to plant biology. Especially given the centrality of plants for food and biosphere sustainability, more attention is being focused, and the more people look, the more fundamental discoveries will be made.”

On The Net:

Phthalates In PVC Floors Absorbed By Bodies Of Infants

A new study at Karlstad University in Sweden shows that phthalates from PVC flooring materials are taken up by our bodies. Phthalates are suspected of causing asthma and allergies, as well as other chronic diseases in children. The study shows that children can ingest these softening agents not only with food but also by breathing and through the skin.

Phthalates are a group of chemical compounds that occur in construction materials and a great number of common consumer goods such as toys, cleaning solvents, packaging, etc. Phthalates are suspected of disrupting hormones and may be related to several chronic diseases in children, like asthma and allergies, as shown in earlier studies. Flooring materials using softened PVC contain phthalates and have previously been shown to be a significant source of phthalates in indoor dust. This new study was designed to investigate whether flooring materials using PVC and other housing-related factors, together with other individual factors, can be tied to the uptake of phthalates by infants.

Urine samples were taken from 83 randomly selected children between the ages of two and six months by the county council in Värmland in western Sweden. The prevalence of four types of phthalates in the urine was measured, and data were collected about flooring materials and the home, the family’s lifestyle, and individual factors for the infants. The levels of certain phthalates (MBzP, a BBzP metabolite) proved to be higher in the urine of babies that had PVC materials on their bedroom floor. The levels of another phthalate metabolite related to DEHP were lower in two-month-old children if they were exclusively breastfed, with no supplements.

Earlier studies from the same group have shown that PVC flooring can be tied to the occurrence of phthalates in indoor dust, and that exposure to BBzP in indoor dust could be associated with allergic conditions in children. These new data thus show that the uptake of phthalates in infants can be related to flooring materials using softened PVC in the home. It should be pointed out that both DEHP and BBzP are banned for use in toys for small children owing to health risks.

“With this study as a basis, we can establish that there are other sources that should be taken into consideration in regard to the uptake of banned chemicals and that we do not only ingest them in our food,” says Carl-Gustaf Bornehag, professor of public health at Karlstad University and leader of the study. The findings also show that phthalates can be taken up in different ways, both through food and probably through breathing and through the skin.

Yellow-winged Bat, Lavia frons

The yellow-winged bat (Lavia frons) is a species of false vampire bat. It is found across much of mid-Africa, including Benin, Cameroon, the Democratic Republic of the Congo, Ethiopia, Gabon, Kenya, Mali, Nigeria, Rwanda, Senegal, and Zambia. It roosts in many habitats at elevations below 6,561 feet, preferring acacia trees and thorn bushes near bodies of water, though it may also be seen roosting in buildings or tree hollows. It maintains two roosts that it uses throughout the day: a primary roost for sleeping and a peripheral roost for resting between flights, especially on hot days.

The yellow-winged bat can reach an average body length of up to 3.1 inches and can weigh between 0.98 and 1.2 ounces. Typically, females are larger than males, and the wingspan of an adult bat can reach 14.1 inches. As its name implies, the wings are usually yellow in color but may have red hues. The fur, which does not grow on most of the membranes, or wings, is usually pale gray or pearl gray, and males may have green hues on their hindquarters and underbelly.

The ears of the yellow-winged bat are long and pointed, and the nose is also elongated, coming to a dull point at the end, where a noseleaf is located. Males have glands on their lower backs that secrete a yellow substance, and females have two extra teats near the anus that their young latch onto. Although the interfemoral membrane (the area of the wing membrane between the legs) of this bat is well developed, it lacks an external tail.

The diet of the yellow-winged bat consists of insects, unlike its false vampire bat relatives, which feed on vertebrates. It eats many kinds of insects, both hard- and soft-bodied, including scarab beetles, termites, moths, butterflies, and even flies. Instead of catching prey in midair, these bats lie in wait at a roost until an insect passes by. Once they have spotted the prey, the bats attack.

Yellow-winged bats are monogamous during the breeding season, forming pairs that establish their own foraging territories. Either the male or the female stays alert during the day and can turn its head around 225 degrees to keep a vigilant lookout. In the mornings and evenings, males fly to their peripheral roosts to keep intruders away. During the day, males and females separate, returning to the primary roost in the evening to interact. This occurs between May and June, when it is rainy and food is abundant. Communication between bats usually occurs during mating, during aggressive encounters, and between mother and pup.

Female yellow-winged bats have a pregnancy that lasts approximately three months, after which one pup is born. The exact timing of birth varies from region to region. In Zambia, the bats will give birth at the end of the dry season in October, while in areas of Kenya the bats will give birth at the beginning of the rainy season in April. In the first weeks of the pup’s life, it will spend its time latched to its mother, but will soon grow to roost on its own and begin to fly. At around fifty-five days, the pups are weaned.

Little is known about the yellow-winged bat’s exact population numbers, or about how humans affect them. Although the bat is not very common, it is thought not to be particularly vulnerable to human actions. Its predators may include common kestrels, bat hawks, mambas, and night tree vipers. The IUCN has given the yellow-winged bat a conservation status of “Least Concern.”

Image Caption: Picture taken of a yellow-winged bat (Lavia frons) in Tanzania. The bat was hanging inside a building. Credit: Dries Sagaert/Wikipedia

Thawing Arctic Cryosphere Releases Trapped Methane

Brett Smith for RedOrbit.com

The edges of glaciers and Arctic permafrost are where most of the evidence of global warming can be seen, but scientists have recently been traveling to these remote locations for a different reason.

Researchers from the University of Alaska Fairbanks just published a study in the online edition of Nature Geoscience showing that methane trapped under Arctic lake ice for millions of years is now being released as the ice melts. The team used both aerial and ground surveys to locate over 150,000 methane seeps along the margins of ice and permafrost in Alaska and Greenland.

The release of methane trapped under ice and frost is nothing new. As melting occurs, previously frozen organic material decomposes and releases methane. A NASA research team reported high levels of methane over the Arctic Ocean just last month in the online edition of Nature.

However, this latest study points to the possibility of releasing gas that was previously thought to be permanently trapped under ice.

“Now we are saying that as permafrost thaws and glaciers retreat it is going to let something out that has had a lid on it,” said Katey Walter Anthony of the university’s Water and Environmental Research Center, who led the study.

Using radiocarbon (carbon-14) dating, Anthony and her team determined that “ancient” methane was being released from many of the gas seeps, possibly generated from natural gas or coal deposits underneath the water. Other sites were found to be releasing “younger” methane from the period known as the Little Ice Age, around 1500 to 1800.

The Alaskan university team expressed concern about the current and potential future environmental impact of releasing untold amounts of methane, a greenhouse gas.

“If this relationship holds true for other regions where sedimentary basins are at present capped by permafrost, glaciers and ice sheets, such as northern West Siberia, rich in natural gas and partially underlain by thin permafrost predicted to degrade substantially by 2100, a very strong increase in methane carbon cycling will result, with potential implications for climate warming feedbacks,” the report said.

The scientists estimated a potential methane store of 1,200 petagrams (a petagram is 10^15 grams, or 1,000 million million grams), compared to the current atmospheric pool of around 5 petagrams of methane. The release of even a fraction of this trapped methane could have significant climate change ramifications.
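For a rough sense of scale, the two figures quoted above can be compared directly. The following back-of-the-envelope check is purely illustrative and is not part of the study:

```python
# Back-of-the-envelope comparison using the figures quoted above
# (1 petagram = 10**15 grams).
trapped_store_pg = 1200  # estimated trapped methane store, in petagrams
atmospheric_pool_pg = 5  # current atmospheric methane pool, in petagrams

ratio = trapped_store_pg / atmospheric_pool_pg
print(f"Trapped store is {ratio:.0f} times the atmospheric pool")
# prints: Trapped store is 240 times the atmospheric pool
```

That 240-fold difference is why the researchers flag even a partial release as climatically significant.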

Methane is one of the most important non-carbon dioxide greenhouse gases, having as much effect on global warming as all other non-carbon dioxide gases combined. The potential release of vast amounts of ancient methane from the glacier and permafrost margins creates the potential for a global warming feedback loop.

“We observed most of these cryosphere-cap seeps in lakes along the boundaries of permafrost thaw and in moraines and fjords of retreating glaciers,” the report said.

The Women’s Health Initiative Study And Hormone Therapy: What Have We Learned 10 Years On?

In July 2002 the publication of the first Women’s Health Initiative (WHI) report caused a dramatic drop in Menopausal Hormone Therapy (HT) use throughout the world. Now a major reappraisal by international experts, published as a series of articles in the peer-reviewed journal Climacteric (the official journal of the International Menopause Society), shows how the evidence has changed over the last 10 years, and supports a return to a “rational use of HT, initiated near the menopause”.

The reappraisal has been carried out by some of the world’s leading experts in the field, including clinicians who worked on the original WHI study. Summarizing the findings of the special issue, authors Robert Langer, JoAnn Manson, and Matthew Allison conclude that “classical use of HT” (HT initiated near the menopause) will benefit most women who have indications, including significant menopausal symptoms or osteoporosis.

Dr. Robert Langer, Principal Scientist at the Jackson Hole Center for Preventive Medicine, Jackson, Wyoming, was the Principal Investigator of the WHI Clinical Center at the University of California, San Diego. He said:

“With 10 years hindsight we can put the lessons learned from the WHI HT trials into perspective. In some ways we’ve come full circle — studies in recently menopausal women that suggested protection against major diseases led to testing whether that would carry over to older women who have even greater risks of heart attacks and fractures. That hope proved false. Unfortunately the results were wrongly generalized back to women like those who inspired the study. Information that has emerged over the last decade shows that for most women starting treatment near the menopause, the benefits outweigh the risks, not just for relief of hot flashes, night sweats and vaginal dryness, but also for reducing the risks of heart disease and fractures.”

Langer continued:

“Overgeneralizing the results from the women who were — on average — 12 years past menopause to all postmenopausal women has led to needless suffering and lost opportunities for many. Sadly, one of the lessons from the WHI is that starting HT 10 years or more after menopause may not be a good idea, so the women who were scared away by the WHI over this past decade may have lost the opportunity to obtain the potential benefits.”

Professor JoAnn Manson (Harvard Medical School and Brigham and Women’s Hospital, Boston, MA), who has been one of the WHI Principal Investigators since the study started, said:

“An important contribution of the WHI was to clarify that, for older women at high risk of cardiovascular disease, the risks of HT far outweighed the benefits. This halted the increasingly common clinical practice of prescribing HT to women who were far from the onset of menopause. Unfortunately, these findings were extrapolated to newly menopausal and healthy women who actually had a favorable benefit:risk ratio with HT. The WHI results point the way towards treating each woman as an individual. There is no doubt that HT is not appropriate for every woman, but it may be appropriate for many women, and each individual woman needs to talk this over with her clinician.”

The authors note that the initial press reaction, following the lead of the WHI press release, over-emphasized a relatively small increase in breast cancer, so distorting the overall view of the report.

WHI researcher Professor Matthew Allison (University of California, San Diego), said:

“It is important to put the results of the WHI trials into context. That is, being obese, not exercising or excess alcohol consumption confer higher absolute risks for breast cancer than HT use.”

Student Design Improves Pill Bottle For Blind, Visually Impaired

Connie K. Ho for RedOrbit.com

Ralph Waldo Emerson, a leader of the Transcendentalist movement, once said, “Be an opener of doors for such as come after thee, and do not try to make the universe a blind alley.” Two students from the University of Cincinnati have done just that; they recently applied for a provisional patent on the design and prototype of a prescription-medicine pill bottle that would help people who are blind and visually impaired. The design has universal appeal and can assist the more than 1.3 million Americans who are legally blind, as well as those who suffer from other vision impairments.

According to the University of Cincinnati, by the year 2020 there will be a 70 percent increase in the number of people who suffer from blindness. Many of these people will be baby boomers, and the students’ design especially addresses this population. In a statement, the University of Cincinnati students, Alex Broerman and Ashley Ma, described the design as low-tech, simple, and inexpensive.

“Options that are currently on the market are more expensive and complex, dependent on technology and requiring a more expensive outlay on the part of the end user to purchase them,” expressed Ma in the prepared statement.

The students’ design features a number of interesting elements. One is a lid that is attached to the bottle, as lost caps are problematic for the visually impaired and twist caps can be difficult for the elderly. Along with the flip lid, there are eight different textures available. These textures would correspond to various medications and would not be in Braille, as only ten percent of the blind and visually impaired can read Braille. Besides the lid textures, the design comes in a variety of dramatic colors that are easily seen by visually impaired people, and the bottles include a “fail-safe” audio button that announces the contents of the container. Lastly, the design features a small rectangular bottle body, two inches by two inches wide and three inches tall, that allows a user to pick out just the pills needed for a dose without having to dump out the whole bottle.

Other options currently on the market include a Wi-Fi connected bottle that glows when patients need to take their medicines, a radio frequency identification (RFID) monitor that has vocal description of the medicine when a bottle moves, and an audio recorder that plays back verbal instructions recorded by a pharmacist when the bottle is placed on top of the recorder.

“There are a lot of great technology-based solutions on the market already, but those are out of reach for users who can’t afford the time or money to learn these systems. We interviewed a number of blind and visually impaired users of medications, and the cost for an option like the RFID device is out of reach for many of them. In fact, many of those we interviewed had to develop their own custom solutions — like rubber bands around a specific bottle — to meet their needs to differentiate medications,” explained Broerman in the statement.

With their “Inclusive Pill Bottles for the Blind” design, Broerman and Ma are helping others and were recognized at Innov8 for Health, a business-concept competition organized by a number of regional institutions and businesses, with a $1,000 prize.

“It was powerful to hear the stories of those we interviewed in the early stages of the design process. These consumers, many of them elderly, are paying hundreds of dollars more than their sighted counterparts in order to aurally differentiate their medications. So the challenge becomes to create the best solution for the most number of people at the lowest cost, and we’re pretty confident that we’ve achieved something like that with this project,” noted Ma.

From June 5 to June 9, Broerman and Ma’s design will be displayed at DAAPworks, a showcase of senior projects from the University of Cincinnati’s College of Design, Architecture, Art, and Planning (DAAP).

For more information about the project, visit their design process blog.

Recognizing Emergency Medical Services With National EMS Week 2012

Connie K. Ho for RedOrbit.com

Rescuing a man who was trapped in a cave. Saving a cub scout who fell into a ravine. These are examples where emergency responders rushed to a situation to help those in need. From May 20 to May 26, professionals in Emergency Medical Services (EMS) are being honored for their commitment and service to the country with National EMS Week 2012.

In 1973, the U.S. Congress first authorized the Emergency Medical Services System (EMSS). The following year, President Gerald R. Ford signed the bill, and David R. Boyd was appointed director of the Division of Emergency Medical Services (DEMSS), Public Health Service, Department of Health, Education, and Welfare. President Ford was the first to proclaim “Emergency Medical Services Week” after working with Boyd.

“During National Emergency Medical Services Week, we recognize the tremendous role that EMS practitioners make to improve health in communities across the nation. The around-the-clock dedication to providing emergency care is evident with one statistic: more than 36 million patients were cared for by EMS professionals in 2011 alone,” noted Dr. Nicole Lurie, Rear Admiral of the U.S. Public Health Service as well as the Assistant Secretary for Preparedness and Response of the U.S. Department of Health and Human Services, in a prepared statement.

Emergency care professionals provide emergency medical care to the community at all hours of the day.

“They strive for seamless care, from the field to the hospital emergency department or trauma center. Their commitment to ensuring that patients receive the best medical care available, anytime and anywhere, is instrumental to advancing the health, safety, and well-being of the American people,” continued Lurie in the statement. “EMS is an essential part of building a resilient health care system that functions efficiently and effectively every day and is capable of responding to disasters and public health emergencies.”

In particular, Wednesday, May 23 is recognized as Emergency Medical Services for Children (EMSC) Day with the theme of “EMS: More Than a Job. More Than a Calling.”

“For the 10th consecutive year, we are devoting one day during EMS Week to focus on the needs of children. It is National EMS for Children (EMS-C) Day… consider directing your activities and events specifically on child safety and injury prevention,” remarked Dr. David Seaberg, American College of Emergency Physicians (ACEP) President, on the organization’s website. “Thank you again for the outstanding service you provide your communities.”

According to EMS Week Ideas, there are a number of ways in which community members can celebrate National EMS Week. One way to promote the event is through awareness. For example, people can post EMS-related content to social networking websites such as Facebook and Twitter. Along with social media marketing, advertising can be placed at local fire stations. In terms of event planning, community members can collaborate with other organizations on projects. For example, people can work with hospitals and other public health agencies on a local preparedness project. They can also create health and safety outreach projects such as blood pressure screenings at malls and grocery stores.

Arthritis Drug Powerful Against Human Dysentery

Connie K. Ho for RedOrbit.com

Every year, 50 million people throughout the world contract amebiasis through contaminated food or water, making it the third leading cause of illness and the fourth leading cause of death due to protozoan infection globally.

Researchers from the University of California, San Francisco (UCSF), the University of California, San Diego (UCSD), and Wake Forest Medical School recently announced that an approved arthritis drug worked against these amoebas in lab and animal studies. This is a breakthrough, as the drug could be used to treat human dysentery caused by amoebic infections. The drug could be prepared in small, inexpensive doses for use in the developing world.

In developing countries, 70,000 deaths are reportedly caused by amebiasis, and children are at the highest risk of contracting the disease. Noted effects of infection include diarrhea, abdominal cramps, and dehydration. Presently, amebiasis and giardiasis are treated with the antibiotic metronidazole, but it has side effects such as dizziness, headaches, nausea, and vomiting.

Based on the experiment, researchers were able to get the drug auranofin approved for Orphan Drug Status by the U.S. Food and Drug Administration (FDA). It is slated to be tested in human clinical trials against amebiasis and the parasite Giardia.

During the lab tests, auranofin demonstrated the ability to stop the growth of the parasite Entamoeba histolytica. The findings, published in the June 2012 issue of Nature Medicine, showed that existing drugs can have other uses and assist in the treatment of other illnesses. According to co-senior author Dr. James McKerrow, because the drug is off-patent and clinical safety data already exist, there is the possibility of a low-cost treatment that can be used globally, with fewer side effects and risks than other currently offered therapy options.

“When we’re looking for new treatments for the developing world, we start with drugs that have already been approved,” explained McKerrow, a professor of pathology in the UCSF Sandler Center for Drug Discovery, in a statement. “If we can find an approved drug that happens to kill these organisms, we’ve leapfrogged the development process that goes into assessing whether they are safe, which also makes them affordable throughout the world.”

Auranofin has been used by adults with rheumatoid arthritis since 1985, taken orally twice daily. The researchers believe that auranofin is ten times stronger than the current treatments for dysentery, and that the drug could be given at a low dose, once or a limited number of times.

“This is a drug that you can find in every country,” commented Dr. Anjan Debnath, lead researcher on the paper and a postdoctoral fellow at UCSF, in the statement. “Based on the dosage we’re seeing in the lab, this treatment could be sold at about $2.50 per dose, or lower. That cost savings could make a big difference to the people who need it the most.”

The project was an international collaboration among the California Institute for Quantitative Biosciences (QB3) at UCSF, the pathology departments at UCSD, and the Instituto Politecnico Nacional in Mexico. Researchers at UCSF examined how to create a screen that could identify small-molecule drugs capable of safely eliminating amoebas. Debnath, a UCSD researcher at the time, developed a high-throughput screen that could work in an oxygen-free environment similar to the amoeba’s natural environment. The team also received a screening library of 900 compounds from Iconix Biosciences and worked with the Small Molecule Discovery Center to screen drugs against amoebas.

“The top hit was this drug auranofin, which caught our attention for a couple of reasons,” noted McKerrow in the statement. “First, it was more effective than the current drug, and importantly, it was a drug that has been given to people since 1985. So we knew it could be taken orally and was safer than the current drug for amoebas.”

In subsequent tests by UCSD and the Instituto Politecnico Nacional in Mexico, researchers found that the drug was effective, decreasing the number of parasites and lowering damage from inflammation. According to AFP, auranofin targeted the enzyme that shields the parasite from oxygen, to which the parasite is particularly sensitive. The FDA’s approval of auranofin for Orphan Drug Status will fast-track the drug; orphan drug status is given to medicines that show promise in treating a disease that affects fewer than 200,000 people in the U.S.

“This new use of an old drug represents a promising therapy for a major health threat, and highlights how research funded by the National Institutes of Health can benefit people around the world,” remarked Dr. Sharon L. Reed, a professor in the UCSD Departments of Pathology and Medicine, in a prepared statement.

Medical Students’ Studies And Career Expectations

In this “themed” issue, Deutsches Ärzteblatt International is focusing on medical students. Bernd Gibis, of the German National Association of Statutory Health Insurance Physicians, and his coauthors investigate how medical students envisage their future professional lives as doctors (Dtsch Arztebl Int 2012; 109(18): 327). Another perspective is provided by Esther Ziemann and Jörg-Wilhelm Oestmann, who studied doctors’ role as academics and analyzed the publication activities of doctoral candidates at Berlin’s Charité University Medical School over a period of 10 years (Dtsch Arztebl Int 2012; 109(18): 333).

Gibis and coauthors evaluated more than 12,500 questionnaires in 2010. Almost all participants (96%) stated that compatibility of family and career was important to them – so important that more of them expressed an interest in working for an employer (92%) than in starting their own private practice (77%). Among those who did aspire to working for themselves in private practice, the stated preference was for a specialist practice in an urban setting rather than a general practice in a rural setting. Envisaged obstacles to their future working lives included an overwhelming amount of administrative work and an imbalance between career and family; with regard to private practice, the financial risks involved were also perceived as an obstacle.

And how about academia? To answer this question, samples of more than 250 doctoral candidates from each of the years 1998, 2004, and 2008 were retrospectively captured, and their publication activity was studied. The database used was PubMed, and the quality parameter was the journal impact factor. Ziemann and Oestmann found that over the study period the number of publications per doctoral candidate increased significantly, and the impact factors of the journals in which the candidates published also rose. The proportion of first authorships among doctoral candidates remained roughly constant, at some 25%.

Indoor Navigation System Gives Guide Dogs A Rest

Helen Keller, perhaps the most famous activist for the visually impaired, once said, “It is for us to pray not for tasks equal to our powers, but for powers equal to our tasks, to go forward with a great desire forever beating at the door of our hearts as we travel toward our distant goal.”
Empowerment of the visually impaired took another step forward this month with the presentation of Navatar, an indoor navigation system. Navatar, which was developed by a University of Nevada, Reno computer science engineering team, is an improvement on existing systems because it relies primarily on existing smartphone technology and not on less practical and bulky sensors.
“Existing indoor navigation systems typically require the use of expensive and heavy sensors, or equipping rooms and hallways with radio-frequency tags that can be detected by a handheld reader and which are used to determine the user’s location,” said Kostas Bekris, of the UNR College of Engineering’s Robotics Research Lab. “This has often made the implementation of such systems prohibitively expensive, with few systems having been deployed.”
In conjunction with two-dimensional digital architectural maps, which are widely available, the smartphone-based Navatar uses the device’s accelerometer and compass to navigate its user. The system is able to guide people with visual impairments down hallways and into rooms through audible instructions similar to those given by GPS devices made for autos.
“Nevertheless, the smartphone’s sensors, which are used to calculate how many steps the user has executed and her orientation, tend to pick up false signals,” said Eelke Folmer, who worked on the project. “To synchronize the location, our system combines probabilistic algorithms and the natural capabilities of people with visual impairments to detect landmarks in their environment through touch, such as corridor intersections, doors, stairs and elevators.”
Folmer explained that Navatar “listens” for voice prompts or a button push on a Bluetooth-enabled device from the user to confirm the presence of these landmarks. This means the system can assist the user in conjunction with their typical navigation routine, including the use of a cane.
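The landmark-corrected dead reckoning Folmer describes can be illustrated with a toy sketch: estimate position from step counts, then snap the estimate to the nearest mapped landmark whenever the user confirms one. This is a hypothetical simplification for illustration only, not Navatar’s actual algorithm; the step length, corridor map, and class name are all invented.

```python
# Toy sketch of landmark-corrected dead reckoning (NOT Navatar's code).
# Position along a corridor is estimated from step counts; when the user
# confirms a known landmark (door, intersection), the estimate snaps to
# that landmark, bounding the accumulated sensor drift.

STEP_LENGTH_M = 0.7                    # assumed average step length
LANDMARKS_M = [0.0, 4.2, 9.8, 15.5]    # hypothetical corridor map (doors, junctions)

class DeadReckoner:
    def __init__(self):
        self.position_m = 0.0

    def on_step(self):
        # Each detected step advances the estimate; real sensors add error here.
        self.position_m += STEP_LENGTH_M

    def on_landmark_confirmed(self):
        # The user's tactile confirmation pins the estimate to the nearest
        # mapped landmark, discarding accumulated drift.
        self.position_m = min(LANDMARKS_M, key=lambda lm: abs(lm - self.position_m))
```

Thirteen steps of pure dead reckoning put the estimate at about 9.1 m; confirming a landmark then snaps it to the mapped feature at 9.8 m, which is how tactile confirmations keep drift bounded between corrections.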
On his website, Folmer noted that the system has a “high possibility of large-scale deployment” because it requires only a simple digital representation of an indoor environment, which can be sketched with basic drawing programs and downloaded from a building’s Wi-Fi network. The UNR team also performed a study involving 12 blindfolded and six blind users to demonstrate the feasibility of the system.
While the system was able to track users to within 1.85 meters of their actual location, the researchers identified several areas for improvement. Based on feedback from test subjects, the team’s report said that improving Navatar’s accuracy, enabling it to repeat directions, and making it work from within a pocket are all improvements under consideration.
For their work on Navatar, Bekris and Folmer recently won a PETA Proggy Award for Leadership in Ethical Science. PETA recognized the system as an animal-friendly achievement because of its potential to decrease the reliance on guide dogs for the visually impaired.

Dream Machine Helps Sleepers Control Their Journeys

Lee Rannals for RedOrbit.com

Ever dreamt that you were attempting to run from someone, but couldn’t get away and were actually running in place?  Well, a new sleeping mask may see to it that you are able to escape, scot-free, from now on.

Scientists have developed a sleeping mask, known as Remee, that allows people to control their own dreams.

Remee, which is billed as a special REM-enhancing device, helps steer people into lucid dreaming by making the brain aware that it is dreaming.

The project was placed on the crowdfunding website Kickstarter with the goal of raising $35,000. To date, more than 6,500 people have pledged $572,891 to fund the sleeping device.

According to the Kickstarter webpage, the project reached its funding goal in just three days.

Sleep is divided into two main stages: non-REM and REM sleep. People cycle back and forth between these stages throughout the night.

Remee notices the longer REM stages and “enters” the dream through flashing lights.  The device waits for four to five hours for the sleeper to get into heavy REM stages before its red lights turn on.

Once the pattern of lights kicks on, it signals to the brain that it is dreaming, which enables the dreamer to determine what happens next in the dream.

The scientists set up a website that allows Remee users to adjust settings such as when to start the light sequence and when to repeat it. The intensity of the lights can be changed as well.

Remee displays patterns for 15 to 20 seconds, with a delay of 15 minutes between each signal.
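Taken together, the timings reported here (a four-to-five-hour initial delay, 15-to-20-second patterns, 15-minute gaps) imply a simple schedule. The sketch below only illustrates that arithmetic; it is not Remee’s firmware, and the function name and default values are invented.

```python
# Hedged sketch of a Remee-style light schedule (timings from the article;
# the actual firmware is not public). After an initial delay intended to
# reach heavy REM, light patterns repeat at a fixed interval.

def remee_schedule(initial_delay_min=270, signal_len_s=20, gap_min=15, n_signals=3):
    """Return (start_s, end_s) windows, in seconds from sleep onset, for each pattern."""
    windows = []
    t = initial_delay_min * 60  # seconds until the first light pattern
    for _ in range(n_signals):
        windows.append((t, t + signal_len_s))
        t += signal_len_s + gap_min * 60  # next pattern starts after the gap
    return windows
```

With these defaults the first pattern begins 4.5 hours (16,200 seconds) into the night, and each subsequent 20-second pattern follows 15 minutes after the previous one ends.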

The scientists behind the dream creation began working on the mask last February after reading studies focusing on lucid dreams that were conducted at Stanford University in the 1980s.

Early devices were bulky and came with a price tag of $1,000, according to what Remee inventor Duncan Frazier told Mail Online.

Frazier and fellow inventor Steven McGuigan developed a mask that runs on a small 3-volt coin cell battery and is available in five color options for $95.

The inventor told Mail Online that they have received 7,000 orders for the dream machines so far, most of them coming from buyers in Australia, Italy and Spain.

He said that he uses his Remee several times a week, but admitted that he does not reach a state of lucidity every time.

Frazier also said that the inventors have not received any reports of problems associated with the mask so far, and that LED lights are not known to cause seizures.

Remee is not the first device aimed at trying to enable people to control their own dreams.  Dream:ON, an iPhone app, is meant to do the same thing as Remee.

The iPhone app uses music to help dish out signals to the brain, allowing the dreamer to control the path of their sleep journey.

Dream:ON is also interactive with the app’s researchers, as it allows the user to make notes after sleep that can be sent back to a database for future research.

New Bar Demographics App Raising Privacy Concerns

A new social app that can scan the faces of bar patrons in order to determine their ages and genders launched in venues throughout the US this past weekend, and it has wasted little time in raising the hackles of privacy advocates harboring concerns over how much and what types of information are being stored by the program.

The app in question, SceneTap, launched in two dozen bars in San Francisco on Friday and earlier in Austin, Texas; Athens, Georgia; Bloomington, Indiana; Chicago, Illinois; Gainesville, Florida; and Madison, Wisconsin, according to Marcus Wohlsen of the Associated Press (AP) and Elinor Mills of CNET.

SceneTap scans the faces of each venue’s customers, determines their respective ages and sexes, and provides real-time updates on the average age and the number of men vs. women at each establishment, Wohlsen noted. However, while the makers of the app told the AP that it does not identify any specific person, and that it does not save any pictures or personal information, it has still given rise to privacy concerns.

“SceneTap could conceivably prove useful for a variety of retail companies, providing data on when customers shop in stores, what items they browse, and other in-store behaviors and patterns. But it also raises expected hackles of privacy watchdogs who worry the data could be combined with a person’s online footprint to do what even Google can’t do right now — match your Internet activities with your offline world,” Mills wrote Thursday.

“SceneTap’s devices… keep track of the number of people who enter and exit a venue and use facial detection software on video feeds to figure out what gender and age customers appear to be,” she added. “It provides that traffic and demographic information to bar owners who can design marketing and other promotions to target specific audiences, while users of the SceneTap app can see which bars in their area are ‘hopping.'”

Several bars in California’s Bay Area, which had originally agreed to partner with SceneTap, have withdrawn their offers, due largely to negative media attention and the public’s privacy worries, ArsTechnica’s Cyrus Farivar wrote Sunday. Farivar added that the company insists that no images are being stored.

“The only stored data are historical trends of male and female ratios and estimates of customer ages. The company says that once it has accumulated such data over time, it will make it available to bars as an analytical tool to evaluate bar traffic,” he said, adding that after speaking with ArsTechnica, SceneTap CEO Cole Harper amended the company’s privacy policy to make it clear that “no facial mapping metrics, measurements, or other data used to predict demographics are stored” by the app or its developers.

Even so, the privacy concerns revolving around the new technology remain for many.

“Ten years ago if I walked down the street and took a picture of someone I didn’t know, there was little I could do to find out who that person was. Today it’s a very different story,” Lee Tien, a staff attorney and privacy expert with the Electronic Frontier Foundation (EFF), explained in an interview with the AP. “Even if everything is happening the way it is supposed to, then the next question is, gee, is that good enough? Is that something that you’re comfortable with?”

Sprint Looking To Entice New Customers With $100 iPhone Trade-In Offer

Sprint is looking to bring in new subscribers by offering a $100 trade-in credit to customers who bring in an iPhone from another carrier and purchase a new iPhone 4S on a two-year contract. To make the move more enticing, Sprint still offers unlimited data plans.

Sprint said the limited-time offer ends July 1, but it will accept any iPhone model for the duration of the offer.

Both AT&T and Verizon ended unlimited data plans on their contracts, but some mobile customers were grandfathered in. However, Verizon announced earlier this week that it would push to end all unlimited data usage for its users this summer when it rolls out new shared data plans.

The top wireless company on Thursday issued a statement clarifying that only subscribers upgrading to a subsidized handset would lose their unlimited data plans.

AT&T in March made the decision to throttle heavy data users that pass a 3GB per month threshold.

Now that Verizon and AT&T are moving to shared data plans and tiered pricing, their customers may be more enticed by Sprint’s user-friendly plans. Sprint CEO Dan Hesse has committed to continuing its unlimited data plans when the next iPhone rolls out, even if it’s 4G LTE.

This strategy has somewhat worked in getting Sprint new customers in recent quarters, but profitability has lagged due to the high cost of subsidizing iPhone prices. Hesse said earlier this week that Sprint wouldn’t make a profit from the iPhone until 2015, but has no regrets about signing a $15.5 billion, four-year deal with handset maker Apple.

“We believe in the long term,” said Hesse. “And over time we will make more money on iPhone customers than we will on other customers.”

Faster, More Energy Efficient Computer Components On The Way

A pair of new computer components unveiled late this week — one which will require less energy to store and retrieve information, and one which improves power and resource efficiency by occasionally allowing errors to occur — could one day fundamentally change the technology behind desktops, laptops, and similar devices.

The first of those two units is known as a “memristor,” and according to BBC News Science and Technology Reporter Jason Palmer, its properties “make it suitable both for computing and for far faster, denser memory.”

The theoretical concept of the memristor, which derives its name from the words memory and resistor, was first proposed roughly four decades ago, though a first prototype of the component was not possible until 2008, Palmer said.

The unit can remember how much current has passed through it, even after the device containing it has been powered off, and experts told BBC News that it can be manufactured more affordably these days thanks to modern semiconductor techniques.

“The history-dependent nature of their electrical properties would make them able to carry out calculations, but most interest has focused on developing them for memory applications, to replace the widespread ‘flash’ solid-state memory of USB sticks and memory cards,” Palmer wrote on Friday.

“We’re reaching the limits of what we can do with flash memory in terms of increasing the storage density, and it’s also relatively high power and not as fast as we would like,” added Anthony Kenyon of University College London (UCL), who along with colleagues at the school and from France and Spain detail their work with memristors in the Journal of Applied Physics. “Flash memory devices switch at 10,000 nanoseconds (billionths of a second) or so, and in our device we can’t measure how fast it is… Our equipment only goes down to 90 nanoseconds. It’s at least as fast as that and probably faster.”

He also told Palmer that, while their memristor concepts may be less advanced than similar components being developed by other teams, he believes the ease of manufacturing and the low cost of materials could make them more attractive to consumer electronics manufacturers. Kenyon said that they were in preliminary talks with some “fairly major names” in the industry about making their technology commercially available.

On Thursday, researchers at Rice University announced via press release a new computer chip that they say “challenges the industry’s 50-year pursuit of accuracy” — a design which they argue “improves power and resource efficiency by allowing for occasional errors” and is “at least 15 times more efficient than today’s technology.”

Prototypes of the new chip were unveiled this week at the ACM International Conference on Computing Frontiers in Cagliari, Italy — where research completed by experts at the Houston, Texas-based school, as well as Nanyang Technological University (NTU) in Singapore, Switzerland’s Center for Electronics and Microtechnology (CSEM) and the University of California, Berkeley, earned best-paper honors, the university announced.

“It is exciting to see this technology in a working chip that we can measure and validate for the first time,” project leader and Rice-NTU Institute for Sustainable and Applied Infodynamics (ISAID) director Krishna Palem said in a statement. “Our work since 2003 showed that significant gains were possible, and I am delighted that these working chips have met and even exceeded our expectations.”

“The paper received the highest peer-review evaluation of all the Computing Frontiers submissions this year,” added Paolo Faraboschi, the program co-chair of the ACM Computing Frontiers conference and an employee of Hewlett Packard Laboratories. “Research on approximate computation matches the forward-looking charter of Computing Frontiers well, and this work opens the door to interesting energy-efficiency opportunities of using inexact hardware together with traditional processing elements.”

The goal of the project, according to the university’s press release, is to create microchips that require a fraction of the energy of modern-day microprocessors by being inexact in certain processes.

“The concept is deceptively simple: Slash power use by allowing processing components — like hardware for adding and multiplying numbers — to make a few mistakes,” the researchers explain. “By cleverly managing the probability of errors and limiting which calculations produce errors, the designers have found they can simultaneously cut energy demands and dramatically boost performance.”

Two examples of this approach include a process known as “pruning,” which does away with some infrequently used areas of a microchip’s digital circuits, and “confined voltage scaling,” which harnesses improvements in processing speed performance to reduce the amount of power required to operate.

The Rice University researchers said that in simulated tests conducted last year, the smaller pruned chips were twice as fast as traditional counterparts while needing less than half as much energy.

More recent tests demonstrated that pruned chips could cut energy consumption to roughly a third that of ordinary chips when their outputs “deviated from the correct value” by just one-fourth of a percent, study co-author and graduate student Avinash Lingamneni said. When size and speed gains were factored in, the researchers found the pruned chips could be up to 7.5 times more efficient than regular chips, a figure that rose to as much as 15 times when a larger deviation percentage was allowed, he added.

“Particular types of applications can tolerate quite a bit of error,” Christian Enz, project co-investigator and chief of the CSEM branch of the research, explained. “For example, the human eye has a built-in mechanism for error correction. We used inexact adders to process images and found that relative errors up to 0.54 percent were almost indiscernible, and relative errors as high as 7.5 percent still produced discernible images.”
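The accuracy-for-energy trade Enz describes can be mimicked in software with a truncated adder that ignores the lowest-order bits of its inputs. This is only a crude analogue of hardware pruning (the real chips prune rarely-used circuit paths; this toy merely reproduces the error behavior), and the function names are invented for illustration.

```python
# Toy "inexact adder": zero out the k lowest-order bits before adding,
# a software stand-in for hardware that powers fewer circuit elements
# at the cost of small output errors.

def inexact_add(a, b, drop_bits=2):
    mask = ~((1 << drop_bits) - 1)   # clears the drop_bits lowest bits
    return (a & mask) + (b & mask)

def relative_error(a, b, drop_bits=2):
    """Fraction by which the inexact sum deviates from the exact one."""
    exact = a + b
    return abs(exact - inexact_add(a, b, drop_bits)) / exact
```

For example, `inexact_add(1000, 999)` with two dropped bits returns 1996 instead of 1999, a relative error of about 0.15 percent, within the range the CSEM image experiments found almost indiscernible.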

Palem said that prototype hearing aids utilizing the pruned chips are expected by 2013.

Diamond Used To Produce Graphene Quantum Dots And Nano-Ribbons Of Controlled Structure

Kansas State University researchers have come closer to solving an old challenge of producing graphene quantum dots of controlled shape and size at large densities, which could revolutionize electronics and optoelectronics.

Vikas Berry, William H. Honstead professor of chemical engineering, has developed a novel process that uses a diamond knife to cleave graphite into graphite nanoblocks, which are precursors for graphene quantum dots. These nanoblocks are then exfoliated to produce ultrasmall sheets of carbon atoms of controlled shape and size.

By controlling the size and shape, the researchers can control graphene’s properties over a wide range for varied applications, such as solar cells, electronics, optical dyes, biomarkers, composites and particulate systems. Their work has been published in Nature Communications and supports the university’s vision to become a top 50 public research university by 2025. The article is available online.

“The process produces large quantities of graphene quantum dots of controlled shape and size, and we have conducted studies on their structural and electrical properties,” Berry said.

While other researchers have been able to make quantum dots, Berry’s research team can make quantum dots with a controlled structure in large quantities, which may allow these optically active quantum dots to be used in solar cell and other optoelectronic applications.

“There will be a wide range of applications of these quantum dots,” Berry said. “We expect that the field of graphene quantum dots will evolve as a result of this work since this new material has a great potential in several nanotechnologies.”

It has been known that, because of edge states and quantum confinement, the shape and size of graphene quantum dots dictate their electrical, optical, magnetic and chemical properties. This work also shows proof of the opening of a band gap in graphene nanoribbon films with a reduction in width. Further, Berry’s team shows through high-resolution transmission electron micrographs and simulations that the edges of the produced structures are straight and relatively smooth.

New Autism Test Is As Easy As Picking Up Your Baby

Doctors are now saying there is a simple new way to test 6-month-olds for autism and other developmental delays.

A study by the Kennedy Krieger Institute found children with a high risk for autism also had weak head and neck control. A majority of the children who fit both of these descriptions were later diagnosed with autism and other social delays.

Though these studies are preliminary, these findings are the first to suggest that early delays in motor development may be a warning sign for autism.

A typical baby should be able to control its head and neck as early as 4 months. When a baby lying on its back is pulled into a sitting and then standing position, its head should remain steady and in line with the body rather than flopping back.

According to WebMD, such delays in development have been noticed in premature babies and children with cerebral palsy. This is the first time researchers are linking these delays to autism.

Dr. Rebecca Landa is the study’s author and will present her findings at the International Meeting for Autism Research.

In order to conduct these studies, Dr. Landa’s team looked for babies whose older siblings have autism, placing them at high risk for the disorder.

In one group, nearly 40 babies were given the head-lag test at 6, 14 and 24 months. Then, the same babies were tested for autism between 30 and 36 months.

At the end of the test, the babies were classified into three outcomes:

• 90 percent of infants diagnosed with ASD exhibited head lag as infants;

• 54 percent of children meeting criteria for social/communication delay had exhibited head lag as infants, and;

• 35 percent of children not meeting the criteria for social or communication delay or ASD exhibited head lag at 6 months.

In the second group, the researchers studied 6-month-olds at a single point in time, looking for the presence of head lag. Of these babies, the researchers found that 75% of high-risk infants had head lag. In a statement detailing these results, Dr. Landa said, “Our findings show that the evaluation of motor skills should be incorporated with other behavioral assessments to yield insights into the very earliest signs of autism.”

“While previous research shows that motor impairments are linked to social and communication deficits in older children with autism, the field is just starting to examine this in younger children,” said Dr. Landa. “Our initial research suggests that motor delays may have an important impact on child development.”

To continue their research, Dr. Landa’s team also conducted studies on 14-, 24- and 36-month-old babies at high and low risk of developing autism. According to this research, motor delays in children with autism become increasingly noticeable by the child’s third birthday, though not every child with autism will exhibit them.

“While more research is needed to examine why not all children with ASD experience motor delay, the results of our studies examining motor development add to the body of research demonstrating that early detection and intervention for infants later diagnosed with autism is possible and remains crucial to minimize delays and improve outcomes,” said Dr. Landa.

Popular Antibiotic May Be Responsible For Sudden Cardiac Death

Lawrence LeBlond for RedOrbit.com

Azithromycin, a common antibiotic used for treatment of bronchitis, pneumonia, and sexually transmitted diseases (STDs), appears to also boost the risk of sudden cardiac death when compared with no antibiotic treatment, according to a US study on Wednesday.

Azithromycin has been available worldwide since the 1980s, but the new study, published in the New England Journal of Medicine (NEJM), is the first to document serious heart risks with the use of the popular antibiotic.

Researchers at Vanderbilt University compared about 348,000 prescriptions of azithromycin to millions of records from people who were not treated with any antibiotics or who received the common antibiotic amoxicillin, which is considered heart safe.

Compared with no antibiotic treatment, the hazard ratio for cardiovascular death during a 5-day course of azithromycin was 2.88, according to Wayne A. Ray, PhD, of Vanderbilt University in Nashville, and colleagues. And when compared to amoxicillin, azithromycin showed a 2.49 hazard ratio.

The comparisons were made based on patient records in Tennessee from 1992 to 2006.

The analysis found 47 more deaths per million in those taking azithromycin compared to those taking amoxicillin. When the researchers examined patients already at high risk for heart problems, the chance increased to 245 more deaths per million in the azithromycin group compared to the amoxicillin group.
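The absolute risks quoted above can be restated as a “number needed to harm”: how many azithromycin courses correspond to one additional death relative to amoxicillin. The sketch below only works through that arithmetic using the article’s figures; the function name is ours, not from the study.

```python
# Number needed to harm (NNH) from excess deaths per million drug courses.
# Figures from the article: 47 extra deaths per million azithromycin courses
# overall vs. amoxicillin, and 245 per million in high-risk patients.

def nnh(excess_deaths_per_million):
    """Courses of the drug corresponding to one additional death."""
    return 1_000_000 / excess_deaths_per_million
```

So `round(nnh(47))` gives about 21,277 courses per extra death overall, falling to roughly 4,082 courses in high-risk patients, which is why the authors single out that group when suggesting alternative antibiotics.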

While the increased odds of death are small, the researchers note they are significant enough to get the attention of medical experts and also urge them to consider prescribing a different drug to those patients who are already at high risk of heart problems.

People at high risk include those with heart failure, diabetes or who have had a previous heart attack, and for those who have undergone bypass surgery or have had stents implanted. In such patients, the drug may cause fatal abnormal heart rhythms.

Ray said the concerns should not apply to children as they have very little risk of heart disease.

Azithromycin is commonly used for a number of illnesses, and is most familiarly used as the “Z-Pak,” taken for five days. It is considered a convenient treatment option because most other antibiotics need to be taken for at least 10 days.

More than 55.3 million prescriptions for azithromycin were written last year in the US, and sales topped $464 million, according to medical information and services firm IMS Health. Global sales topped $1.8 billion.

“We believe this study adds important information on the risk profile for azithromycin,” Ray told AFP. “For patients with elevated cardiovascular risk and infections for which there are alternative antibiotics, the cardiovascular effects of azithromycin may be an important clinical consideration.”

“I’m inclined to agree with Dr. Ray,” Dr. John G. Bartlett, professor of medicine at Johns Hopkins University School of Medicine and a former president of the Infectious Diseases Society of America, told the New York Times.

He said the study convinced him because it included data on a huge number of patients and because the findings were biologically plausible, given that related drugs had also been found capable of disrupting heart rhythm.

Bartlett said he would consider prescribing a different drug to patients at high risk. There are plenty of alternatives, he noted.

Bartlett also noted that the study provided another reason to halt the overuse of antibiotics, which are often prescribed for illnesses they cannot treat. Overuse of these drugs has contributed to the emergence of dangerous, drug-resistant bacterial strains.

“We use azithromycin for an awful lot of things, and we abuse it terribly,” Bartlett told New York Times writer Denise Grady. “It’s very convenient. Patients love it. ‘Give me the Z-Pak.’ For most of where we use it, probably the best option is not to give an antibiotic, quite frankly.”

Dr. Lori Mosca, director of preventive cardiology at New York-Presbyterian Hospital/Columbia University Medical Center, said azithromycin’s benefits may outweigh the small risks suggested by the study.

She noted that the study was observational rather than clinical, meaning the researchers looked back at medical records, rather than setting up experiments in which patients were assigned random treatments and then monitored. Even the most careful observational study can be misleading, Mosca said.

She said a more rigorous study should be done to verify the findings.

“It would be crazy to think we can’t use azithromycin. It’s bad to undertreat infections,” she told Grady.

Dr. Roy M. Gulick, the chief of infectious diseases at New York-Presbyterian/Weill Cornell, also said that more research was needed. But, he said: “In someone with significant cardiovascular risks or documented disease, the results of this study would be one factor that would help you choose among the antibiotics.”

“Any antibiotic is going to have risks and benefits,” Ray said. “We think this is an important piece of information about risks.”

The authors acknowledged they had potential difficulties with confounding in the study, relating to various factors such as underlying cardiovascular disease and behavioral risk factors such as smoking and diet. They addressed this by choosing controls that were matched according to propensity scores, they explained.

A second source of potential confounding was the direct effect of infections, but they limited this by having the additional control group of patients being treated with amoxicillin, which is used for similar indications as azithromycin.

Azithromycin is a macrolide antibiotic that works by stopping the growth of bacteria. Side effects may include skin rashes, itching, swelling, difficulty breathing or swallowing and rapid, pounding or irregular heartbeats, according to the American Hospital Formulary Service.

Exposure To Blasts And Concussive Injuries Have Similar Brain Effects

Connie K. Ho for RedOrbit.com

Researchers from Boston University (BU) and the Veterans Affairs Boston Healthcare System recently found chronic traumatic encephalopathy (CTE) in the brain tissue of military service personnel who were exposed to blasts. The experiment showed that a single blast from an improvised explosive device (IED) resulted in CTE and the long-term brain impairments associated with the disorder. In addition, the blast wind from an IED resulted in traumatic brain injury (TBI) and long-term consequences similar to the head traumas athletes have suffered.

CTE, a progressive neurodegenerative brain disorder, can only be diagnosed postmortem and has been seen in athletes who have suffered multiple concussions or subconcussive injuries. In CTE, abnormal deposits of the protein tau accumulate throughout the brain as neurofibrillary tangles, glial tangles and neuropil threads; these tau lesions eventually lead to brain cell death. CTE has clinical features similar to TBI, such as psychiatric symptoms and long-term memory and learning deficits. TBI affects around 20 percent of the 2.3 million men and women who have been exposed to blasts while serving in the military since 2001. Blast winds from an IED can reach 330 miles per hour, stronger than any gust of wind ever recorded on the planet.

“The force of the blast wind causes the head to move so forcefully that it can result in damage to the brain,” said Dr. Lee Goldstein, a lead author in the study.

The study is the first to look at a series of postmortem brains from U.S. military personnel who had been exposed to a blast or had sustained a concussive injury. The research, recently published in Science Translational Medicine, was a collaborative effort by Goldstein, an associate professor at Boston University School of Medicine (BUSM) and Boston University College of Engineering, and Dr. Ann McKee, a professor at BUSM and director of the Neuropathology Service for the VA New England Healthcare System. They examined brain tissue samples from four military servicemen who had been exposed to blasts or concussive injuries; three young amateur football players and a wrestler who had suffered multiple concussive injuries; and four individuals with no prior exposure to blasts, concussive injuries, or neurological disease.

The researchers found that the CTE neuropathology in the military servicemen who had been exposed to blasts was similar to that of the athletes with a prior history of concussive injuries.

“Our results showed that the neuropathology from blast exposure, concussive injury, or both were virtually indistinguishable from those with a history of repeat concussive injury,” noted McKee, director of the Brain Banks for BU’s Alzheimer’s Disease Center and the Center for the Study of Traumatic Encephalopathy, in a prepared statement.

The two also worked with experts in blast physics, experimental pathology, and neurophysiology. The team observed that a single blast of the kind experienced by military personnel could produce the neuropathological and behavioral hallmarks of CTE. Two weeks after the blast, long-term impairments in brain function were already evident.

“Our finding of clear impairments in the ability to both learn and remember one month after a blast exposure leads us to wonder just how long-lasting these impairments are, and whether they can be prevented or rescued,” remarked Dr. Libor Velisek, a professor and director of the New York Medical College Developmental Epilepsy Laboratory and Behavioral Phenotyping Core Facility who assisted in the project, in the statement.

With these results, the researchers hope to continue their studies and examine how brain injuries can be prevented. They showed that immobilizing the head during blast exposure could help prevent the associated learning and memory disabilities. This finding could assist in the development of strategies to treat and rehabilitate people who have been exposed to blasts or have suffered concussive injuries.

“Our study provides compelling evidence that blast TBI and CTE are structural brain disorders that can emerge as a result of brain injury on the battlefield or playing field,” concluded Goldstein in the statement. “Now that we have identified the mechanism responsible for CTE, we can work on developing ways to prevent it so that we can protect athletes and our military service personnel.”

Paralyzed Woman Moves Robot Arm With Her Brain


Connie K. Ho for RedOrbit.com

For one lucky woman, 15 years of paralysis was broken on Saturday, April 12, when the 58-year-old, who is unable to speak, controlled a robotic arm simply by thinking about the action. The robotic arm picked up a bottle of coffee and lifted it to her mouth, allowing her to take a sip. The feat is part of the progress made in a project involving brain-computer interfaces, restorative neurotechnology, and assistive robot technology.

The research, published in the May 17 issue of the journal Nature, is a collaborative effort by BrainGate2; it includes participation by Brown University, the German Aerospace Center (DLR), Harvard Medical School, Department of Veterans Affairs, and Massachusetts General Hospital.

“The smile on her face was a remarkable thing to see. For all of us involved, we were encouraged that the research is making the kind of progress that we had all hoped,” described lead investigator Dr. Leigh Hochberg in a prepared statement.

Besides the woman, a 66-year-old man participated in the study; both had been paralyzed by a stroke years earlier, which left them without the ability to control their limbs. The two used neural activity to control robotic arms developed by the DLR Institute of Robotics and Mechatronics and by DEKA Research and Development Corp. The robotic arms could complete reaching and grasping tasks in three-dimensional space.

BrainGate2 is a pilot clinical trial that uses the BrainGate system developed at Brown University. The system features an aspirin-sized device implanted in the motor cortex, the part of the brain responsible for voluntary movement. Its electrodes record the activity of nearby neurons, which an external computer then translates into commands to control assistive devices like the DLR and DEKA robot arms.
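Conceptually, the decoding step is a mapping from recorded firing rates to a movement command. The toy linear readout below is a hypothetical sketch of that idea only; the channel count, weights, and function name are invented here, and real BrainGate decoders are far more sophisticated and are recalibrated per session.

```python
# Hypothetical sketch: map per-channel firing rates from implanted
# electrodes to a 3-D velocity command for a robotic arm via a simple
# linear readout. All numbers are invented for illustration.

def decode_velocity(firing_rates, weights):
    """Map a vector of per-channel firing rates (Hz) to (vx, vy, vz)."""
    vx = sum(w * r for w, r in zip(weights["x"], firing_rates))
    vy = sum(w * r for w, r in zip(weights["y"], firing_rates))
    vz = sum(w * r for w, r in zip(weights["z"], firing_rates))
    return (vx, vy, vz)

# Four made-up channels and per-axis readout weights:
weights = {
    "x": [0.02, -0.01, 0.00, 0.03],
    "y": [0.00, 0.04, -0.02, 0.01],
    "z": [-0.01, 0.00, 0.03, 0.02],
}
rates = [40.0, 25.0, 10.0, 55.0]  # spikes per second on each channel
print(decode_velocity(rates, weights))
```

In the actual system this translation runs continuously, updating the arm's commanded velocity many times per second as the recorded activity changes.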

The study is the first peer-reviewed demonstration of people with tetraplegia using brain signals to operate a robotic arm in three-dimensional space and complete a task normally performed by the arm. The two subjects controlled the arms with the help of flexible supports.

“Our goal in this research is to develop technology that will restore independence and mobility for people with paralysis or limb loss,” remarked Hochberg, the Sponsor-Investigator for the BrainGate2 pilot clinical trial, in the statement. “We have much more work to do, but the encouraging progress of this research is demonstrated not only in the reach-and-grasp data, but even more so in S3’s smile when she served herself coffee of her own volition for the first time in almost 15 years.”

Some of the funding was supported through Veterans Affairs (VA), which works to improve the lives of injured veterans.

“VA is honored to have played a role in this exciting and promising area of research,” commented VA Secretary Eric Shinseki in the statement. “Today’s announcement represents a great step forward toward improving the quality of life for Veterans and others who have either lost limbs or are paralyzed.”

Researchers were also positive about the results as they observed that the female subject was able to direct the movement five years after the BrainGate was implanted. They believe that it shows the amount of time that implanted brain-computer interface electrodes can remain viable and useful in completing command signals. The report also shows how much progress has been made in brain-computer interfaces since the BrainGate was first demonstrated over ten years ago.

“This paper reports an important advance by rigorously demonstrating in more than one participant that precise three-dimensional neural control of robot arms is not only possible, but also repeatable,” explained John Donoghue, the VA and Brown neuroscientist who is a co-author in the study, in the statement. “We’ve moved significantly closer to returning everyday functions, like serving yourself a sip of coffee, usually performed effortlessly by the arm and hand, for people who are unable to move their own limbs. We are also encouraged to see useful control more than five years after implant of the BrainGate array in one of our participants. This work is a critical step toward realizing the long-term goal of creating a neurotechnology that will restore movement, control and independence to people with paralysis or limb loss.”

In the experiment, the robot arm acts as a replacement for the human arm.

“I just imagined moving my own arm and the [DEKA] arm moved where I wanted it to go,” said the male subject on his experience with the robotic arm.

The DLR arm, which was designed to assist with arm activities and to interact with human users, could be useful for people with a range of disabilities.

“This is what we were hoping for with this arm. We wanted to create an arm that could be used intuitively by varying forms of control. The arm is already in use by numerous research labs around the world that use its unique interaction and safety capabilities. This is a compelling demonstration of the potential utility of the arm by a person with paralysis,” remarked Patrick van der Smagt, head of bionics and assistive robotics at DLR, in the statement.

To further the research, the team hopes to test the technology with more individuals. In the future, they hope the robotic arms will be wireless and fully automated; at present, the sensor and the user must be connected by cables to the rest of the system. The researchers will also look into improving the precision and control of the robotic arms.

“It is a spectacular result, in many respects,” John Kalaska, a University of Montreal neuroscientist unaffiliated with the study, told the New York Times. “And really the logical next step in the development of this technology. This is the kind of work that has to be done, and it’s further confirmation of the feasibility of using this kind of approach to give paralyzed people some degree of autonomy.”

Image 2 (below): In a clinical trial, a woman used the BrainGate system to mentally control a robotic arm and reach for a drink. Credit: The BrainGate Collaboration

Cloud-Based Home Automation Unit Announced

A Los Altos, California-based startup company, founded by some of the brains behind the Apple iPhone and Google Gmail, has announced a new chip that could be the first step in connecting home appliances to the cloud, accessing them from afar, and essentially automating an entire house.
The company is named Electric Imp, and on Wednesday it unveiled “the Imp” — a chip similar in appearance to a standard SD card but equipped with an embedded processor and Wi-Fi capability, Ryan Lawler of TechCrunch reported. It can be programmed to control or measure any number of things depending on what it has been plugged into, it can be switched from one device to another with little hassle, and it will cost just $25 per card, plus a “small monthly subscription fee,” he added.
The Imp cards can be used in devices today via circuit boards sold by the company, which was founded by former iPhone engineering manager Hugo Fiennes and ex-Gmail designer Kevin Fox, along with software architect Peter Hartley. Electric Imp is also negotiating with manufacturers to have slots for the cards pre-installed on appliances and other products, Lawler and Gizmodo’s Mat Honan explained on Wednesday.
“Once installed, they connect to the Internet and Imp’s cloud-based software controls, allowing them to both be controlled remotely and work in conjunction with other connected devices,” Honan said.
“The pitch to them is that, for less than a dollar they will be able to add a slot and create powerful new applications for otherwise dumb devices… And they can do so without having to worry about hiring specialists to integrate connectivity into the device or increasing the cost of their products by adding Wi-Fi or processors that are otherwise unnecessary,” Lawler added. “Electric Imp will provide the enabling hardware — the Imp card — and it also will manage the back end service which connects all of the devices. Imp-enabled devices can be controlled either from a web browser or on a smartphone, either through Electric Imp’s own application, or through third-party apps that are developed to take advantage of the new platform.”
The TechCrunch reporter said that Fiennes, the company’s CEO, had demonstrated some of the Imp card’s applications to him on Tuesday. Among the tasks it will allow users to perform is remotely switching lights on or off, or setting energy-hungry appliances such as washing machines to run automatically when the cost of electricity is at its lowest. He also says that it could sense when a garden needs to be watered and take care of the task, or set up motion sensors that detect “unusual activity” — a possible intruder, or a lack of movement in a home-bound senior, for example.
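The electricity-price example gives a feel for the kind of logic a cloud-connected Imp application could run. The sketch below is hypothetical and does not use Electric Imp's actual API; it is plain Python showing only the scheduling idea.

```python
# Hypothetical sketch: given a forecast of hourly electricity prices,
# pick the cheapest hour to start a high-consumption appliance such as
# a washing machine. The price data below is invented for illustration.

def cheapest_start_hour(hourly_prices):
    """Return the hour (0-23) with the lowest price per kWh."""
    return min(range(len(hourly_prices)), key=lambda h: hourly_prices[h])

prices = [0.18, 0.16, 0.12, 0.11, 0.11, 0.13, 0.17, 0.22,  # 00:00-07:00
          0.25, 0.24, 0.22, 0.21, 0.20, 0.20, 0.21, 0.23,  # 08:00-15:00
          0.27, 0.30, 0.31, 0.28, 0.24, 0.21, 0.19, 0.18]  # 16:00-23:00
print(cheapest_start_hour(prices))  # hour 3, the overnight low
```

In a cloud-based setup, logic like this would live server-side, with the device itself only receiving a simple "start now" command.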
“The possibilities really are endless, especially when you consider the number of devices which don’t have connectivity now, because it’s too expensive, but could benefit from some smarts and the ability to monitor and control them from afar,” Lawler said. “Just as importantly, the Imp could enable hardware from multiple manufacturers to work together, rather than having to rely on a single vendor for a home automation system that requires a major upfront investment and, frankly, probably won’t be updated or age well with time. And because the smarts of the device can be updated from the cloud, manufacturers will be able to remotely monitor and update their products seamlessly, without consumers even knowing.”
“Until now, creating connected devices was a huge challenge for any vendor,” Fiennes told John Koetsier of VentureBeat in a statement. “Electric Imp changes all this by bringing the power of an easy to use, cloud-based service to almost any device and allowing the internet to interact with everyday objects.”
Lawler reports that the company currently has seven employees, but is looking to add 13 more within the next six months — expansion that will be assisted by $7.9 million in financing the company recently secured from Redpoint Ventures and Lowercase Capital.
On their official website, Electric Imp notes that preview units of the chip will be available along with developer kits starting in late June, and Imp-enabled products are expected to be available “from a variety of vendors” before the end of the year. They add that the devices can be monitored and controlled from Internet-enabled computers, smartphones, and “even from other Imp-enabled devices.”

National Healthy Vision Month – ZEISS Online Vision Screening Check

Connie K. Ho for RedOrbit.com

Fuzzy outlines blur your vision. You squint to make out the letters on a sign or on the computer screen. These are a few of the signs that your vision is not as clear as it used to be. To correspond with Healthy Vision Month and the 100th Anniversary of ZEISS Precision Lenses, Carl Zeiss Vision decided to launch the ZEISS Online Vision Screening Check along with a set of online educational tools.

Carl Zeiss Vision develops progressive lens, eyeglass coatings, and lens processing equipment. The ZEISS Online Vision Screening Check was created by the research department at Carl Zeiss Vision. It allows participants to obtain more information regarding their current visual acuity, contrast vision, and color vision to better understand their visual performance.

“How we really wanted to celebrate [the anniversary] was through patient education campaigns,” said Jeff Hopkins, senior manager of professional affairs at Carl Zeiss Vision. “We wanted people to understand a little more about their vision and help them understand the options that they have.”

This past November, Carl Zeiss Vision did a national survey of customers and found that 30 percent of people had problems reading, 44 percent had night vision that was inadequate, and 48 percent had visited an eye doctor only once in two years or more.

“There are vision problems going uncorrected and this online tool is a way of encouraging seeing an eye doctor,” commented Hopkins.

The Online Vision Screening Check begins with a set of instructions to allow people to adjust their monitor settings to make sure the test is performed in the correct format for the user. It lists steps in adjusting screen calibration, screen brightness, and the user´s distance from the monitor. After the test has been formatted correctly, the screen takes you to the three different exams.

The first test focuses on visual acuity, the ability to make out small patterns and structures under high-contrast conditions. In this test, a rotating letter “E” changes size, and the user presses the arrow keys to indicate which direction the “E” is pointing. Following the visual acuity test, the next portion examines contrast vision with a letter “C” that fades from dark to light; users press the arrow keys to identify the “open side” of the “C.” The contrast vision results correlate with situations involving changing light conditions. Lastly, the test lets users check for color vision deficiencies; in particular, the ability to see red, blue, and green is tested, and a deficiency is indicated if the opening of the colored “C” cannot be correctly identified.
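The pass/fail logic behind a "which way does the gap point" screening like this can be sketched in a few lines. The function and trial data below are purely illustrative and hypothetical; the real ZEISS check also calibrates stimulus size to the user's screen and viewing distance, as described above.

```python
# Hypothetical sketch: score a gap-direction screening test by comparing
# the directions shown against the user's arrow-key answers. The trial
# data below is invented for illustration.

def score_trials(answers, shown):
    """Fraction of trials where the reported gap direction was correct."""
    correct = sum(1 for a, s in zip(answers, shown) if a == s)
    return correct / len(shown)

shown   = ["up", "left", "down", "right", "left"]
answers = ["up", "left", "down", "left",  "left"]  # one mistake
print(score_trials(answers, shown))  # 0.8
```

A real screening would map a score like this onto the "good / fair / poor" bands the article mentions, with thresholds set by eye-care professionals.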

“The tools were developed with eye doctors to make sure that we weren´t leading anyone astray,” noted Hopkins. “They worked to make sure that it was accurate and to the degree to which it could be a simple tool.”

Overall, the tests are user friendly. The results come fairly quickly, about five minutes after the exam. The possible results are good, fair, or poor. Though these tests are beneficial, Zeiss states that the online test is not a replacement for a yearly checkup and recommends that people schedule annual eye exams with a professional doctor to check basic vision and eye health. The website also includes a search engine that users can utilize to look up nearby opticians.

According to Hopkins, both adults and children can take the Online Vision Screening Check.

“This can help parents see if [vision’s] a problem. Kids, like everyone else, don’t always notice a vision problem, it’s what they’re used to,” explained Hopkins. “Vision problems come slowly and gradually and you don’t really notice it. I think that anybody can use it.”

Along with the Online Vision Screening Check, the company provides tips and information on eye examinations and vision maintenance. There are various sections, including facts to know about visions and lenses, tips for selecting an eye doctor, questions to ask at an eye appointment, and facts to know about age-related vision changes. Within each section, there is also a PDF available of the notes listed on the website.

“The thing that we find over and over again is that most people don’t really understand what’s available to them in terms of lenses. They think of it as a generic product… it’s got a prescription, but there’s a world of difference among lenses and you have a lot of choices. We’re trying to help people understand that,” remarked Hopkins.

Apart from the Online Vision Screening Check and the educational tools, the company is also currently developing an iPad application.

“I think the web and the interactive applications are the perfect way to demonstrate this… demonstrate visual problems and visual alternatives you have because it’s hard for people to conceptualize what differences proceed and how you can show them,” said Hopkins. “I think that you’re going to see a lot more of these things, I think this is the ideal medium for educating people about their vision.”

Apart from the tools that Zeiss offers, people can participate in Healthy Vision Month in a number of other ways. The theme for this year’s Healthy Vision Month is “Healthy Vision: Make It Last a Lifetime.” Organizations like the Centers for Disease Control and Prevention (CDC), the National Institutes of Health’s National Eye Institute (NEI), and the American Optometric Association (AOA) are all working to promote the importance of eye health and vision care, and have listed ways in which people can advocate for good eye health.

In the CDC’s Morbidity and Mortality Weekly Report, vision impairment was reported to affect daily activities like reading, driving a car, and cooking. The agency advocates early detection, timely treatment, and proper care to prevent disorders related to eye health and vision impairment. Other organizations are making a statement about vision care as well.

“Our message stresses the important role comprehensive dilated eye exams play in detecting eye diseases in their early stages and ensuring people are seeing their best,” commented Neyal J. Ammary-Risch, M.P.H., CHES, deputy director of the NEI’s National Eye Health Education Program (NEHEP), on the American Optometric Association’s website. “Millions of people in the United States have undetected vision problems and eye conditions. We need everyone’s help encouraging people in every community to schedule eye exams. As every eye care professional knows, a comprehensive dilated eye exam can detect common vision problems and eye diseases, many of which have no early warning signs.”

Scientists Target Drug That Identifies And Attacks Breast Cancer Stem Cells

Cell surface protein blows potent cells’ cover; targeted drug works in preclinical tests

Breast cancer stem cells wear a cell surface protein that is part nametag and part bull’s eye, identifying them as potent tumor-generating cells and flagging their vulnerability to a drug, researchers at The University of Texas MD Anderson Cancer Center report online in Journal of Clinical Investigation.

“We’ve discovered a single marker for breast cancer stem cells and also found that it’s targetable with a small molecule drug that inhibits an enzyme crucial to its synthesis,” said co-senior author Michael Andreeff, M.D., Ph.D., professor in MD Anderson’s Departments of Leukemia and Stem Cell Transplantation and Cellular Therapy.

Andreeff and colleagues are refining the drug as a potential targeted therapy for breast cancer stem cells, which are thought to be crucial to therapy resistance, disease progression and spread to other organs.

“It’s been difficult to identify cancer stem cells in solid tumors,” Andreeff said. “And nobody has managed to target these cells very well.”

The marker is the cell surface protein ganglioside GD2. The drug is triptolide, an experimental drug that Andreeff has used in preclinical leukemia research. The team found triptolide blocks expression of GD3 synthase, which is essential to GD2 production.

Triptolide stymied cancer growth in cell line experiments and resulted in smaller tumors and prolonged survival in mouse experiments. Drug development for human trials probably will take several years.

Cancer stem cells are similar to normal stem cells

Research in several types of cancer has shown cancer stem cells are a small subpopulation of cancer cells that are capable of long-term self-renewal and generation of new tumors. More recent research shows they resist treatment and promote metastasis.

Cancer stem cells are similar to normal stem cells that renew specialized tissues. The breast cancer findings grew out of Andreeff’s long-term research in mesenchymal stem cells, which can divide into one copy of themselves and one differentiated copy of a bone, muscle, fat or cartilage cell.

Andreeff has shown these mobile mesenchymal stem cells home to wounds, including tumors, making them potential carriers of cancer therapy.

An important cellular transition also comes into play.

Co-senior author Sendurai Mani, Ph.D., assistant professor in MD Anderson’s Department of Molecular Pathology and Co-Director of the Metastasis Research Center, is an expert on epithelial-to-mesenchymal transition (EMT). About 85 percent of all solid tumors start in the lining of an organ, called the epithelium. Mani and colleagues at MIT showed that epithelial cells can be induced to take on stem cell properties by forcing them to undergo EMT.

“This change from stationary epithelial cells to the mobile mesenchymal stem cells is an important step in metastasis,” Mani said.

Andreeff and Mani in 2010 discovered that human mammary epithelial cells that undergo epithelial-to-mesenchymal transition act similarly to human bone-marrow-derived mesenchymal stem cells. They can home in to wounds and differentiate into the same cell types.

GD2 separates cancer stem cells from other tumor cells

In the current project, the researchers hypothesized that the cell markers expressed on the surface of mesenchymal stem cells would also be expressed on the surface of breast cancer stem cells.

They found that GD2 expression, one such mesenchymal stem cell marker, divided the breast cancer cell lines into two distinct groups: about 4.5 percent of cells were GD2-positive and about 92.7 percent were GD2-negative.

GD2-positive breast cancer cells:

-Form twice as many mammospheres, a clumping of cells considered an indicator of tumor-forming capacity, as compared to GD2-negative cells. And the spheres were three times as large.
-Migrate four times as fast as GD2-negative cells.
-Form five times as many tumors when 10 cells of each type are transplanted into mice.

GD2-positive cells also have general cancer stem cell marker

A known combination marker of cancer stem cells is high expression of CD44 and low expression of CD24 surface proteins. The researchers found 85 percent of GD2-positive breast cancer cells were CD44 high/CD24 low, while only 1 percent of GD2-negative cells shared that characteristic.

An analysis of 12 human breast cancer tumors found an even higher correlation of 95.5 percent between GD2+ cells and CD44 high/CD24 low status.

Comparing gene expression between GD2+ cells and CD44 high/CD24 low cells revealed 100 percent correlation in the expression of 231 genes.
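As a toy illustration of the kind of marker-overlap statistic reported above, the fraction of GD2-positive cells that also carry the CD44-high/CD24-low signature can be computed from per-cell marker calls. The cell records below are fabricated and chosen only so the numbers echo the 85 percent figure from the cell-line experiments; they are not the study's data.

```python
# Hypothetical sketch: compute what fraction of GD2-positive cells also
# show the CD44-high/CD24-low cancer stem cell signature. The cell
# records are invented for illustration.

def overlap_fraction(cells):
    """Fraction of GD2+ cells that are also CD44-high/CD24-low."""
    gd2_pos = [c for c in cells if c["gd2"]]
    stemlike = [c for c in gd2_pos if c["cd44_high"] and c["cd24_low"]]
    return len(stemlike) / len(gd2_pos)

cells = (
    [{"gd2": True,  "cd44_high": True,  "cd24_low": True}] * 17 +
    [{"gd2": True,  "cd44_high": False, "cd24_low": True}] * 3 +
    [{"gd2": False, "cd44_high": False, "cd24_low": False}] * 80
)
print(overlap_fraction(cells))  # 0.85
```

In practice such calls come from flow cytometry gating rather than clean boolean flags, but the overlap calculation itself is this simple.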

GD2+ cells had greater expression of genes involved in migration, invasion and epithelial-mesenchymal transition than GD2- cells. They also had a nine-fold increase in GD3 synthase, a key enzyme in the eventual synthesis of GD2.

Further experiments showed that:

-Inducing EMT raised the percentage of GD2+ cells in two breast cancer cell lines.
-Knocking down GD3 synthase cut the percentage of GD2+ cells by more than half.
-Mice injected with 1 million breast cancer cells having a small interfering RNA that blocked GD3 synthase never developed tumors even after eight weeks, while all of the control mice with active GD3S developed tumors.

Triptolide stymies tumor growth, extends survival

The researchers then used triptolide, a known inhibitor of GD3 synthase, to treat immune-deficient mice injected with breast cancer cells. Of the mice treated, 50 percent did not develop breast cancer and the other half had smaller tumors than the control mice. The treated mice also lived longer than the controls.

GD2’s function in cancer stem cells remains unclear. “As GD2 is an immune suppressant, it would be needed by cancer stem cells to counter immune cells during metastases,” said first author Venkata Lokesh Battula, Ph.D., of MD Anderson’s Department of Leukemia. “Inhibition of GD2 expression in cancer cells may enhance the inherent ability of immune cells to kill cancer cells.”


Skull Of Jurassic Sea Creature Shows That Even Dinosaurs Had Arthritis

Arthritis, an often debilitating joint disorder that affects millions and millions of people around the world, may have also caused pain and discomfort for dinosaurs and other ancient reptiles more than 150 million years ago, according to new research from the University of Bristol.

Researchers, led by Bristol´s Dr. Judyth Sassoon, were fascinated to find a degenerative condition similar to arthritis in the jaw of a female pliosaur — an ancient sea creature that lived during the Late Jurassic Period.

This is the first time an arthritis-like disease has been found in a fossilized Jurassic reptile. Though the animal was usually capable of easily tearing the flesh from its prey with its giant eight-inch teeth, the inflamed jaw eventually kept it from feeding, which ultimately led to its death.

The specimen, discovered in Westbury, Wiltshire, had been kept in the collections of the Bristol City Museum and Art Gallery, where Sassoon first saw it. The 26-foot-long pliosaur was a terrifying beast with a large, crocodile-like head, short neck, whale-like body, and four powerful flippers used for propelling it through the waters in search of prey.

The researchers, publishing their results in the journal Palaeontology, said the degenerative condition had eroded the left jaw joint of the animal, displacing the lower jaw to one side. It appears the creature lived with the crooked jaw for many years, because there are marks on the bone of the lower jaw where the teeth from the upper jaw impacted on the bone during feeding.

“In the same way that aging humans develop arthritic hips, this old lady developed an arthritic jaw, and survived with her disability for some time,” said Sassoon. “But an unhealed fracture on the jaw indicates that at some time the jaw weakened and eventually broke.”

“With a broken jaw, the pliosaur would not have been able to feed and that final accident probably led to her demise,” added Sassoon.

“You can see these kinds of deformities in living animals, such as crocodiles or sperm whales, and these animals can survive for years as long as they are still able to feed,” Professor Mike Benton, a study collaborator, told The Telegraph. “But it must be painful. Remember that the fictional whale, Moby Dick from Herman Melville’s novel, was supposed to have had a crooked jaw!”

Sassoon and colleagues found signs that the skeleton was that of an old female who had developed the condition with age. Its large size and fused skull bones suggest maturity, while its low-sitting skull crest points to its sex; experts presume males had higher crests.

The pliosaur specimen is a striking example of how the study of disease in fossils can help researchers reconstruct an animal’s life history and behavior, and it shows that even a prehistoric apex predator could succumb to the diseases of old age.

Lizard Spit Shown To Reduce Food Cravings

A new drug made from the saliva of the Gila monster lizard reduces cravings for food, according to researchers at the Sahlgrenska Academy at the University of Gothenburg.

An increasing number of patients suffering from type 2 diabetes are offered a pharmaceutical preparation called Exenatide, which helps them to control their blood sugar. The drug is a synthetic version of a natural substance called exendin-4, which is obtained from a rather unusual source — the saliva of the Gila monster lizard (Heloderma suspectum), North America’s largest lizard.

In a study with rats published in the Journal of Neuroscience, Assistant Professor Karolina Skibicka and her colleagues show that exendin-4 effectively reduces the cravings for food.

“This is both an unknown and quite unexpected effect,” comments an enthusiastic Karolina Skibicka. “Our decision to eat is linked to the same mechanisms in the brain which control addictive behaviors. We have shown that exendin-4 affects the reward and motivation regions of the brain.”

“The implications of the findings are significant,” states Suzanne Dickson, Professor of Physiology at the Sahlgrenska Academy.

“Most dieting fails because we are obsessed with the desire to eat, especially tempting foods like sweets. As exendin-4 suppresses the cravings for food, it can help obese people to take control of their weight,” suggests Professor Dickson.

Research on exendin-4 also gives hope for new ways to treat diseases related to eating disorders, for example, compulsive overeating.

Another hypothesis for the Gothenburg researchers’ continuing studies is that exendin-4 may be used to reduce the craving for alcohol.

“It is the same brain regions which are involved in food cravings and alcohol cravings, so it would be very interesting to test whether exendin-4 also reduces the cravings for alcohol,” suggests Assistant Professor Skibicka.

Surgery Returns Hand’s Function After Spinal Cord Injury


Connie K. Ho for RedOrbit.com

Surgeons at Washington University in St. Louis recently revealed that they were able to restore some hand function in a quadriplegic patient who had suffered a spinal cord injury at the C7 vertebra, the lowest bone in the neck. Rather than operating on the spine, they rerouted working nerves in the arms. These nerves can still communicate with the brain because they attach to the spinal cord above the injury.

The patient regained partial hand function, specifically the ability to bend his thumb and index finger, after undergoing surgery at Barnes-Jewish Hospital and a year of intensive physical therapy. According to AFP, the patient is a 71-year-old man who had suffered a spinal injury in a car accident; he is now able to feed himself small pieces of food and write with assistance. The study, published in the May 15 issue of the Journal of Neurosurgery, is believed to be the first reported case of restoring the ability to flex the thumb and index finger after such an injury.

“This procedure is unusual for treating quadriplegia because we do not attempt to go back into the spinal cord where the injury is,” remarked surgeon Dr. Ida K. Fox, an assistant professor of plastic and reconstructive surgery at Washington University, in a prepared statement. “Instead, we go out to where we know things work – in this case the elbow – so that we can borrow nerves there and reroute them to give hand function.”

Patients with injuries at the C6 and C7 vertebrae generally have no hand function but can move the shoulder, elbow, and some parts of the wrist, because the controlling nerves connect to the spinal cord above the injury point. The surgery cannot help patients who have lost arm function from injuries at vertebrae C1 through C5, higher up the spine; it can assist only a very specific type of patient.

“It’s very important to caution that this applies only to those with spinal injuries far enough down on the spine that there are remnants of nerves that are still functional above the injury that can be tapped into,” stated Dr. J. Marc Simard, a professor of neurosurgery, pathology and physiology at the University of Maryland School of Medicine in Baltimore, in an article by U.S. News.

Dr. Susan E. Mackinnon, chief of the Division of Plastic and Reconstructive Surgery at Washington University School of Medicine, was the first to develop and perform the surgery. She specializes in surgeries involving the peripheral nerves and has pioneered other similar procedures that restore function to the arms and legs. The surgery reported in the Journal of Neurosurgery was the first in which she applied the peripheral nerve technique to restore limb function after a spinal cord injury. Mackinnon also believes that the return of limb function owed as much to intensive therapy as to the surgery itself; therapy helps the brain relearn how the rerouted nerves bend the elbow and complete other limb movements.

“Many times these patients say they would like to be able to do very simple things,” discussed Fox in a statement. “They say they would like to be able to feed themselves or write without assistance. If we can restore the ability to pinch, between thumb and index finger, it can return some very basic independence.”

Regarding the surgery, Mackinnon doesn’t believe there’s a window of time in which it needs to be done. According to the report, the patient underwent the surgery two years after his spinal injury. As long as the nerves remain connected to the spinal cord, the nerves and related muscles stay healthy even long after injury.

“The spinal cord is the control center for the nerves, which run like spaghetti all the way out to the tips of the fingers and the tips of the toes,” noted Mackinnon, the director of the School of Medicine’s Center for Nerve Injury and Paralysis, in a statement. “Even nerves below the injury remain healthy because they are still connected to the spinal cord. The problem is that these nerves no longer ‘talk’ to the brain because the spinal cord injury blocks the signals.”

During the surgery, Mackinnon operated in the upper arms to work around the patient’s C7 spinal cord injury. The working nerves and the non-working nerves run parallel to each other, so Mackinnon was able to take a non-working nerve and plug it into a working nerve that controls the muscles that flex the elbow. Following the surgery, the biceps still flexed the elbow, while the brachialis, another elbow-flexing muscle, was rerouted to help bend the thumb and index finger.

“This is not a particularly expensive or overly complex surgery,” Mackinnon explained in the statement. “It’s not a hand or a face transplant, for example. It’s something we would like other surgeons around the country to do.”

The surgery will have lasting effects on those who undergo it.

“One of the issues with techniques such as this is the permanence of the outcome – once done it is hard to reverse. There is an inevitable sacrifice of some healthy function above the injury in order to provide more useful function below,” said Dr. Mark Bacon, the director of research at the charity Spinal Research, to the BBC. “This may be entirely acceptable when we are ultimately talking about providing function that leads to a greater quality of life. For the limited number of patients that may benefit from this technique this may be seen as a small price to pay.”

Medical professionals are optimistic that this surgery could help other individuals with similar injuries.

“One element that is unusual in this case is success in a 71-year-old, because older individuals typically have much lower nerve and regenerative potential,” said Dr. Lewis Lane, chief of hand surgery at North Shore University Hospital in New York, in an interview with AFP.

Women Cope Better Than Men After Kidney Surgery, But More Likely To Receive Blood

Women do better than men after surgical removal of part or all of a cancerous kidney, with fewer post-operative complications and lower in-hospital mortality, although they are more likely to receive blood transfusions related to their surgery.

But Henry Ford Hospital researchers who documented these gender differences can’t say why they exist.

The results of the new study, based on population samples from throughout the U.S., will be presented this week at the American Urological Association’s Annual Meeting in Atlanta.

“This is a controversial area,” says Quoc-Dien Trinh, M.D., a Fellow at Henry Ford Hospital’s Vattikuti Urology Institute and lead author of the study.

“While the effects of gender on the outcome of many types of surgery, including removal of the bladder, have been demonstrated and widely debated, the association between gender and surgical outcomes of nephrectomy (kidney removal) is not well understood.”

Physical differences between genders can explain different outcomes in some types of surgery, and have been shown and discussed in several earlier studies, Dr. Trinh says. “But this is hard to explain for nephrectomies. There is no clear-cut anatomical difference between men and women that would explain why it’s easier or harder in one sex than the other.”

Surgical removal of part or all of a diseased kidney, whether using traditional “open” techniques or less-intrusive laparoscopic procedures, is the standard of care for kidney cancer and the only curative treatment.

Using nephrectomy data from 1998 through 2007, the most recent available from the Health Care Utilization Project’s Nationwide Inpatient Sample (NIS), “we tested the rates of blood transfusions, extended length of stay (beyond the median of five days), in-hospital mortality, as well as complications during and after surgery, separated by gender,” Dr. Trinh says.

Of the total 48,172 cases that were identified and examined, 18,966 (39.4 percent) were female. The mean age for the women was 62.7 years; for the men, 61.8 years.

While no significant gender-related differences were found in complications during surgery and length of hospital stay after surgery, the Henry Ford Hospital researchers found that women:
-Were less likely than men (14.6 percent vs. 17.1 percent) to have complications after nephrectomy. These included digestive problems, hemorrhage, cardiac complications and infections.
-Were less likely to die while in the hospital (0.6 percent vs. 0.8 percent).
-Were more likely to have blood transfusions than men (11.5 percent vs. 9.2 percent).

The differences were most pronounced after partial removal of a cancerous kidney using open surgical techniques.
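As a quick sanity check, the case counts and percentages reported above are internally consistent. The sketch below uses only figures from the article; the implied per-group tallies are our own illustrative rounding, not numbers from the study:

```python
# Sanity-check the reported Henry Ford nephrectomy figures.
# Case counts and percentages are taken from the article above;
# the implied per-group tallies are illustrative rounding only.
total_cases = 48_172
female_cases = 18_966
male_cases = total_cases - female_cases  # 29,206

female_share = female_cases / total_cases
print(f"Female share: {female_share:.1%}")  # matches the reported 39.4%

# Approximate numbers of patients behind the complication rates
female_complications = round(female_cases * 0.146)  # 14.6% of women
male_complications = round(male_cases * 0.171)      # 17.1% of men
print(female_complications, male_complications)
```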
Because researchers didn’t have specific data on the kidney tumors in each individual case, exactly why these differences exist couldn’t be determined, Dr. Trinh says, adding, “It is entirely possible that women have smaller tumors, so there are fewer complications.”

As to the higher rate of blood transfusion in female patients, Dr. Trinh says the reason or reasons remain unknown.

“The threshold for transfusion could be different between men and women,” he says. “We usually transfuse based on clinical decisions: we look at the patients, how they’re doing, and make judgments. There are no clear-cut guidelines.

“In the end, women are transfused more often. Is it because they bleed more often? Are they transfused more liberally than men? We don’t have the data to answer this.”

While these differences remain to be explored and explained, the fact that they exist, and have now been documented, is a step forward in the treatment of kidney cancer.

“Insight into the effect of gender on major urologic oncology procedures,” Dr. Trinh says, “is critical in reducing disparities in care and improving patient outcomes.”


Almost One Tenth Of Western Hemisphere Mammals In Danger From Climate Change

A new study led by Carrie Schloss, an analyst in environmental and forest sciences at the University of Washington, finds that nine percent of the Western Hemisphere’s mammals, and nearly forty percent in particular regions, will be unable to keep pace with climate change. Some mammals simply move too slowly to escape the changing climate in their natural habitats and cannot relocate to new areas. The study seeks to determine whether mammals can actually cope with these conditions by moving.

For the past decade, scientists have been identifying areas that could accommodate the animals in the event that their natural habitats become threatening or uninhabitable. The University of Washington is releasing the paper the week of May 14 in the online journal Proceedings of the National Academy of Sciences.

The study focused on 493 species of mammals, ranging from a shrew as small as a dime to a moose weighing in at 1,800 pounds. Climate change was the only driver of dispersal considered in the study.

Schloss stated, “We underestimate the vulnerability of mammals to climate change when we look at projections of areas with suitable climate but we don’t also include the ability of mammals to move, or disperse, to the new areas.”

“Indeed, more than half of the species scientists have in the past projected could expand their ranges in the face of climate change will, instead, see their ranges contract because the animals won’t be able to expand into new areas fast enough,” stated co-author Josh Lawler, UW associate professor of environmental and forest sciences.

To determine the rate at which a species must move into new areas, UW researchers drew on earlier work by Lawler that maps certain species and the areas they need to survive. That work also incorporates ten global climate models projecting the rate at which climates may shift, and a mid-high greenhouse gas emissions scenario established by the UN Intergovernmental Panel on Climate Change. With this information, along with the assumption that a range shifts one generation at a time, the researchers examined individual species to gauge their prospects for survival.

Factors including body size and reproduction rate influence how fast a species can travel. For example, a small mouse cannot move far in one generation, but because mice reproduce quickly, successive generations can spread faster than those of creatures that reproduce and mature at a slower rate.

According to Schloss, primates in the Western Hemisphere, for instance, have a slower dispersion rate because they mature within several years and this makes them appear more susceptible to climate change. In addition to this, species in tropical areas are not able to move as quickly into new territories as mammals in mountainous regions, where they simply move to a better elevation for their preferred climates. Not only is it more difficult for tropical creatures to reach a suitable area, but these areas are likely to diminish in the future.

“Those factors mean that nearly all the hemisphere’s primates will experience severe reductions in their ranges,” stated Schloss, “on average about 75 percent. At the same time species with high dispersal rates that face slower-paced climate change are expected to expand their ranges.”

Lawler stated, “Our figures are a fairly conservative — even optimistic — view of what could happen because our approach assumes that animals always go in the direction needed to avoid climate change and at the maximum rate possible for them.” He added that the researchers were also conservative in accounting for human-made obstacles, such as cities and croplands, that animals encounter.

Researchers used a formula developed prior to the study to understand how human activity affects the travel patterns of animals. The “average human influence” formula identifies areas where animals would encounter heavy human development; however, it does not require an animal to detour around and completely avoid human settlements.

Lawler stated, “I think it’s important to point out that in the past when climates have changed — between glacial and interglacial periods when species ranges contracted and expanded — the landscape wasn’t covered with agricultural fields, four-lane highways and parking lots, so species could move much more freely across the landscape.”

“Conservation planners could help some species keep pace with climate change by focusing on connectivity — on linking together areas that could serve as pathways to new territories, particularly where animals will encounter human-land development,” stated Schloss. “For species unable to keep pace, reducing non-climate-related stressors could help make populations more resilient, but ultimately reducing emissions, and therefore reducing the pace of climate change, may be the only certain method to make sure species are able to keep pace with climate change.”


Low-Calorie Slurpee Will Freeze Your Brain This Summer

The nation’s largest convenience chain, 7-Eleven, has announced Slurpee Lite, a new low-calorie line of Slurpees for this summer, in hopes of freezing your brain while keeping extra calories out of your system.

Bruce Horovitz for USA Today reports that Slurpee Lite will target females in their 20s with this tagline: “All flavor. No sugar.” The chain has sold saccharin-based Slurpees regionally, but the 45-year-old treat is now rolling out with Splenda as a sweetener, following a nationwide trend of treats and foods hinting at a slimmer profile.

Companies have been marketing everything from “Spam Lite” to “skinny cocktails” aimed at calorie-conscious consumers. “You have to wonder what would happen to the obesity epidemic if light products tasted better,” Lynn Dornblaser, new products guru at research firm Mintel, tells Horovitz.

7-Eleven’s 8-ounce Slurpee Lite Fanta Sugar-Free Mango has 20 calories, much less than the 66 calories in an 8-ounce Fanta Wild Cherry Slurpee drink, the best-selling conventional Slurpee.

“We talked to a group who said they would drink Slurpees more often if we take out the sugar and reduce the calories,” says Laura Gordon, vice president of brand innovation. To get folks familiar with the line, 7-Eleven will offer free 7.11-ounce Slurpees on “SlurpFree Day” May 23.

Lite Mango (and other flavors) will be available immediately; two more sugar-free flavors, strawberry banana and cherry limeade, are due later this summer.

“Now it’s just a different kind of junk food,” says Neal Barnard, nutritionist and adjunct associate professor of medicine at George Washington University. “This should not be mistaken as any kind of corporate responsibility. They’re just trying to sell you the same stuff in a different package.”

One overriding problem with most light foods and drinks is the perception that they taste awful. Still, 80 percent of US consumers say they’re interested in low-calorie, low-fat or low-sugar foods. Forty-three percent, however, say the biggest challenge to dieting is the taste of diet foods, reports Mintel.

7-Eleven insists it’s nailed low-cal taste. But Barnard warns, “Slurpee had zero nutritional value then, and it has zero nutritional value now.”

“7-Eleven carries lots of fresh foods, beverages and snack items for people looking for better-for-you options,” said Patsy Ross, 7-Eleven’s registered dietitian.

“Our stores have something for everyone whether they’re counting calories, fat or sugar grams. Slurpee Lite offers a beverage alternative to consumers concerned about sugar intake and calorie consumption, whether for dieting reasons or a medical necessity, such as diabetes,” Ross said.

Human Embryonic Stem Cells Used To Grow Bone Tissue

A New York Stem Cell Foundation (NYSCF) scientist has shown in new research that human embryonic stem cells can be used to grow bone tissue grafts for use in research and potential medical applications.

Dr. Darja Marolt, an investigator at the NYSCF, is the lead author of the study, which was published this week in the online edition of the Proceedings of the National Academy of Sciences (PNAS).

It is the first example of using bone cell progenitors derived from human embryonic stem cells to grow compact bone tissue in quantities large enough to repair centimeter-sized defects. When implanted in mice and studied over time, the implanted bone tissue supported blood vessel in-growth, and continued development of normal bone structure, without demonstrating any incidence of tumor growth.

This is a significant step forward in using pluripotent stem cells to repair and replace bone tissue in patients, the researchers noted. Bone replacement therapies are relevant for treating patients with a variety of conditions, including wounds, birth defects, and other traumatic injuries.

Dr. Marolt conducted this research as a post-doctoral NYSCF — Druckenmiller Fellow at Columbia University in the laboratory of Dr. Gordana Vunjak-Novakovic. Since conducting this work, Marolt has continued to build upon the research, developing bone grafts from induced pluripotent stem (iPS) cells.

Like embryonic stem cells, iPS cells can give rise to nearly any type of cell in the body, but iPS cells are produced from adult cells and as such are individualized to each patient. Marolt hopes that by using iPS cells to engineer tissue, she can develop personalized bone grafts that will avoid immune rejection and other implant complications.

The New York Stem Cell Foundation conducts cutting-edge translational stem cell research in its laboratory in New York City and supports research by stem cell scientists at other leading institutions around the world.

Expectations, Previous Experiences Influence Perception Of Pain

Connie K. Ho for RedOrbit.com
For many people, pain and needles go hand in hand. They shudder at the thought of going to the dentist or the doctor because of those pointy items. A new study examines these preconceived notions of pain and needles, looking specifically at how such thoughts can affect a patient’s experience. The project, completed by a group of German researchers, found that past experiences with needle pricks, combined with information received before having an injection, can influence the experience of pain.
The new research is published in the May issue of the journal Pain.
“Throughout our lives, we repeatedly experience that needles cause pain when pricking our skin, but situational expectations, like information given by the clinician prior to an injection, may also influence how viewing needle pricks affects pain,” noted lead author Marion Höfle, a doctoral student in the research Multisensory Integration group led by Dr. Daniel Senkowski of the Department of Psychiatry and Psychotherapy at Charité University Medicine Berlin, in a prepared statement.
In the experiment, study participants watched various clips while receiving painful or painless electric stimuli on their hand. The clips included images of a needle pricking a hand, a Q-tip touching a hand, and a hand on its own. The images were shown on a screen placed above the participants’ own hand, creating the impression that the hand in the video was their own.
The participants reported feeling varying degrees of pain. When they watched clips of needles pricking a hand, they perceived their pain as more intense and unpleasant than when they saw images of the hand alone. Likewise, seeing the needle prick a hand was more painful for the participants than seeing a Q-tip touch a hand. These self-reported results were aligned with increased activity in the autonomic nervous system, which the researchers measured through pupil dilation responses. The results showed how previous experiences with needles could affect the degree of pain participants felt when viewing them.
The study also found that situational expectations could affect a person’s perception of pain. Before the stimulation, the scientists told participants that either the needles or the Q-tips were more often paired with painful electrical stimulation than with non-painful stimulation. When participants viewed clips they had been led to associate with pain, they reported more intense pain than when they watched clips less associated with pain. This showed that expectations about particular experiences influence the intensity of the pain patients feel during treatment.
Overall, researchers came to understand that the study showcased new findings regarding pain and the expectation of pain.
“Clinicians may be advised to provide information that reduces a patient’s expectation about the strength of forthcoming pain prior to an injection,” commented Höfle, also of the Charité – Universitätsmedizin Berlin and the University Medical Center Hamburg-Eppendorf. “Because viewing a needle prick leads to enhanced pain perception as well as to enhanced autonomic nervous system activity, we’ve provided empirical evidence in favor of the common advice not to look at the needle prick when receiving an injection.”

Diet Choices Influenced By Food Combinations And Past History

Connie K. Ho for RedOrbit.com

“You are what you eat.” This is a well known phrase that has been mentioned many times in discussions related to health and nutrition. A new report discusses how the combinations of what you eat can affect your consumption, and also how the diet choices you made as a child could affect the diet choices you make as an adult.

Two researchers, T. Bettina Cornwell of the University of Oregon (UO) and Anna R. McAlister of Michigan State University (MSU), recently found that water could change the way people eat. Their findings were published recently in Appetite, an international research journal that focuses on the cultural, sensory, and physiological influences on diet choices as well as the consumption of particular foods and drinks.

The article by Cornwell and McAlister covered two separate studies. In one, 60 young adults from the U.S., between the ages of 19 and 23, were surveyed on their pairings of food and beverages. The other experiment involved 75 U.S. children, between the ages of three and five, and their consumption of certain beverages and vegetables. The preschoolers were observed on different days under different conditions involving drinks served with vegetables.

The scientists found that older participants liked consuming salty foods and soda together instead of having soda with vegetables. Preschoolers tended to eat more raw vegetables, like carrots or red peppers, when these foods were served with water rather than a sugary drink. The findings in the report showed that people are influenced by diet choices that they make when they´re younger. They also tend to eat out of habit more than anything else.

“Our taste preferences are heavily influenced by repeated exposure to particular foods and drinks,” explained Cornwell, the Edwin E. & June Woldt Cone Professor of Marketing in the Lundquist College of Business at UO, in a prepared statement. “This begins early through exposure to meals served at home and by meal combinations offered by many restaurants. Our simple recommendation is to serve water with all meals. Restaurants easily could use water as their default drink in kids’ meal combos and charge extra for other drink alternatives.”

With these studies, McAlister believes that serving water with meals could help change dietary choices and could help combat the nation’s obesity epidemic. In recent years, there has been a rise in the number of young adults with diabetes as well as a general increase in the cost of health care. Furthermore, Cornwell stated that drinking water during meals could reduce dehydration, a condition reportedly seen in 75 percent of adults in the U.S. She believes that, at a young age, children come to associate sweet, high-calorie drinks like sodas with fatty, high-calorie foods like French fries.

“While this combining seems as normal as rainfall in Northwest winters, when we look cross-culturally we can see that food-and-drink combinations are developed preferences,” continued Cornwell in the statement. “If the drink on the table sets the odds against both adults and children eating their vegetables, then perhaps it is time to change that drink, and replace it with water.”

The report findings show how diet choices made early on could impact a person´s nutrition choices later on.

“From a policy perspective, this means that we need to focus on early preference formation,” remarked McAlister in a prepared statement.

Others in the medical profession believe that the report could address overarching issues in the marketing and distribution of food.

“This important research has broad ramifications for how foods are marketed and served,” noted Kimberly Andrews Espy, vice president for research and innovation at UO, in a press release. “Addressing the early contributors of unhealthy eating that contribute to obesity is important for our general well-being as a nation and, especially, for improving the nutritional choices our children will make over their lifetimes.”

Before completing these studies, Cornwell and McAlister also published an article in the January 2011 issue of Appetite on how children’s taste preferences for salty, sugary, fatty foods were connected with their awareness of fast food and soda brands.

Does Sudden Cardiac Arrest Happen More Often With HIV/AIDS Patients?

Groundbreaking 10-year UCSF research examines causes of death among HIV patients in San Francisco

What is the connection, if any, between sudden cardiac death and people with HIV/AIDS? And can that knowledge help prolong their lives?

In a comprehensive, 10-year UCSF study, researchers found patients with HIV/AIDS suffered sudden cardiac death at a rate four times higher than the general population.

“As part of my ongoing research in 2010, we were looking at every instance of sudden death in San Francisco,” said first author Zian H. Tseng, MD, an electrophysiologist and an associate professor of medicine in the UCSF Division of Cardiology. “I noticed that many of these cases involved individuals with HIV infection who were dying suddenly. I wondered if there was some sort of connection there.”

He posed this question to Priscilla Hsue, MD, a UCSF associate professor of medicine and the director of the HIV Cardiology Clinic at San Francisco General Hospital and Trauma Center (SFGH), who is one of a few cardiologists in the country who specializes in HIV. To her knowledge, no one had ever explored the link between HIV and sudden death, and that is when they began collaborating on this research.

In a paper scheduled to be published May 15 in the Journal of the American College of Cardiology, Tseng, Hsue and other researchers conducted a retrospective study of 2,860 HIV patients from April 2000 to August 2009 at SFGH’s Ward 86, the first HIV/AIDS-specialized clinic, to comprehensively characterize all deaths. They studied medical records, death certificates, paramedic reports, and interviews with family members, doctors, and other clinicians.

Sudden Cardiac Death and HIV/AIDS

During that period, eight percent of the patients died during an average of 3.7 years of follow-up. Cardiac-related deaths accounted for 15 percent of overall mortality. Of that group, 86 percent died of sudden cardiac death.

“To put that in context, we’re able to compare the rate of sudden death in this population with the overall San Francisco population,” Tseng said. “So adjusted for age, race, demographics, and other variables, the rate of sudden death in the HIV population is more than four times higher than the general population.”
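The percentages reported above translate into fairly small absolute numbers within the 2,860-patient cohort. The back-of-envelope sketch below uses only figures from the article; the rounding to whole patients is ours, not the study’s:

```python
# Rough absolute counts implied by the reported UCSF cohort percentages.
# Cohort size and percentages come from the article; rounding is ours.
cohort = 2_860
deaths = round(cohort * 0.08)                  # ~8% died during follow-up
cardiac_deaths = round(deaths * 0.15)          # ~15% of deaths were cardiac-related
sudden_cardiac = round(cardiac_deaths * 0.86)  # ~86% of those were sudden
print(deaths, cardiac_deaths, sudden_cardiac)
```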

“The fact that the vast majority of cardiac deaths were sudden is surprising and implies that we as clinicians need to be aware of this potential health issue among patients with HIV,” Hsue added. “Our findings also highlight many things that we still don’t know about HIV and sudden death. Did these individuals die of unrecognized coronary artery disease? What can we be doing as clinicians to identify patients at risk and to intervene beforehand?”

Categorizing Sudden Cardiac Death

By 2003, sudden cardiac death made up the largest number of non-AIDS deaths among HIV-positive patients in San Francisco. These deaths were largely among individuals with evidence of well-controlled HIV disease.

Researchers used established, published criteria to retrospectively classify each death as either HIV-related or sudden cardiac death. If there was any doubt, they classified the death as HIV-related rather than sudden.

“In other words, for someone with a CD4 (T-cell) count less than 50 who died suddenly, we classified that as an HIV death, rather than a sudden death because of the profound immunodeficiency,” Tseng said.

More than 17,000 people with AIDS died in the United States in 2009, and more than 619,000 have died since the epidemic began. Still, the number of people living with HIV continues to rise. More than 1.2 million people in the United States are HIV-positive, according to the U.S. Department of Health & Human Services.

“Now that HIV-infected individuals are living longer with the benefit of antiretroviral therapy, non-AIDS conditions are becoming increasingly important and at the top of this list is cardiovascular disease,” Hsue said.

Researchers believe HIV changes the electrophysiology of the heart in a way so pronounced that it causes conduction abnormalities. And many HIV medications can throw off the heart’s electrical cycle by prolonging the QT interval, which increases the risk of sudden death. These and other variables could be contributing factors.

“Acknowledging the limitations of a retrospective analysis, what’s exciting about this study is that it opens up many related questions we can ask in future studies, such as which high-risk patients might benefit from defibrillator implantation?” Tseng said.

Tseng is in the middle of a prospective citywide study on sudden cardiac death, including studying HIV patients and monitoring their progress.


Cases Related To Ingestion Of Batteries Have Increased Over 20 Years

Connie K. Ho for RedOrbit.com

“Every 90 minutes, a child younger than 18 years of age is seen in a US emergency department for a battery-related problem.” That statistic comes from a report by researchers at the Center for Injury Research and Policy of The Research Institute at Nationwide Children’s Hospital, which focuses on issues related to pediatric death and disability. The researchers found that the yearly number of battery-related emergency department visits among children under the age of 18 has more than doubled over 20 years.

The report, recently published online and featured in the June 2012 print edition of Pediatrics, stated that there were 2,591 battery-related emergency department visits in 1990 and 5,525 in 2009. Researchers used data from the National Electronic Injury Surveillance System (NEISS), which is run by the U.S. Consumer Product Safety Commission. The NEISS provides information on consumer product-related injuries, as well as sports and recreation-related injuries, treated in hospitals throughout the U.S. According to HealthDay, the four types of accidental contact were swallowing the battery or inserting it into the ear, mouth, or nose. Boys accounted for the majority of the ER visits (approximately 60 percent).
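The “more than doubled” claim follows directly from the two visit counts in the NEISS data cited above; a quick check (counts from the article, arithmetic ours):

```python
# Verify the growth in battery-related ER visits reported in the article.
visits_1990 = 2_591
visits_2009 = 5_525

growth = visits_2009 / visits_1990
pct_increase = (visits_2009 - visits_1990) / visits_1990 * 100
print(f"{growth:.2f}x")        # ~2.13x, i.e. "more than doubled"
print(f"{pct_increase:.0f}%")  # ~113% increase over the 20-year span
```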

“They’re shiny, they’re small, and children explore things developmentally with their mouth — if they don’t know what something is, they put it in their mouth,” Dr. Nicholas Slamon, a pediatrician at Nemours/Alfred I. duPont Hospital who has worked on battery-related injuries, told Reuters.

The researchers also found that over 75 percent of battery-related emergency visits were by children five years old or younger. Within this group, one-year-olds had the highest number of visits to the emergency room. Twenty-nine percent of cases involved batteries that were meant to be used in toys or games. The majority of the other cases involved batteries from products not intended for use by children, such as watches (14 percent), calculators (12 percent), flashlights (9 percent), and remote controls (6 percent).

“We live in a world designed by adults for the convenience of adults, and the safety of children is often not considered,” commented Dr. Gary Smith, the director of the Center for Injury Research and Policy at Nationwide Children’s Hospital, in a prepared statement. “Products with easily-accessible battery compartments are everywhere in our homes today. By making a few simple design changes and strengthening product manufacturing standards, including products not intended for use by young children, we could prevent many of the serious and sometimes fatal injuries that occur when children are able to easily access button batteries in common household products.”

The team of researchers also looked at the different types of batteries that children swallowed and found that 84 percent of cases involved button batteries, which they attribute to the many home electronics powered by them. A recent report also found an increase in the number of fatal and severe button battery ingestions, a rise the researchers linked to the increased use of three-volt, 20-millimeter lithium button batteries.

“The increased prevalence of the higher voltage 20mm lithium batteries is concerning because it coincides with an alarming 113 percent increase in battery ingestions and insertions by young children,” commented Smith, who is also a professor of pediatrics at Ohio State University´s College of Medicine, in the statement. “When a button battery is swallowed and gets caught in a child’s esophagus, serious, even fatal injuries can occur in less than two hours.”

For parents and guardians, there are a number of tips to help prevent battery-related ingestions and injuries. First, they can tape the battery compartments of household devices shut. Second, they can store battery-powered devices in high places that are out of children’s reach. Third, they can spread awareness to other parents and guardians so that more children are protected. Parents who believe that their child has swallowed a button battery should seek medical attention immediately, so that an X-ray can determine whether the battery has lodged in the esophagus. According to ABC News, there are a few warning signs that a child may have swallowed a battery; these include drooling, difficulty swallowing, and vomiting.

“Children should never be unattended and they should never be within reach of any object that can fit through a choke tube, which is basically the cardboard tube of a toilet-paper roll,” explained Dr. Lee Sanders, an associate professor of pediatrics at Stanford University, in the US News article. “That’s the best preventive strategy.”

As for manufacturers and companies, the researchers hope these groups will post warnings on battery and product packaging about the hazard button batteries pose to children. They also recommend that products be designed so that a screwdriver is needed to open the battery compartment, or that they include a child-resistant locking mechanism.

Overall, medical professionals have indicated concern about the rise in battery ingestions.

“Whenever we see a marked rise in any cause of injury for a child, it’s concerning from a public-health standpoint,” continued Sanders. “So we need to investigate the root cause of this doubling. One possibility is that there is, in fact, increased exposure to button batteries themselves. But of course we might have to also look at other causes, like changes in the actual reporting of cases that might have taken place as the system for reporting improves or the coding for reporting improves.”

Fat Gets To Your Gut Faster Than Previously Believed

Michael Harper for RedOrbit.com

You remember the old phrase “A moment on your lips, a lifetime on the hips?”

The saying quoted by doting mothers and grandmothers everywhere is usually the last thing we hear before shoving a piece of cake into our mouths, begging the food gods to silence the inner voices.

No one needs to be told that the fat we ingest turns into fat in our bodies. Going even further, it’s common knowledge that the fattier the food — and usually the more delicious the food — the worse it will eventually be for us. Despite this, those ever curious scientists couldn’t leave well enough alone and decided to give us even more reason to tentatively approach the dinner table.

In a story which could either be taken as good news or bad news, a new report suggests the fat we ingest from foods can wind up in our midsections within hours of eating.

Good news for those who thought they were going crazy as they loosened their belt after a meal, bad news for everyone else.

According to researchers at Oxford University, the equivalent of 2 to 3 teaspoons of whatever you are eating can end up on your waist much quicker than previously thought. Not to be outdone, your hips, thighs and rear-end will begin to plump up if you continue to overeat, just like your mother always said it would.

Fredrik Karpe and Keith Frayne conducted the study, and found the first fat from any meal arrives in the blood within one hour of ingestion.

After 3 to 4 hours, the researchers found most of the fat had been adopted by the adipose tissue near the waist, where most short-term fat ends up.

Karpe and Frayne’s paper has been published in Physiological Reviews.

If you’re wondering why such a study was conducted, it turns out there may have been some method to their utter madness.

Thanks to their research, we now have a better understanding of the way fat works. The fact that overeating and eating fatty foods adds mass to your gut is clearly nothing new. What makes this study different is the speed at which this fat travels from your mouth to your blood to your waistline.

According to the Telegraph, Karpe, a professor of metabolic medicine, says, “The process is very fast. The cells in the adipose tissue around the waist catch the fat droplets as the blood carries them and incorporate them into the cells for storage.”

“If you eat too much, you don’t get into this phase of starting to mobilize it. There will just be constant accumulation and you will start to put on weight.”

Not feeling guilty yet? The research also shows that fit people are better able to get rid of this fat than their overweight counterparts. As it turns out, exercise helps keep the pounds off for a longer period of time, as the workout turns your body into a fat-burning machine.

So, while we may continue to try and utilize fancy pills and machines which promise maximum results with minimal effort, it seems the best approach for weight loss may be the tried and true method of eating less and exercising more. After all, if scientists say it’s true, then it must be true, right?

Russian Satellite Takes 121 Megapixel Image And Video Of Earth

Lee Rannals for RedOrbit.com

A Russian weather satellite has taken a 121 megapixel image of planet Earth from over 22,000 miles above the surface.

The image was taken by Russia’s Electro-L satellite, and unlike other images taken by spacecraft, it was captured in one single shot.

Normally, several images are taken by a space agency, then are stitched together in order to create one big image.  However, Roscosmos went about it another way with its 121 megapixel photo.  Each pixel in the image represents a little over a half-mile of the Earth’s surface.
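As a quick sanity check of the quoted figure, a square 121-megapixel frame is 11,000 pixels on a side; dividing Earth's equatorial diameter (about 7,918 miles, an assumption not stated in the article) by that width gives roughly the per-pixel scale:

```python
import math

# Assumed inputs: a square 121-megapixel frame and an equatorial
# diameter of ~7,918 miles (neither figure appears in the article itself).
pixels_per_side = math.sqrt(121e6)               # 11,000 pixels across
earth_diameter_miles = 7918
miles_per_pixel = earth_diameter_miles / pixels_per_side

print(f"{miles_per_pixel:.2f} miles per pixel")  # ~0.72
```

About 0.72 miles per pixel, which matches the article's "a little over a half-mile" under these assumptions.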

The Electro-L satellite captures a picture of this quality every half-hour to monitor the weather on Earth.  If a strange weather pattern is seen, the Russian operators can remotely command the satellite to take images every 10 minutes.

The image combines visible and near-infrared wavelengths, so vegetation appears red rather than green.

The Russian satellite, which launched in January 2011, sits in an orbit that matches the Earth’s rotation, known as a geostationary orbit, so that it remains over a fixed point on the planet.

A time-lapse video has been created using images taken by the Electro-L satellite’s 121-megapixel camera. The video is composed of about 350 shots.

During the video, the reflection of the sun can be seen grazing across the water as the Earth goes from morning, to afternoon, and into nightfall.

Clouds slowly move across the blue ball, giving scientists ample data to determine what weather patterns are developing.

Nature’s Mathematical Formula For Survival

[ Watch the Video ]
Geometric patterns link structure to function in leaves
The vascular system of a leaf provides its structure and delivers its nutrients. With the use of fluorescent dye and time-lapse photography, details of nature’s mathematical formula for survival begin to emerge.

Mother Nature is tough to beat when it comes to optimizing form with function.

Marcelo Magnasco, a mathematical physicist at Rockefeller University in New York, says, “When looking at the detail you can see beautiful arrangements of impinging angles where the big veins meet the little veins and how well they are arranged.”

Magnasco and his colleague, physicist Eleni Katifori, analyze the architecture of leaves by finding geometric patterns that link biological structure to function, with support from the National Science Foundation (NSF).

They studied a specific vascular pattern of loops within loops found in many leaves, down to the microscopic level. It is a pattern that can neutralize the effect of a wound to a leaf, such as a hole in its main vein: nutrients bypass the hole, leaving the rest of the leaf intact.

“Something that looks pretty is pretty for good reason. It has a well defined and elegant function. We scan the leaves at extremely high resolution and reconstruct every single little piece of vein,” according to the researchers.

Magnasco and Katifori digitally dissect the patterns, level by level. “It was very hard to find a unique way of enumerating how they are ordered. Our idea was to start at the very bottom, counting all of the individual little loops,” recalls Magnasco.

“This research is a unique interdisciplinary partnership in which physics is used to address biological problems. We believe the mathematical and physical sciences will play a huge role in biomedical research in this century,” says Krastan Blagoev, director for the Physics of Living Systems program in NSF’s Mathematical and Physical Sciences Directorate, which funded the research.

Magnasco says this research is a starting point for understanding other systems that branch and rejoin, from river systems to neural networks and even malignant tumors. When a tumor becomes malignant it vascularizes, so this is extremely important for understanding how these things work.

On The Net:
National Science Foundation

DNA Replication Protein Plays Role In Cancer

The foundation of biological inheritance is DNA replication

This is a coordinated process in which DNA is copied at hundreds of thousands of different sites across the genome at the same time. If the copying mechanism doesn’t work properly, the result may be cells with missing or extra genetic material, a hallmark of the genomic instability seen in most birth defects and cancers.

Scientists at the University of North Carolina School of Medicine have discovered that a protein known as Cdt1, which is required for DNA replication, also has an important role in a later step of the cell cycle, mitosis. This is a possible explanation for why so many cancers possess not just genomic instability, but also more or fewer than the usual 46 DNA-containing chromosomes.

The new research was published online ahead of print by the journal Nature Cell Biology. It is the first to definitively show such a dual role for a DNA replication protein.

“This was such a surprise. We thought this protein’s job was to load proteins onto the DNA in preparation for replication,” said Jean Cook, PhD, associate professor of biochemistry and biophysics and pharmacology at the UNC School of Medicine and senior study author. “We had no idea it also had a night job, in a completely separate part of the cell cycle.”

The cell cycle is the series of events that happen in a cell leading to its growth, replication and division into two daughter cells. It has four distinct phases, in order: G1 (Gap 1), S (DNA synthesis), G2 (Gap 2) and M (mitosis). Cook’s research focuses on G1, when Cdt1 places proteins onto the genetic material to get it ready to be copied.

Cook ran a molecular screen to find other proteins that Cdt1 could be interacting with inside the cell. She expected to only find more entities that controlled replication but was surprised to discover one that was involved in mitosis. That protein, called Hec1 for “highly expressed in cancer,” helps to ensure that the duplicated chromosomes are divided equally into daughter cells during mitosis. Cook hypothesized that either Hec1 had a job in DNA replication that nobody knew about, or that Cdt1 was the one with the side business.

To look at these two possibilities, Cook partnered with Edward (Ted) D. Salmon, PhD, professor of biology and co-senior author who is a Hec1 expert. After letting Cdt1 do its replication job, they interfered with the protein’s function to see if it adversely affected mitosis. Using a high-powered microscope that records images of live cells, they showed that cells where Cdt1 function had been blocked did not undergo mitosis properly.

Once the researchers knew that Cdt1 was involved in mitosis, they wanted to pinpoint its role in that critical process. They combined genetic, microscopy and computational methods to demonstrate that without Cdt1, Hec1 fails to adopt the conformation inside the cells necessary to connect the chromosomes with the structure that pulls them apart into separate daughter cells.

Cook says cells that make aberrant amounts of Cdt1, as is seen in cancer, can experience problems in both replication and mitosis. One current clinical trial is actually trying to increase the amount of Cdt1 in cancer cells, hoping to push them from an already precarious position into a fatal one.

Breathing During Radiotherapy

(Ivanhoe Newswire) — Respiratory movement during radiotherapy makes it difficult to hit the right treatment target, and this in turn can lead to an under-dose of radiation to the tumor, or a potentially toxic over-dose to the surrounding healthy tissue. Getting this right is a real challenge for the radiotherapist, but new techniques are helping to deliver the correct dose to the right place.

Deep Inspiration Breath Hold (DIBH) can spare the heart when irradiating left-side breast cancer tumors.

“Unlike treatment under free breathing (FB), where the patient breathes normally, DIBH spares the heart by reducing its volume and movement in the field to be irradiated, and the lung expansion involved in holding breath leads to a decrease of relative lung volume which is irradiated,” Dr. Amira Ziouèche, a radiotherapy specialist from the Centre Léon Bérard, Lyon, France, was quoted as saying. “In effect, we can largely eliminate the problem of respiratory movement by using this technique, which allows us to reduce the volume of the healthy organ irradiated around the target volume while improving treatment precision. This is particularly important in breast cancer cases, where the life expectancy of most patients is long.”

In a prospective study, undertaken while she was working with Dr. Alice Mege at the Institut Sainte Catherine, Avignon, France, she showed that treating patients during DIBH, while they were holding their breath at between 60% to 80% of their maximum inspiratory (breathing-in) capacity, could spare their hearts and lungs from radiation without compromising the quality of their treatment.

They collected data on 31 patients treated with DIBH between October 2007 and June 2010 at the Institut Sainte Catherine. Each patient was her own case-control and underwent two CT scans, one in FB and the other in DIBH. The dose to healthy organs and targets was calculated based on these scans. Analysis showed that the heart mean dose decreased from 9 Gy in FB to 3.7 Gy in DIBH, and the maximum heart dose from 44.9 Gy to 24.7 Gy. The amount of radiation to the lung was also decreased with DIBH.
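Expressed as relative reductions (a back-of-the-envelope reading of the figures above, not a calculation from the study itself), the reported doses work out as follows:

```python
# Reported doses: heart mean 9 Gy (FB) -> 3.7 Gy (DIBH),
# heart maximum 44.9 Gy (FB) -> 24.7 Gy (DIBH).
mean_reduction_pct = (9.0 - 3.7) / 9.0 * 100
max_reduction_pct = (44.9 - 24.7) / 44.9 * 100

print(f"mean heart dose cut by {mean_reduction_pct:.0f}%")  # ~59%
print(f"max heart dose cut by {max_reduction_pct:.0f}%")    # ~45%
```

In other words, DIBH roughly halved the heart's radiation exposure in this cohort.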

“This is the largest study to date of the use of DIBH in patients undergoing radiotherapy for breast cancer,” Dr Ziouèche was quoted as saying. “It is an important result for breast cancer patients, where it can spare the volume of heart and lungs that are irradiated. Commonly, the margins around the tumor to be treated are increased in order to take movement into account. But this involves treating a larger area, some of it unnecessarily. The use of DIBH avoids this problem.”

In an earlier presentation, researchers compared results from two different kinds of CT scan to see which could more accurately estimate safety margins for radiotherapy treatment where breathing motion was involved. They compared the results from 3D and 4D treatment-planning CT scans of 50 patients with lung tumors and found that the more recent 4D scans provided better results in cases where large tumor motion was involved.

“The results from this study have shown that we can safely apply the ‘mid-ventilation’ concept, where we only irradiate part of the tumor trajectory instead of the entire volume in which the tumor resides during a breathing cycle. Thus we can reduce treatment volumes, with the result that patients have fewer complications,” Ms Fanneke van den Boomen, from the Catharina Hospital, Eindhoven, The Netherlands, was quoted as saying.

4D scanning equipment has only become available recently, and therefore the number of institutes using it is still limited. However, the researchers say, the results are so impressive that their hospital is now performing it routinely in cases where there is large tumor movement.

Source: 31st conference of the European Society for Radiotherapy and Oncology, May 2012

‘Horror Stories’ Surfacing of Deadly Colombian Mind-Control Drug

Experts are warning about a dangerous drug currently being dealt in Colombia that can reportedly rob individuals of their free will, making them vulnerable to criminals and attackers, or erase their memories.

The drug is known as scopolamine, and according to Gizmodo reporter Sam Biddle, the substance, which, like cocaine, is derived from a plant, “will turn you into an insane zombie and probably kill you.”

Scopolamine, otherwise known as “The Devil’s Breath,” was tested by the CIA as a truth serum during the Cold War, and it was also reportedly used by Nazi interrogators during World War II, he added.

Likewise, Beth Stebner of the Daily Mail said that “stories surrounding the drug are the stuff of urban legends, with some telling horror stories of how people were raped, forced to empty their bank accounts, and even coerced into giving up an organ.”

VICE correspondent Ryan Duffy interviewed a drug dealer operating out of the Colombian capital of Bogota, who told him that the drug was, in Stebner’s words, “frightening for the simplicity in which it can be administered” and prevents a person from remembering anything that happened to them while under the influence.

Duffy himself described his experiences with the drug on the VICE website.

“When VICE initially asked me to go down to Colombia to dig into this Scopolamine story… I had only a vague understanding of the drug, but the idea of a substance that renders a person incapable of exercising free will seemed like a recipe for hilarity and the YouTube hall of fame. I even spent a little time brainstorming the various ways I could transport some of it back to the states and had a pretty good list going of different ways to utilize it on my buddies,” he said.

“The original plan was for me to sample the drug myself to really get an idea of the effect it had on folks,” Duffy said. “The producer and cameraman had flown down to Bogota ahead of me to confirm some meetings and start laying down the groundwork. By the time I arrived a few days later, things had changed dramatically. Their first few days in the country had apparently been such a harrowing montage of freaked-out dealers and unimaginable horror stories about Scopolamine that we decided I was absolutely not going to be doing the drug. All elements of humor and novelty were rapidly stripped away during my first few days in town. After meeting only a couple people with firsthand experience, the story took a far darker turn than we ever could have imagined.”

Preventable Diseases Responsible For Most Under-5 Deaths

The overwhelming majority of deaths among children are the result of preventable infectious diseases, the authors of a new study published Friday in the journal The Lancet have reported.

According to Christian Nordqvist of Medical News Today, the international group of experts behind the study discovered that of the 7.6 million deaths among children younger than five years old in 2010, 18% had been caused by pneumonia and 14% were the result of premature birth-related complications. The third leading cause of death was diarrhea, and in all, 64% of deaths were either directly caused by or indirectly related to infectious causes.

Of those 7.6 million reported childhood fatalities, half occurred in Africa, and two thirds of those were from infectious causes such as malaria and AIDS, BBC News noted on Friday. Neonatal causes were the predominant cause of death among youngsters in Southeast Asia, while five countries — India, Nigeria, Pakistan, Democratic Republic of Congo and China — combined to account for nearly half of all deaths of kids under the age of five, the British news organization added.

Dr. Robert Black, the lead author of the study and a professor at the Johns Hopkins Bloomberg School of Public Health, called the numbers “staggering” in an interview with ANI.

“Of 7.6 million deaths globally in children younger than 5, 1.4 million or 18 percent were a result of pneumonia, 1.1 million or 14 percent were related to preterm birth complications and 0.8 million or 11 percent were a result of diarrhea,” he said.
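The percentages in the quote are consistent with the absolute counts; a quick check:

```python
# Figures from Dr. Black's quote: 7.6 million total under-5 deaths in 2010.
total = 7.6e6
causes = {
    "pneumonia": 1.4e6,                    # quoted as 18 percent
    "preterm birth complications": 1.1e6,  # quoted as 14 percent
    "diarrhea": 0.8e6,                     # quoted as 11 percent
}

for cause, deaths in causes.items():
    print(f"{cause}: {deaths / total:.0%}")  # 18%, 14%, 11%
```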

“Despite tremendous efforts to identify relevant data, the causes of only 2.7 percent of deaths in children younger than 5 years were medically certified in 2010,” Black added. “National health systems, as well as registration and medical certification of deaths, need to be promoted and strengthened to enable better accountability for the survival of children.”

The news wasn’t all bad, though, Jason Koebler of US News & World Report pointed out. Overall, the total number of deaths among these youngsters decreased from 9.6 million in 2000 to 7.6 million in 2010, and the under-5 mortality rate has been reduced to 57 per 1,000 live births, he said.

“I think our treatments of [these infectious] diseases have been great successes, and it speaks to our ability to scale up those interventions, but there’s a lot more things we can do to prevent some of the very frequent cases of pneumonia and diarrhea,” Hope Johnson, a researcher at Johns Hopkins and coauthor of the report, told Koebler.

First Satellite Study Of Tagged Manta Rays Exposes Hidden Habits

Conservationists from the University of Exeter in the UK, the Government of Mexico, and the Wildlife Conservation Society have completed a revolutionary study of manta rays, utilizing innovative satellite tracking devices.

The published study is the first to use satellite telemetry to track the ocean’s largest ray, which can grow to twenty-five feet, and ascertain the whereabouts of the endangered animal. The International Union for Conservation of Nature (IUCN) has listed the manta ray as “Vulnerable” due to the increased threat of accidental capture by fishers. Other threats include bait fishing, where fishers use the ray as bait for sharks, and demand for their gill rakers, which are used in traditional Chinese medicine.

The study was published Friday in the online journal PLoS ONE. Its authors include Brenden J. Godley of the University of Exeter; Dan W. Castellanos of the Wildlife Conservation Society; Francisco Remolina of the National Commission of Protected Areas, Cancun, Mexico; Lucy A. Hawkes of Bangor University, Bangor, United Kingdom; Matthew J. Witt of the University of Exeter; Rachel T. Graham of the Wildlife Conservation Society and the University of Exeter; and Sara Maxwell of the Marine Conservation Institute and the University of California-Santa Cruz.

Researchers placed satellite transmitters on the backs of four female rays, one male ray, and one juvenile ray over the course of thirteen days, just off the coast of Mexico’s Yucatan Peninsula.

“Almost nothing is known about the movements and ecological needs of the manta ray, one of the ocean’s largest and least-known species,” stated Dr. Rachel Graham, chief author on the study and director of WCS’s Gulf and Caribbean Sharks and Rays Program. She also said, “Our real-time data illuminate the previously unseen world of this mythic fish and will help to shape management and conservation strategies for this species.”

Manta rays, like other filter feeders such as whale sharks and baleen whales, glide through multitudes of tiny plankton, acquiring their nutrients from the small creatures.

“The satellite tag data revealed that some of the rays traveled more than 1,100 kilometers during the study period,” stated Dr. Matthew Witt of the University of Exeter’s Environment and Sustainability Institute, adding, “The rays spent most of their time traversing coastal areas plentiful in zooplankton and fish eggs from spawning events.”

The manta ray, also known as the devilfish, is harmless to humans, although its bat-like appearance gives it a sinister look. It has no stinger, unlike the more commonly known stingray, and has the highest brain-to-body ratio of all known rays and sharks. Typically, the manta ray gives live birth to one or two pups every one to two years.

The researchers found that the manta rays preferred to spend the majority of their time within 200 miles of the coastline, in Mexico’s territorial waters. Unfortunately, only 11.5 percent of these frequented areas lie in marine protected zones, and most of the areas are actually major shipping routes, raising concern that ships could strike the manta rays. The rays are decreasing in number in tropical ocean areas around the world, including the Caribbean.

Dr. Howard Rosenbaum, Director of WCS’s Ocean Giant Program, stated, “Studies such as this one are critical in developing effective management of manta rays, which appear to be declining worldwide.”

New Anti-Obesity Drug Approved By Expert Panel

A panel voted 18 to 4 in favor of approving the new anti-obesity drug Lorcaserin for use on the drug market.

The drug works to control the appetite through receptors in the brain, and a study showed it helped nearly half of participants lose up to five percent of their body weight.

The Food and Drug Administration is scheduled to decide on June 27 whether the drug should be approved for use in the U.S.

Lorcaserin was rejected by the Endocrinologic and Metabolic Drugs Advisory Committee back in 2010 due to concerns that it caused breast tumors in rats.

However, the effects seen in that study did not appear in trials on overweight and obese humans, and the panel’s latest vote showed confidence that its benefits outweigh the risks.

“The advisory committee’s positive vote supports our belief in Lorcaserin as a potential new treatment option for the medical management of overweight and obesity,” Jack Lief, Arena’s president and chief executive officer, said in a statement.

“We will continue to work with the FDA as the agency completes its review of the lorcaserin new drug application.”

The most common side effects for those taking a 10-milligram dose of Lorcaserin were headaches, dizziness, nausea, fatigue and dry mouth.

“It is not very impressive, the weight loss, but it is better than a placebo,” Michael Aziz, an internist at Lenox Hill Hospital in New York City, said in a statement. “However, when people stop the drug they gain the weight back.”

According to the U.S. Centers for Disease Control and Prevention, about two-thirds of adults in this country are overweight or obese.

“There is a need for a drug that can address the obesity issue,” Aziz added. “But we are really not covering the root of the problem which is lifestyle changes and eating right. Many people are just looking for a quick fix.”

Currently, there are few weight-loss drugs approved in the U.S. Xenical, one approved drug, works by preventing the body from absorbing fat, but it has a tendency to cause gastrointestinal side effects like oily, loose stools.

FDA advisors also urged the approval of Qnexa back in February; a decision on that drug is also expected in mid-July. Studies have shown dieters could lose up to 10 percent of their weight when taking Qnexa, along with regular exercise and a healthy diet.

If Lorcaserin is approved, an Arena spokeswoman said, the U.S. division of the Japanese pharmaceutical company Eisai has exclusive rights to commercialize it in the U.S.

Shares of Arena Pharmaceuticals, the maker of the pill, nearly doubled in value after the panel’s decision.  Arena’s stock price rose 70 percent to $6.17 in morning trade on the NASDAQ.

America Consumes 80% World’s Painkillers

According to a congressional testimony by the American Society of Interventional Pain Physicians, 80 percent of the world’s painkillers are consumed in the U.S.

BBC News reports that Americans consume enough painkillers to give each citizen 64 Percocet or Vicodin pills.

The report said that prescription drug abuse leads to 14,800 deaths a year, more than heroin and cocaine combined.

The number of people taking painkillers has increased 600 percent from 10 years ago. Also, police reports across the country show an increase in crimes committed by people addicted to oxycodone and hydrocodone, which are key ingredients in most prescription painkillers.

Long Island, New York, pharmacist Howard Levine told the BBC that “we’ve become a society of wusses.”

He said that he stopped carrying all of the major addictive prescription drugs after he was robbed by addicts who were looking for their fix.

Rich Elassar, a 36-year-old who once owned a successful business in New Jersey, said that when he turned to using painkillers, he was popping 90 Percocets a day.

When his money ran out, Elassar told BBC that he turned to illegal activities to feed his addiction.

He admitted to the news agency that he had walked into a bank and handed the teller a note demanding cash. As he was fleeing the scene, he looked in his rear-view mirror and saw salvation from his addiction.

“I looked in my rear-view mirror and I saw the cops, I saw their lights flashing and I really, really, really remember thinking, well this is it. I’m going to get clean now,” he told BBC.

However, despite spending three years in prison and becoming a part of a drug recovery program, he still couldn’t kick the habit of taking oxycodone.

Elassar said he has relapsed three times since his release, and takes medicine every day to keep away the withdrawal symptoms.

He told BBC that he’s been clean since June, but is still unsure whether he’s kicked his addiction for good.

“I think this is definitely it. I mean, I say think and I pray to God every day that this is it,” Elassar told the news agency in an interview.