What Part Do Sleep Spindles Play In Emotional Memory: Ambien Study

Alan McStravick for redOrbit.com – Your Universe Online

Once asleep, we all experience five different stages in our sleep cycle. That being said, one would be hard-pressed to find anyone who is familiar with any stage besides REM sleep, or rapid eye movement sleep, which is typically characterized by quick and random movements of the eyes combined with muscle paralysis.

During REM, neural activity in the brain intensifies showing a similarity to the activity associated with being awake. This is the stage where most people experience vivid dream activity.

In a recent study conducted by the University of California’s Riverside and San Diego campuses, researchers were able to delve further into the known sleep mechanisms to identify how the brain consolidates emotional memory. Additionally, they were able to determine how a popular prescription sleep medication can play an adverse role in the recollection of and response to negative memories.

The team’s study, entitled ‘Pharmacologically Increasing Sleep Spindles Enhances Recognition for Negative and High-arousal Memories,’ could advance research aimed at individuals who suffer long-term insomnia as a result of posttraumatic stress disorder (PTSD). These individuals are often prescribed Ambien to help them sleep.

Sara C. Mednick, assistant professor of psychology at UC Riverside, along with UC San Diego psychologists Erik J. Kaestner and John T. Wixted, was able to determine how sleep spindles are important for emotional memory. Sleep spindles are characterized by short bursts of brain activity that last a second or less during a specific stage of sleep.

Previous research published by Mednick had clearly shown how sleep spindles played a critical role in helping to transition information from short-term to long-term memory in the hippocampus. That research, presenting potential benefits for sufferers of Alzheimer’s disease and other age-related dementias, showed how Ambien might be an effective method of manipulating the brain for improved memory through pharmacology.

“We know that sleep spindles are involved in declarative memory – explicit information we recall about the world, such as places, people and events,” Mednick explained.

This current study, published in the Journal of Cognitive Neuroscience, is the first study of its kind to consider how sleep spindles play an integral part in emotional memory. All previous research had looked only at REM.

By utilizing two of the more commonly prescribed sleep aids — Ambien and Xyrem — the team was able to separate the effects of sleep spindles and REM sleep on the recall of emotional memories. Their study showed REM had no discernible effect on emotional memory, contrary to what had previously been thought.

To conduct their study, the team distributed Ambien, Xyrem and a placebo to their study group, comprising 28 men and women aged 18 to 39, each considered a normal sleeper. The team presented a series of images to the participants for one second each, both before and after a supervised nap. Each of the participants was able to recall more of the images that had negative or highly arousing content after having taken Ambien. The team contends this may suggest the brain favors consolidation of negative memories.

“I was surprised by the specificity of the results, that the emotional memory improvement was specifically for the negative and high-arousal memories, and the ramifications of these results for people with anxiety disorders and PTSD,” Mednick said. “These are people who already have heightened memory for negative and high-arousal memories. Sleep drugs might be improving their memories for things they don’t want to remember.”

Despite the US Department of Veterans Affairs and the Department of Defense recommending against the use of benzodiazepines for the treatment of PTSD, their use increased among both men and women with PTSD between 2003 and 2010. Benzodiazepines have an effect on sleep similar to that of Ambien.

Also noted in the study is that the US Air Force prescribes Ambien as a “no-go pill” to help aircrews calm down after long missions that required the use of stimulants.

“In light of the present results, it would be worthwhile to investigate whether the administration of benzodiazepine-like drugs may be increasing the retention of highly arousing and negative memories, which would have a countertherapeutic effect,” they wrote. “Further research on the relationship between hypnotics and emotional mood disorders would seem to be in order.”

The Precarious Nature Of Global Ocean Chemistry

April Flowers for redOrbit.com – Your Universe Online
The oceans of the past were quite different from the ones we see today. Ocean temperatures are increasing due to global warming, and these increases are harming marine food webs. Coastal dead zones are also being created by run-off from fertilizers.
An international team of researchers, led by McGill University, has completed the first global study of changes that occurred in the nitrogen cycle, a crucial component of ocean chemistry, at the end of the last ice age. The findings of this study, published in Nature Geoscience, confirm oceans are good at balancing the nitrogen cycle on a global scale. However, the results also show this balancing act is a slow process that may take centuries, or even millennia, raising concerns about the effects of the scale and speed of the changes occurring in today’s oceans.
“For the first time we can quantify how oceans responded to slow, natural climate warming as the world emerged from the last ice age,” says Prof. Eric Galbraith from McGill University’s Department of Earth and Oceanic Sciences. “And what is clear is that there is a strong climate sensitivity in the ocean nitrogen cycle.”
The researchers say the nitrogen cycle is an essential component of the global metabolism, comparable to the proteins that are essential to human health. The nitrogen in the ocean is kept in balance by marine bacteria through a complicated cycle that keeps the ocean healthy, much like proteins are carried by the blood and circulate through the body. Microscopic organisms at the base of the marine food chain, called phytoplankton, fix nitrogen in the shallow, sunlit waters of the ocean. As the phytoplankton die and sink, nitrogen is eliminated in dark, oxygen-poor pockets of the deep oceans, in a process called denitrification.
The research team gathered sediment from the ocean floor in different areas of the world. The samples were used to confirm that as the ice sheets started melting and the climate warmed up at the end of the last ice age — 18,000 years ago — the marine nitrogen cycle started to accelerate. By about 8,000 years ago, the ocean had stabilized itself in a new, warmer state, in which the overall nitrogen cycle was running faster. The team isn’t sure how long it will take for marine ecosystems to adapt given the dramatic rate of change in the current ocean nitrogen cycle.
“We are changing the planet in ways we are not even aware of,” says Galbraith. “You wouldn’t think that putting carbon dioxide into the atmosphere would change the amount of nitrogen available to fish in the ocean, but it clearly does. It is important to realize just how interconnected everything is.”

MIT Software Helps People With Social Awkwardness

Watch the video “MIT Automated Coach Helps With Social Interactions”

Lee Rannals for redOrbit.com – Your Universe Online

MIT researchers have developed new software that allows people to practice their interpersonal skills to help them better nail that interview.

The National Institute of Mental Health says about 15 million adults in the US have social phobias. For some people, such as those suffering from Asperger’s syndrome, it is difficult to make eye contact and react appropriately to social cues. This is where MIT’s new software, called My Automated Conversation coacH (MACH), comes in to help.

The software simulates face-to-face conversations and provides users with feedback using a computer-generated onscreen face, along with speech, facial and behavioral analysis and synthesis tools.

“Interpersonal skills are the key to being successful at work and at home,” said MIT Media Lab doctoral student M. Ehsan Hoque, who led the research. “How we appear and how we convey our feelings to others defines us. But there isn’t much help out there to improve on that segment of interaction.”

Hoque and his colleagues performed randomized tests with 90 MIT juniors who volunteered for the research. The participants were randomly divided into three groups, and each group participated in two simulated job interviews. One group received no further intervention, another performed a practice session with the MACH simulated interviewer but without receiving feedback, and the final group used MACH and saw videos of themselves accompanied by an analysis of how much they smiled, how well they maintained eye contact, how well they modulated their voices, and how often they used filler words such as “like” or “umm.”
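
One of the feedback measures described above, the filler-word count, is easy to picture in code. The sketch below is a minimal, hypothetical illustration: the filler-word list, transcript and percentage scoring are assumptions made for the example, not MACH’s actual analysis pipeline.

```python
import re

# Hypothetical filler-word list; MACH's real analysis is not reproduced here.
FILLERS = {"like", "umm", "uh", "basically"}

def filler_stats(transcript):
    """Return (filler count, total words, filler percentage) for a transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    fillers = sum(w in FILLERS for w in words)
    return fillers, len(words), round(100 * fillers / max(len(words), 1), 1)

text = "So, umm, I basically led the project, like, end to end."
print(filler_stats(text))  # (3, 11, 27.3)
```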

Career counselors analyzed the results and found that members of the third group showed statistically significant improvement, while there was no significant change in the other two groups.

“While it may seem odd to use computers to teach us how to better talk to people, such software plays an important [role] in more comprehensive programs for teaching social skills [and] may eventually play an essential step in developing key interpersonal skills,” says Jonathan Gratch, an associate professor of computer science and psychology at the University of Southern California who was not involved in this research.

“Such programs also offer important advantages over the human role-players often used to teach such skills. They can faithfully embody a specific theory of pedagogy, and thus can be more consistent than human role-players.”

Computer software could be one solution to help people with Asperger’s, but another method is being developed to train kids with autism spectrum disorder (ASD). Researchers wrote back in March about how a humanoid robot could help children with ASD learn to coordinate their attention with other people and objects.

Blame Menopause On Men, Says One Evolutionary Biologist

Michael Harper for redOrbit.com — Your Universe Online

Menopause has always been a stage of life experienced exclusively by women, but a new study finds that its existence could be blamed entirely on men. A professor from McMaster University in Canada made waves this week when he presented evidence that from an evolutionary point of view, women only stopped reproducing at an older age because men always chose younger mates. Once outside of the mating pool, older women were no longer protected by natural selection, and menopause emerged. The study also finds that if women had historically chosen younger mates, menopause might have been experienced exclusively by men instead.

The study was published in this week’s PLOS Computational Biology.

Evolutionary geneticist Rama Singh authored this report and claims menopause isn’t something that was evolutionarily developed but rather an act of natural selection gone wrong.

“In a sense it is like aging, but it is different because it is an all-or-nothing process that has been accelerated because of preferential mating,” said Singh in a statement. “Menopause is believed to be unique to humans, but no one had yet been able to offer a satisfactory explanation for why it occurs.”

As women get older, they generally stop reproducing. Historically, this can be traced to our earliest ancestors, who were concerned with peopling the species and moving humanity forward. Because men were choosing younger women with whom to reproduce instead of older women, natural selection began to favor the younger women, protecting their fitness to keep them in peak reproductive capacity. However, once a woman grows past this period, natural selection allows a wave of mutations and hormonal changes that kick-start menopause and lead to a host of other health problems.

If it weren’t for this slowdown in reproduction, Singh says, menopause would never take effect in women.

Using a computer model and simulations, Singh and team showed how a male’s preference to mate with younger females can actually build up these mutations, which are later responsible for the decrease in female fertility and, ultimately, menopause.

“If women were reproducing all along, and there were no preference against older women, women would be reproducing like men are for their whole lives.”

Yet this study is concerned with the historical reproductive choices of men to determine why menopause developed in females. Social structures have changed dramatically since the days when Homo sapiens roamed the plains of northern Africa, and nowadays women have far more of a choice in deciding when they start their families. As such, women are giving birth later than ever, and this, says Singh, could mean that menopause may be pushed back even further into a woman’s life.

This is a bold theory, and with any new idea there will be those who disagree. Dr. Maxwell Burton-Chellew, an evolutionary biologist in the department of zoology at the University of Oxford, spoke with the BBC and said Singh may be looking at the problem of menopause from the wrong angle.

“The human male preference for younger females is likely to be because older females are less fertile,” explained Dr. Burton-Chellew in his interview with the BBC.

“I think it makes more sense to see the human male preference for younger females largely as an evolved response to the menopause, and to assume that ancestral males would have been wise to mate with any females that could produce offspring.”

New Data Supports ‘Cold’ Model Of Dark Matter

John P. Millis, PhD for redOrbit.com — Your Universe Online

Scientists continue the search to identify the nature of dark matter that seems to pervade the Universe. Since by its very nature it does not interact well — or perhaps at all — with electromagnetic radiation, direct detection of this mysterious form of matter has proven elusive.

Despite this obstacle, astronomers have become convinced of its existence because of how it can be ‘seen’ to interact with normal, luminous matter. To begin closing in on an understanding of this mysterious substance, scientists from England, Taiwan and Japan have used the Prime Focus Camera (Suprime-Cam) aboard the Subaru Telescope to examine the distribution of dark matter in fifty galaxy clusters — the largest structures in the Universe.

According to Graham Smith from the University of Birmingham, England, and co-leader of the research team, “A galaxy cluster is like a huge city viewed from above during the night. Each bright city light is a galaxy, and the dark areas between the lights that appear to be empty during the night are actually full of dark matter. You can think of the dark matter in a galaxy cluster as being the infrastructure within which the galaxies live.”

Since there are various models of dark matter — typically known as Hot, Warm, and Cold — there are different predictions as to how the dark matter would be distributed throughout a cluster. Hot dark matter models, for instance, would expect the particles to be rather evenly distributed, since their high energies mean they would move too fast to really clump together gravitationally. Conversely, cold dark matter particles would be more massive and slower moving, and as a result would more easily clump together.

For this study, the team used a phenomenon known as gravitational lensing to determine where the dark matter was distributed. Since most dark matter theories contend that the particles only interact with other matter via the gravitational force, there is no obvious way to measure the dark matter directly. Instead, scientists use telescopes to search for distortions of distant galaxies caused by the gravitational warping — or lensing — of the light from these objects.

Lead author Nobuhiro Okabe, from the Academia Sinica, Taiwan, explains, “The Subaru Telescope is a fantastic instrument for gravitational lensing measurements. It allows us to measure very precisely how the dark matter in galaxy clusters distorts light from distant galaxies and gauge tiny changes in the appearance of a huge number of faint galaxies.”

For each observation, the researchers mapped the distribution of dark matter to derive a concentration parameter, a measure of how centrally concentrated a galaxy cluster’s density is. Comparison of the dark matter maps of all 50 galaxy clusters shows a range of values for the concentration parameter, but what was most telling was how that parameter converged when the researchers used a technique known as data stacking.

Instead of analyzing an individual source — in this case a single galaxy cluster — a group of similar sources is analyzed together, as if the data all came from a single object. The downside is that the individual source information is lost; on the other hand, general trends in the data that may be too subtle to characterize for an individual source are revealed in the stacked analysis.
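
As a rough illustration of the stacking idea, the sketch below averages many noisy, centrally concentrated radial profiles so the shared trend emerges while per-cluster detail washes out. The profile shape, noise level and cluster count are invented for the example; this is not the team’s actual analysis.

```python
import numpy as np

# Illustrative stacking: average many noisy radial profiles so the shared
# trend survives while individual detail washes out. Shapes and noise levels
# are assumptions made for this example, not real survey data.
rng = np.random.default_rng(0)
radii = np.linspace(0.1, 1.0, 50)        # normalized distance from cluster center

def noisy_profile():
    # a centrally concentrated profile plus heavy per-cluster noise
    signal = 1.0 / (radii * (1.0 + radii) ** 2)
    return signal + rng.normal(0.0, 2.0, radii.size)

profiles = np.stack([noisy_profile() for _ in range(50)])   # 50 "clusters"
stacked = profiles.mean(axis=0)                             # the stacked profile

print(stacked[:3])   # far closer to the underlying trend than any single profile
```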

What the team found was that, on average, the density of the dark matter was greatest near the centers of the galaxy clusters and gradually became more diffuse moving out toward the clusters’ edges. This is consistent with a cold dark matter model, which, along with other evidence, has been the leading candidate theory for some years. The problem is that experimental particle physicists have yet to identify a candidate particle that fits the cold dark matter mold.

The initial indication is that cold dark matter models continue to lead the way, yet we still lack a good candidate particle. So, where to go from here? “We don’t stop here,” noted Smith. “For example, we can improve our work by measuring dark matter density on even smaller scales, right in the center of these galaxy clusters. Additional measurements on smaller scales will help us to learn more about dark matter in the future.”

Light-Carved ‘Nano-Volcanoes’ Could Help With New Drug Delivery Technologies

North Carolina State University
Researchers from North Carolina State University have developed a method for creating “nano-volcanoes” by shining various colors of light through a nanoscale “crystal ball” made of a synthetic polymer. These nano-volcanoes can store precise amounts of other materials and hold promise for new drug-delivery technologies.
The researchers create the nano-volcanoes by placing spherical, transparent polymer nanoparticles directly onto the flat surface of a thin film. They then shine ultraviolet light through the transparent sphere, which scatters the light and creates a pattern on the thin film. The thin film is made of a photoreactive material that undergoes a chemical change wherever it has been struck by the light. The researchers then submerge the thin film in a liquid solution that washes away the parts of the film that were exposed to light. The material that remains is shaped like a nanoscale volcano.
“We can control the pattern of light by changing the diameter of the nanoparticle spheres, or by changing the wavelength — or color — of the light that we shine through the spheres,” says Xu Zhang, a doctoral student in mechanical and aerospace engineering at NC State and lead author of a paper describing the work. “That means we can control the shape and geometry of these structures, such as how big the cavity of the nano-volcano will be.”
The researchers developed a highly accurate computer model that predicts the shape and dimensions of the nano-volcanoes based on the diameter of the nanoscale sphere and the wavelength of light.
Because these structures have precisely measured hollow cores, and precisely measured openings at the “mouth” of the nano-volcanoes, they are good candidates for drug-delivery mechanisms. The size of the core would allow users to control the amount of the drug a nano-volcano would store, while the size of the opening at the top of the nano-volcano could be used to regulate the drug’s release.
“The materials used in this process are relatively inexpensive, and the process can be easily scaled up,” says Dr. Chih-Hao Chang, an assistant professor of mechanical and aerospace engineering at NC State and co-author of the paper. “In addition, we can produce the nano-volcanoes in a uniformly patterned array, which may also be useful for controlling drug delivery.”
Chang’s team is now working to improve its understanding of the release rate from the nano-volcanoes, such as how quickly nanoparticles of different sizes will “escape” from nano-volcanoes with different-sized mouths. “That’s essential information for drug-delivery applications,” Chang says.
“It’s exciting to take our understanding of how light is scattered by particles and apply it to nanolithography in order to come up with something that could actually help people.”
The paper, “Three-Dimensional Nanolithography Using Light Scattering from Colloidal Particles,” was published online June 12 in ACS Nano. Lead author of the paper is NC State Ph.D. student Xu Zhang. Co-authors are Chang and NC State master’s student Jonathan Elek. The research was supported by a NASA Early Career Faculty Award and the National Science Foundation’s ASSIST Engineering Research Center at NC State.

Nanocrystals With Uniform Sizes, Shapes Produced Using Polymer Structures As ‘Nanoreactors’

Georgia Institute of Technology
Using star-shaped block co-polymer structures as tiny reaction vessels, researchers have developed an improved technique for producing nanocrystals with consistent sizes, compositions and architectures — including metallic, ferroelectric, magnetic, semiconductor and luminescent nanocrystals. The technique relies on the length of polymer molecules and the ratio of two solvents to control the size and uniformity of colloidal nanocrystals.
The technique could facilitate the use of nanoparticles for optical, electrical, optoelectronic, magnetic, catalysis and other applications in which tight control over size and structure is essential to obtaining desirable properties. The technique produces plain, core-shell and hollow nanoparticles that can be made soluble either in water or in organic solvents.
“We have developed a general strategy for making a large variety of nanoparticles in different size ranges, compositions and architectures,” said Zhiqun Lin, an associate professor in the School of Materials Science and Engineering at the Georgia Institute of Technology. “This very robust technique allows us to craft a wide range of nanoparticles that cannot be easily produced with any other approaches.”
The technique was described in the June issue of the journal Nature Nanotechnology. The research was supported by the Air Force Office of Scientific Research.
The star-shaped block co-polymer structures consist of a central beta-cyclodextrin core to which multiple “arms” — as many as 21 linear block co-polymers — are covalently bonded. The star-shaped block co-polymers form the unimolecular micelles that serve as a reaction vessel and template for the formation of the nanocrystals.
The inner blocks of the unimolecular micelles are poly(acrylic acid) (PAA), which is hydrophilic and allows metal ions to enter them. Once inside the tiny reaction vessels made of PAA, the ions react with the PAA to form nanocrystals, which range in size from a few nanometers up to a few tens of nanometers. The size of the nanoparticles is determined by the length of the PAA chain.
The block co-polymer structures can be made with hydrophilic inner blocks and hydrophobic outer blocks — amphiphilic block co-polymers — in which case the resulting nanoparticles can be dissolved in organic solvents. However, if both inner and outer blocks are hydrophilic — all-hydrophilic block co-polymers — the resulting nanoparticles will be water-soluble, making them suitable for biomedical applications.
Lin and collaborators Xinchang Pang, Lei Zhao, Wei Han and Xukai Xin found that they could control the uniformity of the nanoparticles by varying the volume ratio of two solvents — dimethylformamide and benzyl alcohol — in which the nanoparticles are formed. For ferroelectric lead titanate (PbTiO3) nanoparticles, for instance, a 9-to-1 solvent ratio produces the most uniform nanoparticles.
The researchers have also made iron oxide, zinc oxide, titanium oxide, cuprous oxide, cadmium selenide, barium titanate, gold, platinum and silver nanocrystals. The technique could be applicable to nearly all transition or main-group metal ions and organometallic ions, Lin said.
“The crystallinity of the nanoparticles we are able to create is the key to a lot of applications,” he added. “We need to make them with good crystalline structures so they will exhibit good physical properties.”
Earlier techniques for producing polymeric micelles with linear block co-polymers have been limited by the stability of the structures and by the consistency of the nanocrystals they produce, Lin said. Current fabrication techniques include organic solution-phase synthesis, thermolysis of organometallic precursors, sol-gel processes, hydrothermal reactions and biomimetic or dendrimer templating. These existing techniques often require stringent conditions, are difficult to generalize, include a complex series of steps, and can’t withstand changes in the environment around them.
By contrast, the nanoparticle production technique developed by the Georgia Tech researchers is general and robust. The nanoparticles remain stable and homogeneous for long periods of time — as much as two years so far — with no precipitation. Such flexibility and stability could allow a range of practical applications, Lin said.
“Our star-like block co-polymers can overcome the thermodynamic instabilities of conventional linear block co-polymers,” he said. “The chain length of the inner PAA blocks dictates the size of the nanoparticles, and the uniformity of the nanoparticles is influenced by the solvents used in the system.”
The researchers have used a variety of star-like di-block and tri-block co-polymers as nanoreactors. Among them are poly(acrylic acid)-block-polystyrene (PAA-b-PS) and poly(acrylic acid)-block-poly(ethylene oxide) (PAA-b-PEO) di-block co-polymers, and poly(4-vinylpyridine)-block-poly(tert-butyl acrylate)-block-polystyrene (P4VP-b-PtBA-b-PS), poly(4-vinylpyridine)-block-poly(tert-butyl acrylate)-block-poly(ethylene oxide) (P4VP-b-PtBA-b-PEO), polystyrene-block-poly(acrylic acid)-block-polystyrene (PS-b-PAA-b-PS) and polystyrene-block-poly(acrylic acid)-block-poly(ethylene oxide) (PS-b-PAA-b-PEO) tri-block co-polymers.
For the future, Lin envisions more complex nanocrystals with multifunctional shells and additional shapes, including nanorods and so-called “Janus” nanoparticles composed of two dissimilar materials in a biphasic geometry.

New Layer To Human Cornea Discovered By Researchers

April Flowers for redOrbit.com – Your Universe Online

A previously undetected layer in the cornea, the clear window at the front of the human eye, has been discovered by scientists at The University of Nottingham. This new layer, called Dua’s layer after Professor Harminder Dua, who discovered it, could help surgeons to dramatically improve outcomes for patients undergoing corneal grafts and transplants.

“This is a major discovery that will mean that ophthalmology textbooks will literally need to be re-written. Having identified this new and distinct layer deep in the tissue of the cornea, we can now exploit its presence to make operations much safer and simpler for patients,” said Dua, Professor of Ophthalmology and Visual Sciences.

“From a clinical perspective, there are many diseases that affect the back of the cornea which clinicians across the world are already beginning to relate to the presence, absence or tear in this layer.”

The cornea is a clear protective lens on the front of the eye through which light enters. Before this study, scientists believed the cornea to be composed of five layers. These layers, from front to back, are the corneal epithelium, Bowman’s layer, the corneal stroma, Descemet’s membrane and the corneal endothelium. The new layer, described in the journal Ophthalmology, is located at the back of the cornea between the corneal stroma and Descemet’s membrane. The whole cornea is approximately 550 microns, or 0.5 mm, thick, with the newly discovered layer making up about 15 microns. Despite its relative thinness, Dua’s layer is incredibly tough, strong enough to withstand one and a half to two bars of pressure.

To prove the existence of the layer, the researchers simulated human corneal transplants and grafts using eyes donated to eye banks for research purposes. They injected tiny bubbles into the cornea, gently separating the different layers. The separated layers were then subjected to electron microscopy, which allowed the scientists to study them at many thousand times their actual size.

By understanding the location and properties of Dua’s layer, surgeons will be better able to identify where in the cornea these bubbles are occurring. Because a bubble can be injected next to Dua’s layer, the layer’s strength makes the tissue less prone to tearing, which means a better outcome for the patient.

This discovery will advance scientists’ and doctors’ understanding of a number of diseases of the cornea, including acute hydrops, descemetocele and pre-Descemet’s dystrophies.

The research team suggests that corneal hydrops, a bulging of the cornea caused by fluid build-up that occurs in patients with keratoconus (conical deformity of the cornea), is caused by a tear in Dua’s layer, through which water from inside the eye rushes in and causes waterlogging.

Adults Who Are Survivors Of Childhood Cancer Have High Risk For Disease

Lawrence LeBlond for redOrbit.com – Your Universe Online

A new study by researchers at St. Jude Children’s Research Hospital has found that surviving childhood cancer is associated with a high prevalence of chronic health conditions in adulthood. In an analysis of more than 1,700 adult survivors of childhood cancer, the researchers found a high percentage of survivors with one or more chronic health conditions.

The study, to be published June 12 in JAMA, found that 98 percent of the 1,713 survivors analyzed had at least one chronic health condition, hundreds of which were diagnosed through clinical screenings in the long-term, comprehensive health study. Furthermore, the study found that 80 percent of these survivors had at least one chronic health condition by age 45. Conditions included, but were not limited to, new cancers, heart problems, abnormal lung function and neurocognitive dysfunction.

The evidence establishes the importance of life-long clinical health screenings for this high-risk population, said the researchers.

“These findings are a wake-up call to health care providers and remind survivors to be proactive about their health,” said co-first author Melissa Hudson, MD, director of the St. Jude Division of Cancer Survivorship.

In the study, abnormal lung function was diagnosed in 65 percent of the survivors who had a known risk for lung problems due to childhood cancer treatment. Endocrine problems were diagnosed in 61 percent of the at-risk survivors. Heart abnormalities were diagnosed in 56 percent, and neurocognitive impairment was diagnosed in 48 percent of childhood cancer survivors.

“Many were identified early, often before symptoms developed, when interventions may have their greatest impact,” Hudson said.

As part of the study, childhood cancer survivors were brought back to St. Jude’s — where they were treated as children — to undergo extensive medical tests and assessments. Other studies of adult survivors of childhood cancers relied largely on self-reporting or cancer registry data, which may have resulted in substantial underestimation of health problems among survivors.

Hudson said that tailoring treatments to reduce exposure to chemotherapy agents and radiation would go a long way to help minimize the risk of chronic health conditions later in life.

Regular medical checkups should also occur to discover issues as early as possible, which could offer patients an incentive to follow healthy lifestyles to avoid or slow the progression of some chronic conditions identified in the study, said Kristen Ness, PhD, an associate member of the St. Jude Epidemiology and Cancer Control department and co-first author of the research.

“Obesity and some types of heart disease are examples of chronic conditions where survivors may be able to mitigate their risk and improve their long-term health by making careful lifestyle choices, such as not smoking, eating a diet low in fat and sugar and engaging in moderate physical activity for 30 minutes a day, five days a week,” added Ness.

The research, part of the St. Jude Lifetime Cohort Study (St. Jude LIFE), included survivors of leukemia, lymphoma and tumors of the brain, bone and other organs. For half of the survivors in the study, their cancer diagnosis was more than 25 years ago. Half of the survivors were also younger than 32 years old when the assessment was completed.

Hudson said that the relative youth of the participants made the prevalence of neurocognitive and neurosensory deficits, heart abnormalities, lung and other problems particularly striking. “The data may indicate a pattern of accelerated or premature aging,” she added.

“In summary, this study provides global and age-specific estimates of clinically ascertained morbidity in multiple organ systems in a large systematically evaluated cohort of long-term survivors of childhood cancer. The percentage of survivors with 1 or more chronic health conditions prevalent in a young adult population was extraordinarily high. These data underscore the need for clinically focused monitoring, both for conditions that have significant morbidity if not detected and treated early, such as second malignancies and heart disease, and also for those that if remediated can improve quality of life, such as hearing loss and vision deficits,” the researchers wrote.

This research reflects ongoing efforts to help the country’s growing population of childhood cancer survivors and their healthcare providers understand and manage cancer-related risks. An estimated 395,000 childhood cancer survivors live in the US. With long-term survival of pediatric cancer patients now surpassing 80 percent, the survivor community will continue to grow, the research concludes.

Stealth Electric Motorcycle Brainchild Of Military And Private Company

Michael Harper for redOrbit.com — Your Universe Online
The trailblazing company Zero Motorcycles makes impressive bikes with a unique twist — they’re all electric. These motorcycles have all the range you’d expect from a traditional bike. They’re also not scared of a few challenges, such as taking on the motocross world. Now, however, the company is bragging about a new product they’ve developed with a very special partner. The new bike is able to take off quickly from a standstill, start and run with no engine noise and run through up to a meter of water. They’re calling it the Zero MMX Military Motorcycle, and together with US Special Operations Forces (SOF) they’ve developed a bike meant to operate in perpetual stealth mode while giving riding soldiers access to important operative controls on the handlebars.
“It was a very rewarding experience for the Zero team to go through such an exacting development process. The military needed a very specific set of core features on the MMX, and we were incredibly thankful to work side-by-side with them to deliver such a unique product,” said Abe Askenazi, Chief Technology Officer for Zero Motorcycles in a statement on their website.
“The great news for our civilian customers is that we made the decision to incorporate into our 2013 MX, FX and XU retail motorcycles virtually all of the powertrain enhancements associated with satisfying this project’s stringent military requirements. Our 2013 product is truly ‘military grade’!”
The majority of the stealth benefits of this motorcycle can be attributed to Zero’s electric engine. Combustion motorcycles are notoriously loud when starting and racing down a highway or rough terrain. With the electric engine, the Zero MMX can start up and run with almost no engine sound at all. In fact, the majority of the sound emitted from this bike will come from road noise and navigating through wooded terrain. Zero started with its electric platform and made some tweaks to create a military-grade product, starting with the battery pack. The bike comes in two models, the ZF2.8 and ZF5.7. The first model ships with one battery module and the second with two, either of which can be quickly changed in less than a minute regardless of the state of charge. Each battery offers about an hour of drive time.
The Zero MMX also offers integrated wiring to accommodate infrared systems at either the front or rear of the vehicle as well as a specialized dash just for military use. They also ship in flat black to better hide in dimly lit situations.
This isn’t the first tactical bike Zero has produced, however. The company also offers a model built specifically for police officers, which has much of the same functionality. The Zero Motorcycles Police Fleet can operate quietly, rapidly accelerate from a standstill, and even reach a top speed of 95 miles per hour. In a demo video, a police officer says his department loves using the motorcycles because they’re quiet enough to perform surveillance operations as well as patrol local neighborhoods and parks without disrupting citizens.
Don’t expect to find the MMX Military Motorcycle in stores, but there is a publicly available version, the Zero MX, which retails for $9,495.00.

Driving In Traffic Nearly As Stressful As Skydiving, Says MIT Study

[ Watch The Video: Audi’s Road Frustration Index Test ]

Michael Harper for redOrbit.com — Your Universe Online

Driving in traffic can be stressful, and even more so depending on where you live. German carmaker Audi and researchers from MIT teamed up to discover just how stressful driving is and came up with some surprising results. According to their data, driving can be more stressful than eating breakfast, attending a lecture at MIT or, in some instances, almost as stressful as jumping from an airplane. This information was collected as part of a project they call the Road Frustration Index (RFI), a scorecard the partnership used to measure how stressful traffic conditions are in different parts of the US.

To gather this data, the group outfitted an Audi automobile with a host of cameras and sensors to measure the emotional and physical reactions of the driver while driving through different cities. Each car was rigged with three cameras all pointed at the driver, a GPS device to track the route taken, a Kinect sensor to digitize the driver’s responses and a heart monitor on the seatbelt. These devices then captured data as the driver traversed cities like Atlanta, Boston, Houston and Los Angeles. Next, the team took similar stress measurements of participants who ate breakfast in the morning, attended office meetings, sat through lectures at MIT, or participated in extreme sporting events like skydiving.

“The data we received is fascinating. One study showed that getting sideswiped by an oncoming car can be almost as stressful as jumping out of a plane,” said Filip Brabec, the director of product management at Audi America, in a statement.

This is a bit of a fuzzy statement, of course. According to the data presented by MIT, the peaks in stress for a driver in Boston are higher than some of the lower points experienced by a skydiver. The general idea remains the same and has largely been understood, at least anecdotally — driving can be a risky and stressful business.

Going one step further, Audi and MIT took to the American roadways to score some major metropolitan cities to determine where drivers are most stressed. They scored these highways based on four main factors: incidents, sentiment, traffic and weather. Incidents, traffic and weather are easy enough to quantify and measure. To gauge sentiment, however, the team turned to Twitter to see what residents were saying about their respective city’s traffic conditions. By searching the microblogging site for words like “traffic jam” and “stuck” in specific cities, the team was able to come up with the sentiment factor and include it in the overall RFI score.
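
The sentiment factor lends itself to a simple sketch. The example below computes the share of a city’s tweets that mention traffic-frustration terms; the keyword list, sample tweets and 0-100 scaling are assumptions made for illustration, not Audi and MIT’s actual method.

```python
# Hypothetical keyword-based "sentiment" factor: the share of a city's
# tweets that mention traffic frustration, scaled to 0-100.
FRUSTRATION_TERMS = ("traffic jam", "stuck", "gridlock")

def sentiment_factor(tweets):
    hits = sum(any(term in t.lower() for term in FRUSTRATION_TERMS) for t in tweets)
    return 100.0 * hits / max(len(tweets), 1)

sample = ["Stuck on I-93 again", "Nice day for a drive", "Worst traffic jam ever"]
print(round(sentiment_factor(sample), 1))   # 66.7
```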

Using this scorecard, the RFI had no trouble issuing the highest scores for bad traffic to some of America’s metropolitan areas. For instance, notoriously bad traffic spots like Atlanta, Boston, Cleveland and Los Angeles all received a score of 100. The majority of the scores for most cities came from the sentiment and incidents factors, though some areas like LA had plenty of help from the incidents and traffic dimensions as well. The average RFI score for the US sits at 65, though many cities scored lower than that. The three cities with the lowest RFI were Phoenix (23), Charlotte (27) and Baltimore (29). Other large metropolitan areas with surprisingly low scores were Dallas/Fort Worth (29), New York (34) and Houston (35).

The stress of driving certainly isn’t equal to or greater than the stress of throwing yourself from a plane, but based on these results, it certainly affects the body more than we may have previously thought.

Dark Matter May Not Be (Completely) Dark After All, Says Vanderbilt Researcher

John P. Millis, PhD for redOrbit.com — Your Universe Online

For more than 80 years physicists and astronomers have been searching for an elusive form of matter that appears to be responsible for most of the mass in the Universe. While the presence of this dark matter can be measured by the gravitational influence of the particles, directly “seeing” the mass has proven troublesome.

The working theory for decades has been that the candidate particle should be electrically neutral, possessing no electric or magnetic field of its own. Consequently, dark matter would not be expected to interact with light and other electromagnetic fields, which is why we do not detect light scattering after it comes in contact with the matter — the usual way we study things in the Universe.

Many, if not most, models of dark matter predict that the particles would be a special, self-annihilating variety first proposed by Ettore Majorana. Such dark matter particles would convert their mass into radiation when they collide with other dark matter particles. Recent results from the Fermi Space Telescope and the Alpha Magnetic Spectrometer (AMS) detector on board the International Space Station have provided hints that this may, in fact, be happening.

Now, new research is building upon the notion that dark matter may consist of Majorana particles, and suggests that perhaps it can interact electromagnetically — meaning that dark matter may not be ‘dark’ after all.

The candidate particle, suggested by Professor Robert Scherrer and post-doctoral fellow Chiu Man Ho at Vanderbilt University, would still be electrically neutral but would possess a rare type of electromagnetic field known as an anapole. This special type of field is characterized by rings of current around the particle that create a donut-shaped magnetic field contained near the particle boundary. This differs from classical electromagnetic dipole fields, which spread out from the particle.

According to Scherrer, “Most models for dark matter assume that it interacts through exotic forces that we do not encounter in everyday life. Anapole dark matter makes use of ordinary electromagnetism that you learned about in school — the same force that makes magnets stick to your refrigerator or makes a balloon rubbed on your hair stick to the ceiling.”

If this theory is correct, then dark matter particles would in fact be able to interact electromagnetically with other matter — which raises the question, why have we not “seen” such interactions yet?

The reason is that anapole fields derive their strength from the speed of the particle’s motion. Anapole fields around stationary particles interact very weakly, if at all, but can interact quite strongly as their speed increases. Observations of dark matter halos suggest that the particles are very slow moving and clump together, so we would not expect to see strong electromagnetic signals.

Yet this new model provides hope. As Scherrer explains, “the model makes very specific predictions about the rate at which it should show up in the vast dark matter detectors that are buried underground all over the world. These predictions show that soon the existence of anapole dark matter should either be discovered or ruled out by these experiments.”

Crazy Ants Eating Electronics Across US Gulf States

Lawrence LeBlond for redOrbit.com – Your Universe Online

A growing epidemic is eating its way across the Gulf Coast of the US and it seems there is little that can be done about it. The invasive ant species, Nylanderia fulva, commonly known as the tawny crazy ant, hairy crazy ant or Raspberry crazy ant, which was first discovered in Houston, Texas in 2002, is causing huge problems for people in at least four states around the Gulf of Mexico.

The tawny crazy ant is an exotic species native to South America, specifically Argentina and/or Brazil. Since it was first spotted by a pest control worker in Texas in 2002, it has subsequently been spotted in Mississippi, Louisiana and Florida. In Florida, some reports have estimated that N. fulva has been plaguing the Sunshine State since the 1990s, pushing out the crazy ant’s native cousin, N. pubens.

The ant has been named “crazy” for the trail of destruction it leaves in its wake. The tiny (less than an eighth-inch long) insect has been known to eat everything from livestock to electrical equipment. Individually, these specimens may not seem so harmful, but at times millions have been found hiding under rocks, inside computers and elsewhere, devouring everything they touch.

The ant is now known in at least 21 counties in Texas and 20 in Florida, transported to all areas unwittingly by humans. While its bite isn’t known to sting, the ant is highly invasive and has infested homes, RVs, computers, laptops, smartphones, and wildlife across the south.

Edward LeBrun of the University of Texas, Austin, said these invasive pests are displacing fire ants in areas across the southeastern US. They are also reducing diversity and abundance across a range of other ant and arthropod species. He said their spread can be controlled if people are more careful when they travel.

LeBrun, who is a research associate with the Texas invasive species research program at the Brackenridge Field Laboratory in the College of Natural Sciences, published a study on the invasive ant species in the April 2013 issue of Biological Invasions.

People are wishing their fire ants were back, realizing they are relatively calm compared to the voracious crazy ants, according to LeBrun. “Fire ants are in many ways very polite. They live in your yard. They form mounds and stay there, and they only interact with you if you step on their mound.”

Crazy ants, unlike fire ants, “go everywhere.” And theirs is the worst of all ant invasions seen in the US to date.

In the late 1800s, the Argentine ant invaded the US through the port of New Orleans. In 1918, the black imported fire ant showed up in Alabama. In the 1930s, the red imported fire ant arrived and began displacing the black fire ant and the Argentine ant, according to LeBrun.

In the tawny crazy ant’s native habitat — Argentina and Brazil — it is likely that the population is held in check by other ant species and a variety of predators that feed on them. However, in the US there are no such natural predators, and the US native ant species are not as aggressive as those farther south.

Another issue is that these crazy ants are much harder to kill than other ants. Most ant colonies are controlled by setting out poison baits, which foraging ants consume and carry back to kill the colony. But the crazy ant does not fall for this trick. And because the crazy ant does not have any particular colony boundary, even if the ants in one area are killed, the sheer size of the supercolony allows it to fill back in the area that was previously wiped out.

The biggest threat from N. fulva is to electronics. According to ABC News, the crazy ants have caused more than $146.5 million in damages to electrical equipment in a single year in Texas alone.

With electronics, the whole ordeal is rather tragic, not just for the electrical equipment but for the ants as well. When an ant touches a hot wire, it is electrocuted. As it dies, the ant performs what is called gaster flagging, an instinctual move that releases pheromones and lures more of its relatives to the scene. As each of these ants arrives and touches either the dead ant or another hot wire, it too is fried and releases pheromones. Eventually, with so many ants arriving on the scene, circuits get overloaded and short out.

LeBrun said the single biggest role humans can play in keeping these pests from spreading is to check luggage, clothing and other items before traveling. The breeding members of the ant species cannot fly, so the ants are fairly limited in their range — generally traveling less than 650 feet per year.
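
The quoted spread rate puts the role of human transport in perspective. The back-of-the-envelope arithmetic below uses the article’s figure of 650 feet per year and an arbitrary 10-mile example distance.

```python
# Rough arithmetic: at less than 650 feet per year on their own, the ants
# need decades to cover distances that hitchhiking with humans covers in a
# single trip. The 10-mile figure is an arbitrary example distance.
FEET_PER_MILE = 5280
natural_spread_ft_per_year = 650        # upper bound quoted in the article
distance_miles = 10
years = distance_miles * FEET_PER_MILE / natural_spread_ft_per_year
print(round(years))                     # about 81 years to go 10 miles unaided
```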

Cutting down on the number of transplantation events could slow the spread of the insect by decades, said LeBrun. That extra time could be enough to give the ecosystem time to adapt to the insect and researchers more time to develop better control techniques, he added.

“We can really make a difference,” he said, “but we need to be careful, and we need to know more.”

Mysterious Stone Monument Lurks Beneath The Waves Of The Sea Of Galilee

April Flowers for redOrbit.com – Your Universe Online
A number of significant archaeological sites are found along the shores of the Sea of Galilee, located in the North of Israel. While conducting a geophysical survey, a team of researchers from Tel Aviv University found an ancient structure deep beneath the waves of the southern Sea of Galilee as well.
The research team, led by Prof. Shmulik Marco of TAU’s Department of Geophysics and Planetary Sciences, stumbled upon a cone-shaped monument, approximately 230 feet in diameter and 39 feet high. The monument, which weighs an estimated 60,000 tons, was built on dry land approximately 6,000 years ago according to initial findings. It was later submerged under the water.
The study, published in the International Journal of Nautical Archaeology, suggests that the building blocks for the structure were probably brought from more than a mile away and arranged according to a specific construction plan. Dr. Yitzhak Paz of the Antiquities Authority and Ben-Gurion University notes that the site resembles early burial sites in Europe and was likely built in the early Bronze Age. There might be a connection to the nearby ancient city of Beit Yerah, Paz believes. Beit Yerah was the largest and most fortified city in the region.
The initial goal of the survey was to uncover the origins of alluvial pebbles found in this area of the Sea of Galilee. The researchers believed the pebbles were deposited by the ancient Yavniel Creek, a precursor to the Jordan River south of the Sea of Galilee. While using sonar technology to survey the bottom of the lake, the researchers observed a massive pile of stones in the middle of an otherwise smooth basin.
This pile of stones aroused their curiosity. Prof. Marco went diving to learn more, revealing that the pile was not a random accumulation of stones, but a purposefully built structure composed of three-foot-long boulders of basalt, a volcanic rock. The closest deposit of basalt is more than a mile away, leading Marco to believe that the stones were brought to the site specifically for this structure.
The team used the accumulation of sand around its base to estimate the age of the structure. By noting that the base is now six to ten feet below the bottom of the Sea of Galilee due to a natural build-up of sand, and taking into account the rate of accumulation, the team deduced that the monument is several thousand years old.
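The dating logic amounts to dividing the burial depth of the base by an assumed sand accumulation rate. The sketch below uses the article’s six-to-ten-foot figure and a placeholder accumulation rate, since no actual rate is given; it shows only that plausible rates yield ages of thousands of years.

```python
# Rough version of the dating logic: burial depth divided by an assumed
# sand accumulation rate. The 0.5 mm/year rate is a placeholder assumption;
# the article does not give an actual rate.
FT_TO_MM = 304.8
burial_depth_mm = 8 * FT_TO_MM          # midpoint of the quoted 6-10 ft range
assumed_rate_mm_per_year = 0.5          # hypothetical accumulation rate
age_years = burial_depth_mm / assumed_rate_mm_per_year
print(round(age_years))                 # 4877 — thousands of years under these assumptions
```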
To further study the artifact, the research team plans to organize a specialized underwater excavation team. The divers will try to learn more about the structure’s origins, including an investigation of the surface it was built on. Hunting for artifacts surrounding the structure will help to more accurately date the monument and give clues to its purpose and builders. Prof. Marco says the answers might illuminate the geological history of the area as well.
“The base of the structure – which was once on dry land – is lower than any water level that we know of in the ancient Sea of Galilee. But this doesn’t necessarily mean that water levels have been steadily rising,” he says.
The Sea of Galilee is a tectonically active region, meaning the bottom of the lake, and therefore the structure, may have shifted over time. The team intends to investigate further to increase the understanding of past tectonic movements, the accumulation of sediment, and the changing water levels throughout history.
Image caption: An underwater photo shows the structure is made of basalt boulders. Photo: Shmulik Marco.

New Algorithm Solves Cloud Security Issues

Lee Rannals for redOrbit.com – Your Universe Online

MIT researchers have developed a new algorithm that could help make up-and-coming cloud computing technology more secure.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory presented their work on a new encryption scheme for cloud computing at the Association for Computing Machinery’s 45th Symposium on the Theory of Computing.

Homomorphic encryption is a new research topic in cryptography that promises to make cloud computing perfectly secure. With this type of encryption, a Web user would send encrypted data to a server in the cloud, which would then process it without decrypting it and send back a still-encrypted result.
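
The “process without decrypting” idea can be illustrated with a toy example. Textbook (unpadded) RSA is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The sketch below demonstrates only that general property; it is not the MIT functional-encryption scheme, and the tiny key is wildly insecure.

```python
# Toy demonstration of a homomorphic property: with textbook (unpadded) RSA,
# a "server" can multiply two ciphertexts without decrypting them, and the
# result decrypts to the product of the plaintexts. Illustration only; not
# the MIT scheme, and the tiny key offers no real security.
p, q = 61, 53
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (requires Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
c_product = (encrypt(a) * encrypt(b)) % n   # server-side work on ciphertexts only
assert decrypt(c_product) == (a * b) % n
print(decrypt(c_product))                   # 42
```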

However, one shortcoming of this idea arises when a user tries to search data stored on the server. If a user sent a search term to a server to find a specific record, the server would have no choice but to send back information on every record in the database. The MIT team says it has developed a solution to this problem that combines several existing schemes.

The researchers built their functional-encryption scheme by fitting together several existing schemes, each of which has vital attributes of functional encryption, but none of which is entirely sufficient in itself. The new system begins with homomorphic encryption and embeds the decryption algorithm in a garbled circuit, a scrambled representation of a computation that is useless without a secret cryptographic key.

The key to the garbled circuit is protected by attribute-based encryption, which is a public-key system that is reusable but cannot reveal the output of a function without revealing the input. The team said their encryption scheme is layered in such a way that one use grants the server access to a general function rather than a single value.

“It’s an extremely surprising result,” said Ran Canetti, a professor of computer science at Boston University. “I myself worked on this problem for a while, and I had no idea how to do it. So I was wowed. And it really opens up the door to many other applications.”

He said the researchers’ scheme will not be deployed any time soon, but he is sure that “it’s going to lead to more stuff.”

“It’s an enabler, and people will be building on it,” Canetti, who was not a part of the research, said.

Researchers Find No Link Between Vegetable Oil Consumption And Inflammation

[ Watch the Video: Vegetable Oil IS Good for You ]

redOrbit Staff & Wire Reports – Your Universe Online

There is no link between vegetable oil consumption and circulating indicators of inflammation typically associated with heart disease, cancer, asthma and arthritis, researchers from the University of Missouri and the University of Illinois claim in a recently-published study.

Vegetable oils, including those from soy, corn and canola, are high in linoleic acid (LA), an unsaturated omega-6 fatty acid that has been found in previous animal studies to promote inflammation. However, Missouri researcher Kevin Fritsche and his colleagues argue that this is not always the case, as people respond differently to linoleic acid than animals do.

“In the field of nutrition and health, animals aren´t people,” explained Fritsche, an MU professor of animal science and nutrition in the Division of Animal Sciences. “We´re not saying that you should just go out and consume vegetable oil freely. However, our evidence does suggest that you can achieve a heart-healthy diet by using soybean, canola, corn and sunflower oils instead of animal-based fats when cooking.”

According to the researchers, the average American consumes at least three tablespoons of vegetable oil each day. In addition, for the past four decades, linoleic acid has been linked to reduced blood cholesterol levels, and it has also been linked to a lower risk of heart disease. Some experts have warned that Americans could be consuming too much vegetable oil (and essentially, too much LA). Fritsche and his colleagues contradict those claims.

He and Guy Johnson, an adjunct professor of food and human nutrition at the University of Illinois, reviewed 15 different clinical trials involving 500 US adults who consumed vegetable oils and various other forms of fats.

They set out to discover whether LA promotes inflammation in humans, and after reviewing the research, they found no link between diets high in the fatty acid and inflammation in the body.

“Some previous studies have shown that inflammation, which is an immune response in the body, can occur when certain fats are consumed,” Fritsche said. “We´ve come to realize that this inflammation, which can occur anywhere in the body, can cause or promote chronic diseases. We know that animal fats can encourage inflammation, but in this study, we´ve been able to rule out vegetable oil as a cause.”

Fritsche and Johnson report that their findings emphasize the importance of following Institute of Medicine and American Heart Association recommendations regarding the use of vegetable oil when cooking. Ideally, people should consume between two and four tablespoons of vegetable oil daily to reach the necessary amount of linoleic acid needed for a heart-healthy diet, they said.

“Consumers are regularly bombarded with warnings about what foods they should avoid,” explained Fritsche. “While limiting the overall fat intake is also part of the current nutrition recommendations, we hope people will feel comfortable cooking with vegetable oils.”

For more information about the study, please visit: http://cafnrnews.com/2013/05/still-soy-good-for-you/

Complex Trojan Takes Advantage Of Previously Unknown Android Exploit

redOrbit Staff & Wire Reports — Your Universe Online
Security researchers have discovered a new Trojan program that exploits previously undetected flaws in the Android operating system and utilizes techniques more commonly found in Windows malware to remain undetected as it executes rogue commands on infected mobile devices.
The Trojan has been named Backdoor.AndroidOS.Obad.a (Obad.a) by computer security firm Kaspersky Lab, which has dubbed it the most sophisticated piece of Android malware to date, according to Lucian Constantin of IDG News Service. The program makes heavy use of encryption and code obfuscation in an attempt to prevent security software from discovering what it is doing, the antivirus company said.
The program “is designed to send SMS messages to premium-rate numbers and allows attackers to execute rogue commands on infected devices by opening a remote shell,” Constantin explained. “Attackers can use the malware to steal any kind of data stored on compromised devices or to download additional malicious applications that can be installed locally or distributed to other devices over Bluetooth.”
After it sends those text messages, “it deletes replies made to the text. Next, it downloads a file from a remote server and automatically runs it for installation. All Bluetooth-enabled devices in the vicinity can be infected by a unit carrying the malware,” Giancarlo Perlas of The Droid Guy added. “There are many other dangers associated with Obad.a that includes stealing personal information of the user like contacts and financial details.”
Currently, Obad.a is not very widespread, according to Dan Goodin of Ars Technica. In fact, Constantin said that installation attempts for this particular program amounted to just 0.15 percent of the total number of mobile device malware infection attempts over a three-day period.
However, Goodin warns that it does show that it is possible for cybercriminals to develop malware programs that exploit smartphone vulnerabilities. While most viruses and Trojans targeting Android devices are fairly rudimentary in nature, he said, the way that Obad.a can use various connections to spread to nearby phones and allow hackers to issue malicious commands remotely reveals a new level of complexity and sophistication.
“Obad.a exploits two additional undocumented bugs–one in a component known as DEX2JAR and the other related to the AndroidManifest.xml file,” Goodin added. “Those exploits are designed to make it harder for researchers to reverse engineer the malware. The backdoor also has no interface and works in background mode, further complicating analysis by whitehats or competing malware developers.”
The origins of the Obad.a malware are currently unknown, and Kaspersky Lab has not speculated as to who might be running the program, Neil McAllister of The Register said. The security firm reported that it has already contacted Google about the OS vulnerabilities exploited by the Trojan, and that Android security software can now detect the malware.

Human Error, System Glitches Responsible For More Data Breaches Than Hackers

redOrbit Staff & Wire Reports — Your Universe Online
Mistakes, negligence and glitches are more likely to be responsible for computer-related security breaches than cyber attacks, according to a report released last week by Symantec and the Ponemon Institute.
The eighth-annual “Cost of a Data Breach” study, which was conducted by the independent research firm and sponsored by the California-based security software company, found that nearly two-thirds of all data breaches that occurred last year could be chalked up to human error (35 percent) or system glitches (29 percent).
“However, malicious attacks remain the single highest cause of breaches, with 37 percent of the intrusion pie,” John P. Mello Jr. of PCWorld reported on Saturday, adding that the figures “vary by nation… Germany had an almost even split between malicious attacks (48 percent) and negligence/glitches (52 percent). By comparison, more than three-quarters of the breaches (77 percent) in Brazil were blamed on human error/system failures.”
Those mistakes can be costly, according to the study. The average number of records breached per organization was 23,647, ZDNet’s Rachel King said, and the average cost per compromised record ranged from $130 to $136. Data loss was even more expensive in the US and Germany, where the per-record average rose to $199 and $188, respectively, King added.
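As a rough, back-of-the-envelope illustration using the round figures quoted above (the report itself breaks costs down far more finely), multiplying the average breach size by the per-record cost gives a sense of the total exposure per incident:

    records = 23_647                 # average records breached per organization
    per_record_cost = {"global low": 130, "global high": 136, "US": 199, "Germany": 188}
    for label, cost in per_record_cost.items():
        print(f"{label}: ~${records * cost:,} per breach")
    # The US figure, for example, works out to roughly $4.7 million per incident.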
Officials with the Ponemon Institute told eWeek that those costs could be reduced if companies were to implement stricter security management practices. Taking steps such as creating an effective incident response team and hiring a chief information security officer could reduce the cost of network breaches by up to 25 percent, they claim.
The research firm conducted surveys of more than 1,400 people at 277 organizations in nine different countries, including the US, the UK, Germany, France, Australia, India, Italy, Japan, and Brazil.
King said that Brazilian companies were the most likely to experience breaches caused by human error and Indian businesses were more likely to see breaches resulting from system glitches. German firms were most likely to experience issues related to hackers or malicious attacks, followed by Australia and Japan.
“American companies said the greatest increase in data breach costs stemmed from a third-party error or even quick notification to data breach victims, regulators, and other stakeholders. UK companies pointed towards lost and stolen devices as the biggest culprits,” she added. “But US and UK companies saw the greatest reduction in costs when they had strong response plans in place. Furthermore, American and French businesses also saw reduced costs when they enlisted consultants for data breach remediation.”

MESSENGER Team Names Ten Major Fault Scarps On Mercury

April Flowers for redOrbit.com — Your Universe Online

Recently, the MESSENGER Science Team proposed names for 10 rupes on Mercury. The International Astronomical Union (IAU), which has been the arbiter of planetary and satellite nomenclature since 1919, approved the names. In keeping with the theme of naming rupes on Mercury, they all bear the names of ships of discovery.

Rupes is the Latin word for cliff, which perfectly suits the formations on Mercury. They are long cliff-like escarpments that form over major faults. One large block of crust thrusts up and over another along these fault lines. Currently, “rupes” is used by planetary geologists only to describe formations on worlds other than Earth.

“We proposed the name Enterprise Rupes for the longest rupes on Mercury, which is 820 kilometers (510 miles) long. The USS Enterprise was launched in 1874 and conducted the first surveys of the Mississippi and Amazon rivers,” says Michelle Selvans of the Center for Earth and Planetary Studies at the National Air and Space Museum. Selvans led the effort to name this group of rupes.

“We also recommended some fun names, such as Calypso Rupes, for Jacques Cousteau’s ship,” she says. Some names were chosen for their personal connection to the team, such as Palmer Rupes — named after an icebreaker research vessel Selvans sailed on to conduct marine geophysics research off the coast of Antarctica.

The other names accepted are:

  • Alvin Rupes, after DSV Alvin. Built in 1964 as one of the world´s first deep-ocean submersibles, Alvin has made more than 4,400 dives and can reach nearly 63 percent of the global ocean floor.
  • Belgica Rupes, after RV Belgica. Originally designed as a whaling ship in 1884, the steamship was converted to a research ship in 1896 and took part in the Belgian Antarctic Expedition of 1897—1901, becoming the first ship to overwinter in the Antarctic.
  • Carnegie Rupes, after a yacht launched in 1909 as a research vessel. Built almost entirely from wood and other non-magnetic materials to allow sensitive magnetic measurements to be taken for the Carnegie Institution’s Department of Terrestrial Magnetism, the ship spent 20 years at sea. The vessel traveled nearly 300,000 miles and carried out a series of cruises until an onboard explosion in port destroyed the ship in 1929.
  • Duyfken Rupes, after a small Dutch ship built in the late 16th century. In 1606, the vessel sailed from the Indonesian island of Banda in search of gold and trade opportunities on the island of Nova Guinea. The ship and her crew did not find gold, instead they found something more scientifically exciting: the northern coast of Australia.
  • Eltanin Rupes, after the USNS Eltanin, launched in 1957 as a noncommissioned Navy cargo ship. The ship had a double hull and was officially classified as an Ice-Breaking Cargo Ship. In 1962, the ship was refitted to perform research in the southern oceans and reclassified an Oceanographic Research Vessel. Magnetic field measurements made with the Eltanin were critical in validating the hypothesis of sea-floor spreading.
  • Nautilus Rupes, after the Exploration Vessel Nautilus. In service since 1967, the Nautilus has conducted underwater studies in archeology in the Mediterranean and Caribbean seas. The vessel is currently equipped with remotely operated vehicles (ROVs) and a high-bandwidth satellite communication system for remote science and education.
  • Terror Rupes, after the HMS Terror. Built in the early 1800s as a British Royal Navy bomb vessel, the ship was involved in the bombardment of Fort McHenry, one of the last battles of the War of 1812. The bombardment provided the inspiration for Francis Scott Key to write the American national anthem, “The Star-Spangled Banner.” The ship was later retrofitted for polar exploration and participated in Antarctic exploration.

According to Selvans, Mercury´s rupes are revealing a great deal about the planet´s evolution. Each of the rupes formed over a major fault system that, in turn, accommodated kilometers of horizontal shortening of the planet´s crust. The accumulated contraction represented by this shortening collectively records the cooling and contraction of Mercury´s interior over the last four billion years.
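
As a rough illustration of how such shortening estimates work, the horizontal shortening accommodated by a simple planar thrust fault can be approximated from the scarp’s vertical relief and the dip of the underlying fault. The values below are assumed for illustration only, not MESSENGER measurements:

    import math

    def horizontal_shortening(relief_km, fault_dip_deg):
        """Approximate horizontal shortening across a simple planar thrust fault."""
        return relief_km / math.tan(math.radians(fault_dip_deg))

    # Assumed, illustrative values: ~1.5 km of scarp relief and a 30-degree fault dip
    print(f"~{horizontal_shortening(1.5, 30):.1f} km of shortening")   # roughly 2.6 km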

To decide which rupes to name, the team chose the longest and most geologically interesting features imaged by MESSENGER. “These features are easy to identify in images taken at dawn and dusk, when they throw shadows along their entire length,” Selvans says. “A crisp shadow that is only about 1 kilometer wide but hundreds of kilometers long really stands out in images.”

Since 1976, 27 rupes have been named on Mercury. The ten new names are the first new designations for rupes in more than five years.

“The MESSENGER team is grateful to the IAU for their approval of formal names for rupes on Mercury,” adds MESSENGER Principal Investigator Sean Solomon of Columbia University’s Lamont-Doherty Earth Observatory. “MESSENGER observations have revealed that these deformational features accommodated far more crustal contraction than indicated by earlier estimates. The new names will permit the MESSENGER team to document this finding in a clear and straightforward manner. Moreover, the names give us the opportunity to recognize that the exploration of Earth´s oceanic regions continues in parallel with the exploration of Earth´s sister planets.”

Study Gives Hope For A Potentially New PTSD Treatment

Lee Rannals for redOrbit.com — Your Universe Online

Researchers writing in the journal Science Translational Medicine say they have discovered a potential new treatment for posttraumatic stress disorder (PTSD).

The team from Emory University, the University of Miami and the Scripps Research Institute discovered a compound called SR-8993 that was able to reduce PTSD-like symptoms in mice after they were exposed to stress. The discovery could pave the way for a treatment given to people shortly after they experience a traumatic event.

“At first glance, one might infer that the main mechanism by which morphine is working is through pain reduction, but our results lead us to think it could also be affecting the process of fear learning,” says senior author Kerry Ressler, MD, PhD, professor of psychiatry and behavioral sciences at Emory University School of Medicine and Yerkes National Primate Research Center.

The compound hits one of several molecular buttons in the brain that are pushed by opioid drugs like morphine or oxycodone. SR-8993 was originally developed by scientists at Miami and Scripps to help treat alcohol and drug addiction, and so far the team has not seen any narcotic or addictive effects.

“We hypothesized that the fear and anxiety component of addiction relapse may be related, in terms of brain chemistry, to the anxiety felt by PTSD patients,” says co-author Thomas Bannister, PhD, associate director of translational research and assistant professor of medicinal chemistry at Scripps Research Institute in Florida.

The scientists were looking at which genes are activated in the brains of mice after they are exposed to stress. They said they were specifically looking for changes in the amygdala, the region of the brain known to be involved in regulating fear responses. Mice exposed to stress become more anxious and tend to freeze in fear even when there is no “danger,” behavior similar to that seen in humans suffering from PTSD.

The researchers found that exposure to stress particularly affects regulation of the gene Oprl1 in the amygdala. The protein encoded by Oprl1 is part of a family of opioid receptors, which allow brain cells to receive signals from opioid drugs as well as natural compounds produced by the body. The team developed SR-8993 as a compound that activates Oprl1 more than other opioid receptors, thus avoiding narcotic and addictive effects. When they gave the compound to the mice, it impaired “fear memory consolidation.”

Mice in the study were still able to become afraid of sounds and shocks, but the fearful memories were less durable and the mice did not freeze as much in response to the sound alone two days later.

“We think SR-8993 is helping to promote a natural process that occurs after trauma, preventing fear learning from becoming over-represented and generalized,” Ressler says. “Our model is that in PTSD, the Oprl1 system is serving as a brake on fear learning, but that brake is not working if prior trauma had occurred.”

The team said people with a variation of the Oprl1 gene who experienced childhood abuse tend to have stronger PTSD symptoms. They also have more difficulty discriminating between “danger” and “safety” signals in experiments when they hear startling noises.

“While many hurdles remain for SR-8993 or a related compound to become a drug used to prevent PTSD, these results are important first steps in understanding how such treatments may be effective,” Bannister says.

This research comes right in time for National PTSD Awareness Month. According to the Oakland County Community Mental Health Authority, one in three troops returning from active duty is diagnosed with serious PTSD. Current treatment options for the condition include different forms of therapy, counseling and medication.

You Can Share Your Xbox One Games (Sort Of)

Michael Harper for redOrbit.com — Your Universe Online
We’ve come a long way from the days of Atari and Nintendo gaming. The graphics have greatly improved, titles are released with the fanfare normally reserved for summer blockbusters, and consoles have taken on multiple roles, serving as entertainment hubs rather than single-use devices. Microsoft’s new Xbox One is a prime example of how advanced game consoles have become. Yet as Microsoft takes gaming into the future, it is also applying seemingly arbitrary restrictions to its games. While gamers will be allowed to trade in and resell their disc-based games (it had previously been rumored this would be banned), Microsoft is leaving this practice in the hands of the game developers. In an article explaining these new policies, Microsoft also said gamers could give their disc-based games to friends, but only those friends Microsoft approves of.
The Xbox One allows users to play games in one of two ways. First, users can buy a physical disc online or from a brick-and-mortar store in the same way it´s always been done up to this point. Gamers can also buy digital copies of these games and have them instantly installed on their One´s hard drive. This combination allows gamers to play their games without switching discs. This also allows players to sign in on a friend´s console and play their purchased games, even if their friend hasn´t bought the title.
From the looks of it, this is about as close as Microsoft will get to allowing friends the ability to share their games with one another.
Though the company says they won´t charge a fee for those who want to sell or trade their disc-based games, they have left it up to the developers to decide if trading or selling will be permitted.
“Today, some gamers choose to sell their old disc-based games back for cash and credit,” reads Microsoft´s explanation of these new policies.
“We designed Xbox One so game publishers can enable you to trade in your games at participating retailers.  Microsoft does not charge a platform fee to retailers, publishers, or consumers for enabling transfer of these games.”
Microsoft has made their own decision concerning giving the games away. Gamers with an Xbox One and a generous heart will only be able to give their disc-based games to those who have been on their friends list for 30 days or more. Additionally, these games can only be given once, though it´s not entirely clear if this means the transaction can only occur once between friends or if the game itself can only change hands one time.
In other words, Microsoft will allow you to give away a disc-based game, but only once and only to those they believe are true friends.
It would have been nigh impossible to enforce this kind of restriction on previous consoles, but the One’s “always on” behavior acts as a tattletale for those who try to circumvent these policies. Though earlier rumors claimed the new Xbox would need a constant Internet connection to report back to Microsoft’s servers, the company has said the One only needs to connect to the Internet once every 24 hours. If a console doesn’t communicate with home base within that time, it will not perform its primary function: playing games.
“Offline gaming is not possible after these prescribed times until you re-establish a connection, but you can still watch live TV and enjoy Blu-ray and DVD movies,” said Microsoft in a statement.
Microsoft will allow users to share their games with ten friends, but this is about as close as it gets to the old days of freely swapping and selling cartridges and discs.

A Healthy Neighborhood Can Lower The Obesity Level Of Its Residents

redOrbit Staff & Wire Reports – Your Universe Online

Neighborhoods that include restaurants and businesses that support healthy eating choices can make a “measurable” difference in the battle against obesity, according to a new study led by researchers at the Drexel University School of Public Health.

Dr. Amy Auchincloss, an assistant professor at the Philadelphia-based institution, and her colleagues conducted a five-year study analyzing the impact that a neighborhood could have on an individual´s health.

They found that “significantly” fewer people became obese when they lived within a mile of healthier food environments compared to those without access to such places. Previous studies have demonstrated that healthier, less obese men and women are more likely to live in neighborhoods with access to supermarkets and fresh foods and, to a lesser extent, in neighborhoods that are walkable.

“Interpretation of results from cross-sectional analyses is limited, since that type of study can’t determine whether weight gain preceded the neighborhood exposure,” explained Auchincloss.

She and co-authors from the University of Michigan School of Public Health, the University of California Berkeley, the Johns Hopkins School of Medicine and Gramercy Research Group detail their findings in a recent edition of the journal Obesity.

The researchers claim that their research improves upon previous studies of neighborhood risk factors and obesity, as they selected participants who were not obese, and tracked which subjects gained considerable amounts of weight during a five-year follow-up period. They also took into account individual factors which could influence both a person´s health status and the choice of which neighborhood he or she resides in.

Auchincloss and her colleagues analyzed the health data of over 4,000 adults living in six different US cities. The study participants were followed over the course of five years as part of a larger Multi-Ethnic Study of Atherosclerosis, and answered questions regarding the areas that surrounded their homes. Those questions involved how easy it was to walk in the neighborhood, and to what extent healthy foods were available nearby.

Over the course of the study, approximately 10 percent (406) of the study participants became obese. Healthy food environments were associated with lower obesity levels, even after accounting for age, gender, income, education, and other factors. The walkability of the neighborhood was also linked to lower obesity, though the researchers note that this association was not independent of a healthy food environment.

“Healthy food environments and walkability are often correlated in urban areas which is why it can be hard to assess their independent effects,” Auchincloss said.

“Programs including farmer´s markets and subsidies for fresh food vendors to locate in disadvantaged areas, are the types of adaptations cities and towns can make to create healthier communities — without putting the burden on individuals to have to move to a new neighborhood in order to adopt a healthier lifestyle,” she added.

Coral Recovery In Light Of Cyclone Yasi Shows Promise

April Flowers for redOrbit.com – Your Universe Online
The coral reefs on Australia’s Great Barrier Reef were devastated by Cyclone Yasi, a Category 5 tropical cyclone that made landfall in Queensland, Australia, on February 3, 2011.
A new study from the ARC Center of Excellence for Coral Reef Studies (CoECRS) shows that large numbers of coral larvae replenished the reefs within nine months of the cyclone. The findings, published in PLoS ONE, provide fresh hope for the ability of the world´s coral reefs to recover from destructive storms.
Cyclone Yasi all but obliterated the corals on exposed reefs when its 32-foot waves and 177 mph winds hit Queensland´s Palm Islands, according to Dr Vimoksalehi Lukoschek of CoECRS and James Cook University.
Lukoschek and colleagues have dived on the devastated region over the last two years, studying the extent of the damage and the potential for recovery.
“Before the storm, exposed reefs were covered in hard corals like the branching Acropora, arguably the most important group of reef-building corals on the Great Barrier Reef,” Lukoschek says. “The destructive effects of the cyclone reduced overall coral cover on exposed reefs to less than 2 percent, and Acropora accounted for less than 1 percent of coral cover.”
“Basically, cyclone Yasi removed all adult colonies of Acropora and only a few very small juveniles survived the cyclone. What we witnessed was absolute devastation, previously healthy reefs almost completely devoid of any live coral,” adds Lukoschek.
The reefs of the Palm Islands were largely sheltered and escaped the majority of the storm´s damage. Coral cover was around 25 percent after the storm, which was similar to the coverage before. The sheltered reefs, most importantly, had large numbers of adult colonies of Acropora following the storm. These colonies can produce larvae to replenish devastated reefs.
The scientists found that areas devastated by Cyclone Yasi had been partly overgrown with algae a year after the storm. However, areas that had escaped undamaged remained algae free and coral-dominated.
Most importantly, the study found high levels of coral larval recruitment on the storm-damaged exposed reefs following the first mass-spawning event after the cyclone. According to Lukoschek, this is good news because “it essentially means that reefs that were completely devoid of reproductively mature adult corals, which are needed to produce larvae, were being replenished by coral larvae from reefs that had not been impacted by the cyclone.”
The team is conducting ongoing genetic research to determine which reefs these coral larvae came from. “Nonetheless, regardless of where they came from, the rapid replenishment of devastated reefs by large numbers of new recruits, combined with the juvenile corals that survived the cyclone, suggests that the recovery process is underway.”
“The take home message from our study for the GBR is that although cyclones can have a major destructive impact on coral cover, these impacts tend to be patchy and coral larvae coming in from less impacted sites has the potential to reseed impacted reefs leading to their recovery,” says Lukoschek.
“Our research indicates that corals can recover if given a chance to do so. Nonetheless, if the recovery process is disrupted by cyclones occurring in quick succession, or by other disturbances, such as coral bleaching or starfish outbreaks, or hindered by chronic stressors, such as poor water quality, pollution or disease, then coral populations may fail to recover,” Lukoschek concludes.

Breastfeeding Babies Boosts Language, Cognition And Emotion

redOrbit Staff & Wire Reports – Your Universe Online

By the age of two, babies who were breastfed exclusively for at least three months showed enhanced development in parts of the brain responsible for language, cognition and emotional function compared with babies who were given at least some formula, according to a new study.

A team of researchers used special child-friendly “quiet” magnetic resonance imaging (MRI) machines to study brain growth in children under the age of four. They discovered that breastfeeding alone produced better development in key areas of the brain than a combination of breast milk and baby formula, which in turn produced better results than the use of formula without breast milk. The results of their study are detailed online in the journal NeuroImage.

Previous research has produced similar results, correlating breastfeeding with improved cognitive outcomes in older adolescents and adults, the researchers said. However, they say that this is the first imaging study to search for differences associated with breastfeeding in the brains of very young, healthy children.

“We wanted to see how early these changes in brain development actually occur,” explained Sean Deoni, assistant professor of engineering at Brown University, the head of the institution´s Advanced Baby Imaging Lab, and lead author of the study. “We show that they´re there almost right off the bat.”

Using a special MRI technique that he developed, Deoni analyzed the microstructure of the brain´s white matter — the tissue which contains long nerve fibers and helps different areas of the brain communicate with one another. He and his colleagues specifically looked for myelin, the fatty material which insulates nerve fibers and speeds electrical signals as they travel throughout the brain.

Deoni´s team discovered that the exclusively breastfed group experienced the fastest growth of myelinated white matter of the three different groups, experiencing a “substantial” increase in white matter volume by the age of 24 months. The group which was fed both breastmilk and formula experienced less growth than the breastmilk-only group, but more than those who were fed only formula.

“We´re finding the difference [in white matter growth] is on the order of 20 to 30 percent, comparing the breastfed and the non-breastfed kids. I think it´s astounding that you could have that much difference so early,” Deoni said. “I think I would argue that combined with all the other evidence, it seems like breastfeeding is absolutely beneficial.”

To verify the results of the MRI scans, Deoni and his associates asked older children to complete a series of basic cognitive tests. Those exams revealed “increased language performance, visual reception, and motor control performance in the breastfed group,” the university reported.

“The study also looked at the effects of the duration of breastfeeding,” they added. The authors compared babies who had been breastfed for more than one year with those who had been breastfed for less than 12 months, and found “significantly enhanced brain growth in the babies who were breastfed longer — especially in areas of the brain dealing with motor function.”

Bioengineered Blood Vessels Successfully Implanted In Patient

Brett Smith for redOrbit.com – Your Universe Online

Surgeons at Duke University have announced the first successful implantation of a bioengineered blood vessel in the United States.

“This is a pioneering event in medicine,” said Dr. Jeffrey H. Lawson, a vascular surgeon and vascular biologist at Duke Medicine who helped develop the technology. “It´s exciting to see something you´ve worked on for so long become a reality.”

“We talk about translational technology — developing ideas from the laboratory to clinical practice — and this only happens where there is the multi-disciplinary support and collaboration to cultivate it,” he added.

The Duke team is not the first ever to perform the operation, as clinical trials for bioengineered veins began in Poland in December. The Food and Drug Administration (FDA) recently agreed to allow a phase 1 trial involving 20 kidney dialysis patients in the United States. The concept of bioengineering veins gained momentum in 2011, when an East Carolina University team announced it had successfully grown a bioengineered blood vessel.

Trials are initially set for easily accessible sites in hemodialysis patients, but researchers ultimately aim to develop a similar technique for heart bypass surgeries, which are performed nearly 400,000 times annually in the United States. “We hope this sets the groundwork for how these things can be grown, how they can incorporate into the host, and how they can avoid being rejected immunologically,” Lawson said. “A blood vessel is really an organ — it´s complex tissue. We start with this, and one day we may be able to engineer a liver or a kidney or an eye.”

Dr. Laura Niklason, a former Duke faculty member and co-founder of the medical spin-off company Humacyte, began working on the technology in animal models as a postdoctoral researcher and eventually worked to develop it for humans.

“The bioengineered blood vessel technology is a new paradigm in tissue engineering,” Niklason said. “The fact that these vessels contain no living cells enables simple storage onsite at hospitals, making them the first off-the-shelf engineered grafts that have transitioned into clinical evaluation.”

The surgeons start the process with a biodegradable mesh that serves as a scaffolding for the vein. After the mesh is seeded with smooth muscle cells, it gradually dissolves as the cells grow in a medium of various nutrients. After a couple of months, a life-like vein is produced.

An earlier technique involved using the patient´s cells to seed the scaffolding and prevent their own body from rejecting the implant. However, that process was too time consuming, ruling out the possibility of mass production.

The refined technique uses donated human tissue to seed the blood vessel matrix. The resulting vein is washed with a special solution to rinse out any cells that might trigger an immune response. “At the end of the process, we have a non-living, immunologically silent graft that can be stored on the shelf and used in patients whenever they need it,” Niklason said. “Unlike other synthetic replacements made of Teflon or Dacron, which tend to be stiff, our blood vessels mechanically match the arteries and veins they are being sewn to. We think this is an advantage.”

When the novel vessels are placed into animals, they actually adopt the cellular properties of a blood vessel, eventually becoming indistinguishable from the tissue around them. “They are functionally alive,” Lawson said. “We won’t know until we test it if it works this way in humans, but we know from the animal models that the blood travels through the blood vessels and they have the natural properties that keep the blood cells healthy.”

Common Genetic Disease Linked To Father’s Age

Genetic mutation of a testis stem cell actually gives the disease an edge, making older fathers more likely to pass it along to their children

Scientists at USC have unlocked the mystery of why new cases of the genetic disease Noonan Syndrome are so common: a mutation that causes the disease disproportionately increases a normal father´s production of sperm carrying the disease trait.

When this Noonan syndrome mutation arises in a normal sperm stem cell it makes that cell more likely to reproduce itself than stem cells lacking the mutation. The father then is more likely to have an affected child because more mutant stem cells result in more mutant sperm. The longer the man waits to have children the greater the chance of having a child with Noonan syndrome.
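
A toy model, with assumed numbers rather than the USC team’s actual computation, makes the logic concrete: give a single mutant stem cell a small self-renewal advantage in a fixed-size pool, and the fraction of mutant sperm, and with it the chance of an affected child, climbs steadily with paternal age.

    POOL_SIZE = 100_000     # sperm stem cells in the model testis (assumed)
    ADVANTAGE = 0.05        # 5 percent per-year growth edge for the mutant clone (assumed)

    mutant, normal = 1.0, POOL_SIZE - 1.0   # a single mutant stem cell arises at age 20
    for age in range(20, 61, 10):
        share = mutant / (mutant + normal)
        print(f"age {age}: ~{share:.4%} of sperm carry the mutation")
        mutant *= (1 + ADVANTAGE) ** 10     # the clone outgrows its neighbors over the next decade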

Noonan Syndrome is among the most common genetic diseases with a simple inheritance pattern. About one of every 4,000 live births is a child with a new disease mutation. The disease can cause craniofacial abnormalities, short stature, heart defects, intellectual disability and sometimes blood cancers.

By examining the testes from 15 unaffected men, a team led by USC molecular and computational biologists Norman Arnheim and Peter Calabrese found that the new mutations were highly clustered in the testis, and that the overall proportion of mutated stem cells increased with age. Their computational analysis indicated that the mutation gave a selective edge over non-mutated cells.

“There is competition between stem cells with and without the mutation in each individual testis,” said Arnheim, who has joint appointments at the USC Dornsife College of Letters, Arts and Sciences and the Keck School of Medicine of USC. “But what is also unusual in this case is that the mutation which confers the advantage to testis stem cells is disadvantageous to any offspring that inherits it.”

The new findings also suggest an important new molecular mechanism to explain how certain genetic disease mutations can alter sperm stem cell function leading to exceptionally high frequencies of new cases every generation.

The Arnheim and Calabrese team included USC postdoctoral research associates Song-Ro Yoon and Soo-Kung Choi, graduate student Jordan Eboreime and Dr. Bruce D. Gelb of the Icahn School of Medicine at Mount Sinai in New York City. A paper detailing their research will be published on June 6 in The American Journal of Human Genetics.

This research was supported by the National Institute of General Medical Sciences grant number R01GM36745 and the National Heart, Lung and Blood Institute (National Institutes of Health) grant number HL071207.

Devastating Symptoms Of Tay-Sachs Disease May Be Reduced With Readily-Available Drugs

McMaster University

A team of researchers has made a significant discovery which may have a dramatic impact on children stricken with Tay-Sachs disease, a degenerative and fatal neurological condition that often strikes in the early months of life.

Available drugs may dramatically ease a child’s suffering, say scientists.

“There is hope for this disease,” says Suleiman Igdoura, lead researcher of the study and an associate professor of biology at McMaster University. “Imagine what that could mean for parents who have a child diagnosed with this incurable condition, who may have only a few years with their child.”

Tay-Sachs is a genetic disorder caused by the absence of vital enzymes which are involved in the breakdown of waste within cells. Without those enzymes, waste accumulates and eventually destroys healthy cells, leading to blindness, paralysis, mental retardation and eventually death.

Igdoura and his team have found that when a key protein in the brain, known as TNFa, is removed, some of the devastating symptoms of Tay-Sachs and its close relative, Sandhoff disease, become much less severe in mice. Those symptoms include spasms, muscle wasting and loss of neurological function.

The findings are significant because the protein can be managed by FDA-approved drugs, readily available on the market.

“With Tay-Sachs and Sandhoff, we have very little to offer families in terms of therapeutics to help their children,” says Igdoura.

“These are orphan diseases where there are not many medications available. But we feel this is a significant step in improving quality of life and quite possibly extending lives.”

Children who are diagnosed with Tay-Sachs early in life typically die before they reach the age of four or five.

“There are distinct stages within the disease, so we wanted to find targets we could interfere with, to delay the terrible outcome or halt it altogether,” explains Igdoura.

Using mice which were genetically altered to mimic Tay-Sachs and Sandhoff, researchers found levels of TNFa rose significantly during the early stages of the diseases.

But when TNFa was subsequently removed, there was a significant improvement in the mice’s lifespan and neurological function.

“We also found that neurons didn’t die as early as they do with the disease, so we delayed the progression as well,” says Igdoura. “We have identified a molecule that is the culprit and we believe there are drugs available to stop it.”

The research is published online in the journal Human Molecular Genetics.

Tiny Bubbles In Your Metallic Glass Not Cause For Celebration

Johns Hopkins University

Bubbles in a champagne glass may add a festive fizz to the drink, but microscopic bubbles that form in a material called metallic glass can signal serious trouble. In this normally high-strength material, bubbles may indicate that a brittle breakdown is in progress.

That´s why Johns Hopkins researchers used computer simulations to study how these bubbles form and expand when a piece of metallic glass is pulled outward by negative pressure, such as the suction produced by a vacuum. Their findings were published recently in the journal Physical Review Letters.

“A lot of people are interested in metallic glasses because of their strength and their potential use to make better cell phone cases, computer housings and other products,” said Michael L. Falk, who supervised the research. “But what precisely causes these materials to break apart or ‘fail’ has remained a mystery. By studying the behavior of the bubbles that appear when these glasses crack, we were able to learn more about how that process occurs.”

When glass is mentioned, many people think of window panes. But to scientists, a glass is a material that is cooled quickly from a liquid to a solid so that its atoms do not arrange themselves into orderly crystal lattices, as most metals do. A nearly random arrangement of atoms gives glasses distinctive mechanical and magnetic properties. Unlike window panes, most metallic glasses are not transparent or easy to break, but they do often spring back to their original shape after being bent. Still, when powerful enough force is applied, they can break.

“Our lab team is interested in learning just how susceptible metallic glasses are to fracturing, how much energy it takes to create a crack,” said Falk, a professor in the Whiting School of Engineering´s Department of Materials Science and Engineering. “We wanted to study the material under conditions that prevail at the tip of the crack, the point at which the crack pulls open the glass. We wanted to see the steps that develop as the material splits at that location. That´s where dramatic things happen: atoms are pulled apart; bonds are broken.”

At the site where this breakup begins, a vacant space–a bubble–is left behind. The spontaneous formation of tiny bubbles under high negative pressures is a process known as cavitation. The researchers in Falk´s lab discovered that cavitation plays a key role in the failure, or breakdown, of metallic glasses. “We´re interested in seeing the birth of one of these bubbles,” he said. “Once it appears, it releases energy as it grows bigger, and it may eventually become big enough for us to see it under a microscope. But by the time we could see them, the process through which they had formed would be long over.”

Therefore, to study the bubble´s birth, Falk´s team relied on a computer model of a cube of a metallic glass made of copper and zirconium, measuring only about 30 atoms on each side. By definition, a bubble appears as a cavity in the digital block of metallic glass, with no atoms present within that open space. “Through our computer model experiments, we wanted to see if we could predict under what conditions these bubbles can form,” Falk said.

The simulations revealed that these bubbles emerge in a way that is well predicted by classical theories, but that the bubble formation also competes with attempts by the glass to reshuffle its atoms to release the stress applied to a particular location. That second process is known as a shear transformation. As the glass responds to pressure, which of the two processes has the upper hand–bubble formation or shear transformation–varies, the researchers found. For example, they determined that bubbles dominate in the presence of  high tensile loads, meaning the strong pulling forces that are more common near the tip of a crack. But when the pulling forces were at a low level, the atom reshuffling process prevailed.
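
The “classical theories” referred to here are essentially classical nucleation theory, in which the cost of opening a bubble is a competition between the energy spent creating new surface and the work done by the negative pressure. A short sketch with assumed, illustrative numbers (not values from the Johns Hopkins simulations) shows why bubble formation becomes dramatically easier as the tensile load rises:

    import math

    gamma = 1.0              # surface energy of the glass, J/m^2 (assumed)
    kT = 1.38e-23 * 600      # thermal energy at an assumed 600 K, in joules

    for dP_GPa in (2.0, 3.0, 4.0):
        dP = dP_GPa * 1e9                                  # magnitude of the negative pressure, Pa
        r_crit = 2 * gamma / dP                            # critical bubble radius
        barrier = 16 * math.pi * gamma**3 / (3 * dP**2)    # energy barrier to nucleate a bubble
        print(f"{dP_GPa:.0f} GPa: r* = {r_crit * 1e9:.1f} nm, barrier ~ {barrier / kT:,.0f} kT")

Because the barrier falls with the inverse square of the pressure, a modest increase in tensile load makes cavitation far more likely, which is consistent with bubbles dominating near a crack tip while shear transformations win out at lower loads.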

Falk and his colleagues hope their findings can help scientists who are developing new metallic glass alloys for products that can take advantage of the material´s high strength and elasticity, along with its tendency not to shrink when it is molded to a particular shape. These characteristics are prized, for example, by makers of cell phones and computers. Producers of such products have expressed interest in metallic glass, and the Falk team´s research may help them develop new metallic glass alloys that are less likely to break.

“Our aim is to incorporate our findings into predictive models of failure for these materials,” Falk said, “so that they can be optimized and used in applications that require materials that are both strong and fracture-resistant.”

The lead author of the Physical Review Letters article was Pengfei Guan, a postdoctoral fellow in Falk’s lab. Along with Falk, the co-authors were Shuo Lu, Michael J. B. Spector and Pavan K. Valavala, who were all part of Falk’s lab team at the time the research was conducted. The work was supported by National Science Foundation Grant No. DMR0808704.

Multiple Sclerosis Breakthrough Finishes Phase 1 With Flying Colors

Lee Rannals for redOrbit.com — Your Universe Online

Northwestern Medicine researchers say they have made a breakthrough that could help people suffering from multiple sclerosis (MS).

The researchers completed the first phase of a clinical trial of the first treatment designed to reset the immune system of MS patients. The trial showed the therapy was safe and dramatically reduced patients’ immune reactivity to myelin by 50 to 75 percent.

In MS, the immune system attacks and destroys myelin, the insulating layer that forms around nerves in the spinal cord, brain and optic nerve. When the insulation is destroyed, electrical signals cannot be conducted effectively, resulting in symptoms that range from mild numbness to paralysis or blindness.

“The therapy stops autoimmune responses that are already activated and prevents the activation of new autoimmune cells,” said Stephen Miller, the Judy Gugenheim Research Professor of Microbiology-Immunology at Northwestern University Feinberg School of Medicine. “Our approach leaves the function of the normal immune system intact. That’s the holy grail.”

During the trial, the team used MS patients’ own white blood cells to stealthily deliver billions of myelin antigens into their bodies so their immune systems would recognize them as harmless and develop tolerance to them. Current therapies for MS suppress the entire immune system, which makes patients more susceptible to everyday infections and higher rates of cancer.

The study showed that patients who received the highest dose of white blood cells had the greatest reduction in myelin reactivity.

According to the researchers, the primary goal of the study was to demonstrate that the treatment was safe and tolerable. The study accomplished this by showing that intravenous injection of myelin antigens caused no adverse effects in MS patients.

The study sets the stage for a phase two trial to determine if the treatment can prevent the progression of MS in humans. Scientists are trying to raise $1.5 million to launch the trial, which has already been proven in Switzerland.

“In the phase 2 trial we want to treat patients as early as possible in the disease, before they have paralysis due to myelin damage,” Miller said. “Once the myelin is destroyed, it’s hard to repair that.”

This therapy may not only be useful for treating MS, but could also help other autoimmune and allergic diseases by simply switching the antigens attached to the cells.

This research comes a week after “World MS Day” took place in over 40 countries around the globe. This day was a worldwide collaborative awareness campaign to build understanding of MS, which is the most common neurological disease affecting young adults.

Need A Million Bucks? Solve Beal’s Conjecture

Brett Smith for redOrbit.com – Your Universe Online

Looking for a way to make $1 million? All you need to do is solve a math equation that has been boggling the minds of the world´s greatest mathematicians for over 20 years.

Beal’s Conjecture, represented by A^x + B^y = C^z, is named after Andrew Beal, the same man who is offering the seven-figure reward to anyone who can prove that if A^x + B^y = C^z, where A, B and C are positive integers and x, y and z are positive integers greater than 2, then A, B and C must have a common factor.
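
A brute-force search, sketched below in Python, shows what a counterexample would have to look like: a solution of the equation with all three exponents above 2 in which A, B and C share no common factor. Over the small range tried here it finds none, which is consistent with, though of course very far from proving, the conjecture.

    from math import gcd
    from itertools import product

    MAX_BASE, MAX_EXP = 30, 6
    # Precompute perfect powers C**z with z > 2 so candidate sums can be looked up quickly.
    powers = {c ** z: (c, z)
              for c in range(1, MAX_BASE) for z in range(3, MAX_EXP + 1)}

    for a, x, b, y in product(range(1, MAX_BASE), range(3, MAX_EXP + 1), repeat=2):
        hit = powers.get(a ** x + b ** y)
        if hit and gcd(gcd(a, b), hit[0]) == 1:
            print("counterexample:", (a, x), (b, y), hit)
            break
    else:
        print("no counterexample with bases below", MAX_BASE, "and exponents up to", MAX_EXP)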

The conjecture was first proposed in 1993 while Beal was working on Fermat´s Last Theorem. He noted that both equations are “easy to say, but extremely difficult to prove.”

“Increasing the prize is a good way to draw attention to mathematics generally and the Beal Conjecture specifically,” said Beal. “I hope many more young people will find themselves drawn into the wonderful world of mathematics.”

Currently working as a banker in Dallas, Beal first offered up a $5,000 prize to anyone who could perform the proof back in 1997. He has increased the reward several times over the years without a solution being found. The $1 million prize is a ten-fold upgrade from Beal´s last offer of $100,000.

“I was inspired by the prize offered for proving Fermat,” said the self-taught mathematician who professes an affinity for number theory.

Andrew Wiles and Richard Taylor solved Fermat’s Last Theorem in 1995, more than 350 years after it was first posed, and collected around $50,000 for their work. French mathematician Pierre de Fermat claimed he had a proof more than 300 years ago, but did not leave an adequate record of it.

Instead of being relegated to the arcane fields of mathematics and number theory, Fermat’s Last Theorem has enjoyed quite a bit of referencing in popular culture. One episode of Star Trek: The Next Generation showed the 24th-century Captain Jean-Luc Picard searching for a solution to the fabled theorem. That episode aired in 1989, six years before the proof was found.

To earn the prize money for Beal´s equation, participants have two years to present either a solution or counterexample. The proposed solution must be published in a respected mathematics journal, while the counterexample is subject to independent confirmation, the American Mathematical Society (AMS) said in a statement.

While Beal’s Conjecture has been with us for about two decades, it is by no means the oldest unsolved mathematical problem. That honor belongs to Goldbach’s Conjecture, which was posed by the eponymous mathematician in 1742, according to the Guinness Book of World Records. That conjecture asserts that every even integer greater than two (4, 6, 8 and so on) is the sum of two prime numbers.
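
The statement is easy to spot-check by machine for small cases; the sketch below is a check, not a proof, and the conjecture itself remains open.

    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

    for even in range(4, 1001, 2):
        assert any(is_prime(p) and is_prime(even - p) for p in range(2, even)), even
    print("every even number from 4 to 1,000 is the sum of two primes")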

At its current level, the prize for Beal’s Conjecture isn’t the biggest cash prize ever offered for solving a math problem. In 2000, the Clay Mathematics Institute offered seven $1 million awards for the solutions to seven separate math problems. One of those problems was solved by the Russian mathematician Grigori Perelman, who turned down the prize money in 2010.

The AMS is tasked with determining which mathematics publications meet the standards for publishing the Beal´s solution and is currently holding the prize funds in a trust.

Deep Sea Trash Accumulating Up To 4,000 Meters Below Surface, Says Study

Brett Smith for redOrbit.com – Your Universe Online

For years, people have known about the amount of human-generated trash that ends up in the ocean, but a new study in the journal Deep-Sea Research I: Oceanographic Research Papers showed just how deep our detritus sinks, particularly in the waters around Monterey, California.

“We were inspired by a fisheries study off Southern California that looked at seafloor trash down to 365 meters,” said lead author Kyra Schlining, a senior research technician at the Monterey Bay Aquarium Research Institute (MBARI).

Scientists from the MBARI examined 18,000 hours of underwater video collected by cameras on the institute´s remotely operated underwater vehicles in search of man-made debris along the ocean floor.

“We were able to continue this search in deeper water — down to 4,000 meters,” Schlining said. “Our study also covered a longer time period, and included more in situ observations of deep-sea debris than any previous study I’m aware of.”

MBARI researchers had previously used the videos to identify objects and animals and record them in the institute’s Video Annotation and Reference System (VARS). In the latest study, Schlining and her colleagues combed through the database to locate video clips of rubbish on the seafloor. They identified over 1,500 observations of deep-sea debris, from sites near Vancouver Island to the Gulf of California to the Hawaiian Islands.

The researchers said they focused their attention on seafloor debris in and around Monterey Bay. They were able to note over 1,150 pieces of debris in these waters alone. In particular, they found that steep, rocky slopes were magnets for trash. The researchers speculated that ocean currents flowing past these rocky projections help to deliver detritus there. They also found that trash was fairly common in the deeper parts of the canyon, below 6,500 feet.

“I was surprised that we saw so much trash in deeper water,” Schlining said. “We don’t usually think of our daily activities as affecting life two miles deep in the ocean.”

“I’m sure that there’s a lot more debris in the canyon that we’re not seeing,” she added. “A lot of it gets buried by underwater landslides and sediment movement. Some of it may also be carried into deeper water, farther down the canyon.”

According to the report, about one third of the trash was made of plastic. Of these objects, over 50 percent were plastic bags. Metal objects were the second most common type of debris and about two thirds of these objects were aluminum, steel, or tin cans. Other debris included fishing equipment, glass bottles, and cloth items.

“The most frustrating thing for me is that most of the material we saw — glass, metal, paper, plastic — could be recycled,” Schlining noted.

The near-freezing water, lack of sunlight, and low oxygen concentrations where this deep-water trash is located create highly inhospitable conditions for the bacteria and other microorganisms that break down debris, meaning a plastic bag or soda can may persist for decades or longer.

“Ultimately, preventing the introduction of litter into the marine environment through increased public awareness remains the most efficient and cost-effective solution to this dilemma,” the scientists´ report concluded.

Last year, a Scripps study found that plastic trash is accumulating in the “Great Pacific Garbage Patch” at an alarming rate and damaging marine ecosystems as it grows.

Vitamin D Deficiency May Help Spread Of Hepatitis B Throughout Liver

Researchers from Germany have found that low levels of vitamin D are associated with high levels of hepatitis B virus (HBV) replication. Findings published online in Hepatology, a journal of the American Association for the Study of Liver Diseases, suggest seasonal fluctuations in vitamin D and HBV levels point to a link in these variables among patients with chronic HBV.

While highly effective vaccines are available, HBV still remains one of the most significant infectious diseases worldwide. In fact, the World Health Organization (WHO) states that HBV is 50 to 100 times more infectious than human immunodeficiency virus (HIV). Furthermore, WHO reports that two billion individuals have been infected with HBV, which is responsible for nearly 600,000 deaths each year. In the U.S., the Centers for Disease Control and Prevention (CDC) estimates that up to 1.4 million Americans are living with chronic HBV.

“Vitamin D helps maintain a healthy immune system and there is evidence of its role in inflammatory and metabolic liver disease, including infection with hepatitis C virus (HCV),” explains lead investigator Dr. Christian Lange from Johann Wolfgang Goethe University Hospital in Frankfurt. “However, the relationship between vitamin D metabolism and chronic HBV infection remains unknown and is the focus of our present study.”

Between January 2009 and December 2010, the team recruited 203 patients with chronic HBV who had not previously received treatment for their infection. Levels of 25-hydroxyvitamin D were measured from each participant. Patients co-infected with HCV, HIV, or hepatitis D; those with excessive alcohol use; and those with liver cancer or other malignancies were excluded.

Results show that 34% of participants had severe vitamin D deficiency (less than 10 ng/mL), 47% had vitamin D insufficiency (between 10 and 20 ng/mL) and 19% had normal levels of vitamin D (greater than 20 ng/mL). Further analyses indicate that the concentration of HBV in the blood, known as the viral load, was a strong predictor of low vitamin D levels: in patients with HBV DNA of less than 2,000 IU/mL versus 2,000 IU/mL or more, vitamin D levels were 17 and 11 ng/mL, respectively.

Researchers also determined that patients with the hepatitis B antigen (HBeAg) had lower levels of vitamin D than HBeAg negative participants. Inverse seasonal fluctuations between vitamin D and HBV levels were noted, which further suggests a relationship between the two variables.

“Our data confirm an association between low levels of vitamin D and high concentrations of HBV in the blood,” concludes Dr. Lange. “These findings differ from previous research of patients with chronic hepatitis C, which found no connection between vitamin D levels and concentration of HCV in the blood.” The authors propose further investigation of vitamin D as a therapeutic intervention for controlling HBV.

Poor Sleep More Harmful For Female Heart Patients

redOrbit Staff & Wire Reports – Your Universe Online

Women with coronary heart disease could be at risk for elevated levels of inflammation if they aren’t getting enough sleep, according to new research published online Wednesday in the Journal of Psychiatric Research.

The study was conducted by investigators at the University of California, San Francisco (UCSF) and found that poor sleep patterns — in particular, waking up too early — appear to play “a significant role” in raising inflammation levels among female coronary heart disease patients.

However, the phenomenon was observed only in women, not in male patients. The authors report that their research points to the existence of potentially important gender differences in heart patients, and that their findings suggest that inflammation could “serve as a key biological pathway through which poor sleep contributes to the progression of heart disease in women.”

“Inflammation is a well-known predictor of cardiovascular health,” said lead author Dr. Aric Prather, an assistant professor of psychiatry at UCSF. “Now we have evidence that poor sleep appears to play a bigger role than we had previously thought in driving long-term increases in inflammation levels and may contribute to the negative consequences often associated with poor sleep.”

Previous research has illustrated that sleeping fewer than six hours each night is a risk factor for coronary heart disease and other chronic medical conditions, and has also linked a lack of sleep with elevation in biomarkers of inflammation. The UCSF researchers set out to examine the link between self-reported sleep quality and changes in inflammation levels in older patients with stable coronary heart disease over a five-year span.

Prather and his colleagues started their work in 2000, and recruited nearly 700 participants — all of whom had coronary heart disease. Subjects were recruited from the university, the Veterans Affairs Medical Centers in San Francisco and Palo Alto, and nine public health clinics in the Community Health Network of San Francisco.

The average age of male patients was 66, while the average age of female patients was 64. Furthermore, the women tended to have a higher systolic blood pressure, were more likely to be using antidepressants, and were less likely to be using statins, beta-blockers, or other drugs that treat blood pressure or other heart-related conditions.

“Participants were asked when they first enrolled and five years later: ‘During the past month, how would you rate your overall sleep quality?’ Their choices were ‘very good,’ ‘fairly good,’ ‘good,’ ‘fairly bad,’ or ‘very bad.’ Biomarkers assessed in the study were Interleukin-6, C-reactive protein, and Fibrinogen,” the researchers said.

“The researchers found that poor sleep quality was significantly associated with five-year increases in the biomarkers in women but not men: Women who reported very poor or fairly poor sleep quality showed a percent increase in markers 2.5 times that of men who said they slept poorly,” they added. “The association remained statistically significant after adjustment for characteristics such as lifestyle, medication use and cardiac function.”

Many of the women who took part in the study were post-menopausal, leading the researchers to surmise that reduced estrogen levels could be a factor in the inflammatory activity associated with poor sleep. They added that it could be possible that testosterone could have helped ward off the effects of poor sleep quality in men.

“The researchers note that men comprised the majority of the study subjects, but point out that their findings may actually underestimate the real effects given the limited sample size,” UCSF explained. “They say that further investigation is needed to explain the gender-specific associations between poor sleep quality and markers of inflammation which could help clarify gender disparities in coronary heart disease.”

Brain Changes May Be Responsible For Clumsiness In Old Age

redOrbit Staff & Wire Reports – Your Universe Online

Often attributed to age-related decay in vision and physical prowess, incidents of clumsiness in seniors could actually be caused by changes in the brain, researchers from Washington University in St. Louis claim in a new study.

Incidents in which older men and women have difficulty reaching for and/or grasping things, such as an inability to dial a phone or knocking over a glass while attempting to grab a different object, could be the result of changes in the mental frame of reference that these individuals use to visualize nearby objects, the researchers explained.

“Reference frames help determine what in our environment we will pay attention to and they can affect how we interact with objects, such as controls for a car or dishes on a table,” said study co-author Dr. Richard Abrams, a professor of psychology in Arts & Sciences at the university. “Our study shows that in addition to physical and perceptual changes, difficulties in interaction may also be caused by changes in how older adults mentally represent the objects near them.”

When asked to perform a series of simple tasks involving hand movements, younger people participating in the study adopted an attentional reference frame centered on the hand making the movement. Older subjects, however, adopted a reference frame centered on the body rather than on the hand, the authors explained in research published in the journal Psychological Science.

Younger men and women have been known to use a reference frame that is “action-centered,” which means that it is sensitive to the movements they are making. For this reason, when these individuals move their hands to pick up an object, they maintain their awareness of objects potentially blocking their movement path. Conversely, seniors typically pay more attention to the objects closer to their body, even if they are not on the action path.

“We showed in our paper that older adults do not use an ‘action centered’ reference frame. Instead they use a ‘body centered’ one,” explained lead author Dr. Emily K. Bloesch, a former Washington University student who now serves as a postdoctoral teaching associate at Central Michigan University. “As a result, they might be less able to effectively adjust their reaching movements to avoid obstacles — and that’s why they might knock over the wine glass after reaching for the salt shaker.”

The study supports previous research which has documented age-related physical declines in multiple regions of the brain which are responsible for hand-eye coordination, the university said. Older adults demonstrate “volumetric declines” in both the parietal cortex and intraparietal sulcus, and they also lose white matter in the parietal lobe and precuneus. This decay could prevent them from using an action-centered reference frame.

Those areas, the authors noted, “are highly involved in visually guided hand actions like reaching and grasping and in creating attentional reference frames that are used to guide such actions.”

Furthermore, these specific neurological changes in older men and women “suggest that their representations of the space around them may be compromised relative to those of young adults and that, consequently, young and older adults might encode and attend to near-body space in fundamentally different ways,” said the study, which was supported by a grant from the National Institute on Aging.

Improving Ventilation In Schools Could Reduce Illness-Related Absences

redOrbit staff & Wire Reports — Your Universe Online

Making sure that students get enough fresh air in their classrooms could keep kids from missing school due to illness, researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) claim in a new study.

Writing in the journal Indoor Air, lead author Mark Mendell and his colleagues report that their analysis of over 150 California classrooms revealed that bringing classroom ventilation rates up to state standards could reduce illness-related absences by about 3.4 percent.

Mendell and his colleagues collected data from 162 third- through fifth-grade classrooms in 28 schools across three different California school districts. They found that over half of those classrooms failed to meet state ventilation standards.

It would take an investment of approximately $4 million each year to make the necessary upgrades to reach those standards, but by doing so, they say that California’s school districts would gain an additional $33 million in attendance-related funding — and families could avoid upwards of $80 million in caregiver costs due to sick children missing school.

“Our overall findings suggest that, if you increased ventilation rates of classrooms up to the state standard, or even above it, you would get net benefits to schools, to families, to everybody, at very low cost,” Mendell, an epidemiologist at the Berkeley Lab’s Indoor Environment Department, said in a statement. “It’s really a win-win situation.”

Under California state building codes, schools are required to provide a ventilation rate of 7.1 liters per second per person, or about 15 cubic feet per minute per person. The ventilation rate, the researchers explain, is a flow rate that measures how much outside air is brought indoors via open windows, ventilation fans, or other means.

Mendell´s team found that ventilation rates “varied widely” within schools and districts, as well as from one district to another. Portable classrooms tended to receive less ventilation, and schools located in the Central Valley of California — a region susceptible to extreme weather conditions in both the summer and winter — tended to rely upon heaters and air-conditioners and were below state ventilation standards 95 percent of the time.

The school districts provided information on daily absences due to illnesses in each classroom, and the researchers then calculated ventilation rates based on carbon dioxide levels as measured indoors and estimated outdoors. Indoor CO2 levels were based on environmental sensor readings from each classroom — data that was transmitted to the research team via the Internet.
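
The article does not spell out that calculation, but a common approach is a steady-state CO2 mass balance, in which the per-person outdoor-air flow rate equals the per-person CO2 generation rate divided by the indoor-outdoor CO2 difference. The short Python sketch below illustrates the idea; the generation rate is an assumed value for seated schoolchildren, not a figure from the study.

    def ventilation_rate_lps_per_person(co2_indoor_ppm, co2_outdoor_ppm,
                                         co2_generation_lps=0.0043):
        """Steady-state CO2 mass balance: Q = G / (C_indoor - C_outdoor).

        co2_generation_lps is an assumed per-person CO2 generation rate in
        liters per second (a value often cited for seated schoolchildren),
        used here purely for illustration.
        """
        delta_fraction = (co2_indoor_ppm - co2_outdoor_ppm) * 1e-6  # ppm -> volume fraction
        return co2_generation_lps / delta_fraction

    # Example: 1,400 ppm measured indoors against an estimated 400 ppm outdoors
    print(round(ventilation_rate_lps_per_person(1400, 400), 1))  # about 4.3 L/s per person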

“The overall finding from all three districts was that for every additional 1 liter/second/person of ventilation provided to a classroom, illness absence declined by 1.6 percent, with the benefit continuing at least up to 15 liters/second/person, more than double the state standard,” officials with the Berkeley Lab said.

That data was extrapolated based on two assumptions: first, that the findings represented true causal relationships; and second, that all California K-12 classrooms had the 4 liters/second/person average ventilation rate, which had been estimated from carbon dioxide data obtained as part of a previous state-wide survey.

The authors calculated that upgrading all schools to the state standard of 7.1 liters/second/person would not only reduce illness-related absences by 3.4 percent but also increase overall state funding by $33 million under a program that provides schools with approximately $30 per day in funding for every student in attendance on that day.
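
As a rough illustration of how that funding figure scales, the arithmetic amounts to multiplying the recovered attendance days by the per-day funding rate. In the sketch below, the $30-per-day rate and the 3.4 percent reduction come from the article, while the enrollment and baseline illness-absence inputs are placeholders chosen only so the output lands near the reported $33 million.

    # Sketch of the attendance-funding arithmetic. The $30/day rate and the
    # 3.4 percent reduction come from the article; the enrollment and the
    # baseline illness-absence figure are hypothetical placeholders.
    FUNDING_PER_ATTENDED_DAY = 30.0         # dollars per student per day (article)
    ABSENCE_REDUCTION = 0.034               # 3.4 percent fewer illness absences (article)

    students = 6_200_000                    # hypothetical statewide K-12 enrollment
    illness_absence_days_per_student = 5.2  # hypothetical days missed to illness per year

    days_recovered = students * illness_absence_days_per_student * ABSENCE_REDUCTION
    print(f"Attendance days recovered per year: {days_recovered:,.0f}")
    print(f"Attendance funding recovered: ${days_recovered * FUNDING_PER_ATTENDED_DAY:,.0f}")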

“Further increases in ventilation rates, at least up to 15 liters/second/person, would result in additional benefits … such as reduced costs related to sick leave for teachers and staff and reduced health care costs for students,” the laboratory said. It also noted that, in most parts of California, replacing ventilation equipment would probably not be necessary to increase the ventilation rates.

Modern Parents Not Concerned About Their Child’s TV Time

Lee Rannals for redOrbit.com — Your Universe Online

According to a new study, modern parents are unlikely to worry about how much screen time their child is receiving.

Northwestern University researchers said they have found that the majority of American parents today are unconcerned about their child’s media use. They also have found that parents have adopted different parenting styles related to media.

The team surveyed more than 2,300 parents of children up to 8 years old. Ellen Wartella, director of Northwestern’s Center on Media and Human Development and lead author of the report, said the study “reveals a generational shift in parental attitudes about technology’s role in young children’s lives.”

Researchers challenged two key assumptions about media and parenting with their study. First, they challenged the idea that smartphones and tablets have become today’s “go-to” parenting tools. Secondly, the team looked into the idea that the dominant pattern in most families is children driving the desire for screen time while parents pull on the reins.

“Today’s parents grew up with technology as a central part of their lives, so they think about it differently than earlier generations of parents,” says Wartella, the Hamad Bin Khalifa Al-Thani Professor in Communication in Northwestern’s School of Communication. “Instead of a battle with kids on one side and parents on the other, the use of media and technology has become a family affair.”

The team identified three different types of media environments parents create: “media-centric,” “media-moderate” and “media-light.” Children growing up in the media-centric families spend three hours or more a day with a television, computer, video game, smartphone or tablet. According to the study, 39 percent of families fall into this category.

Eight in ten of the parents in the media-centric families say they are “very” or “somewhat” likely to use TV to keep their child occupied when they need to get something done around the home. Forty-eight percent of these families leave the TV on at home most of the time, while nearly half have a TV in their child’s bedroom.

The team said 45 percent of the families in the study fall into the media-moderate category, in which children spend an average of just under three hours a day using screen media. Families in this group are more likely to enjoy doing things outside.

According to the study, 16 percent of families fall into the media-light group, in which children spend just 90 minutes with screen media each day. The parents in this group average less than two hours a day with media, compared with parents in the media-moderate group, who average five hours a day.

Fifty-nine percent of parents in the study said they are not worried about their children becoming addicted to new media, while 55 percent say they are “not too” or “not at all” concerned about their children’s media use. The majority of the parents, 70 percent, said smartphones and tablets do not make parenting any easier.

NGC 6334: Shedding Light On Starburst Regions

John P. Millis, Ph.D. for redOrbit.com — Your Universe Online
Stars are formed in massive clouds of gas that are being compressed by some nearby event. Over time, the region will be consumed by the forming stars, leaving a cluster that will eventually drift apart. But what starts this process in the first place?
Often these regions of star formation — known as starburst regions — are found in distant galaxies where the activity level is very high. But this presents a problem: because of the great distance, identifying smaller stars is difficult. And these low-mass stars are an important piece of the puzzle.
To get a better idea about the true stellar population of the starburst regions, and to perhaps even gain insight into what types of events trigger the rapid star formation, researchers look to regions of the Milky Way galaxy that might be undergoing a similar process.
A team from Iowa State University and the National Optical Astronomy Observatory (NOAO) has completed the most detailed study yet of the nearby starburst region NGC 6334 (the Cat’s Paw Nebula) — a gas-rich region of our galaxy about 5,500 light-years away in the constellation Scorpius. Filled with hot, massive stars, the nebula has previously been studied at great length. However, prior work focused on these heavier stars and had not revealed the same level of detail about the smaller stars that may exist in the region.
Team leader and Iowa State graduate student Sarah Willis used the NOAO Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory in Chile, along with the Spitzer Space Telescope, to image the lower-mass stars in the cloud.
Willis and her collaborators were able to catalog the stellar population down to stars about the size of our Sun. Then, using stellar census data, they estimated the number of even lower-mass stars in the system, essentially drawing on known ratios of how many low-mass stars form for every Sun-like star.
The result is that about 3,600 solar masses worth of gas is being converted to stars every million years. This is higher than expected, and if taken to be a good model for starburst regions, could indicate that distant starburst regions are even more active than previously thought.
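
That extrapolation step can be sketched in a few lines of code. The article does not say which stellar mass distribution the team assumed, so the Salpeter-style power law and mass limits below are textbook illustration values rather than the study’s actual inputs.

    # Sketch of extrapolating a stellar census with an assumed initial mass
    # function (IMF). The Salpeter slope and the 0.1-100 solar-mass range are
    # standard illustration values, not the study's actual assumptions.
    ALPHA = 2.35                 # Salpeter slope: dN/dM is proportional to M**(-ALPHA)
    M_MIN, M_MAX = 0.1, 100.0    # assumed stellar mass range, in solar masses

    def power_integral(exponent, lo, hi):
        """Integral of M**exponent dM from lo to hi (exponent != -1)."""
        p = exponent + 1.0
        return (hi ** p - lo ** p) / p

    def total_mass_from_count(n_detected, m_limit=1.0):
        """Total stellar mass implied by n_detected stars above m_limit solar masses."""
        norm = n_detected / power_integral(-ALPHA, m_limit, M_MAX)   # IMF normalization
        return norm * power_integral(1.0 - ALPHA, M_MIN, M_MAX)      # integrate M * dN/dM

    # Example: 1,000 stars catalogued down to roughly one solar mass
    print(round(total_mass_from_count(1000)))  # ~7,900 solar masses including unseen low-mass stars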

Low-Dose Anesthetic Shows Promise As Depression Treatment

redOrbit Staff & Wire Reports – Your Universe Online

New research funded by the Mayo Clinic suggests that low doses of a general anesthetic typically used during minor surgeries could be an effective treatment for depression.

According to the researchers, medical experts discovered approximately 10 years ago that ketamine had the potential to alleviate severe depression, but since the anesthetic can also have serious psychiatric side effects, they have been searching for a safe way to take advantage of its beneficial properties.

Now, writing in the latest edition of the Journal of Psychopharmacology, the Mayo Clinic team reports that low-dose intravenous infusions of ketamine administered over a sustained period of time could safely be used as a treatment for depression.

“It’s surprising both that it works and how rapidly it has effects,” Mayo Clinic psychiatrist and study co-author Dr. Timothy Lineberry said in a statement. “It sometimes can work in hours to reduce depressive symptoms and suicidal ideation. Our goal is to begin to determine how the drug can be administered safely in routine treatment.”

Lineberry and his colleagues analyzed 10 patients experiencing a severe depressive episode resulting from either major depressive disorder (MDD) or a type of bipolar disorder. The patients participating in the study had failed to find relief from a minimum of two antidepressant medications.

During the trial, the patients were treated twice a week (up to a total of four treatments) with low-dose (0.5 mg/kg total dose) ketamine infusions given over a 100-minute period until their depression lifted.
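
For readers unfamiliar with weight-based dosing, the arithmetic behind that protocol is simple; the 70-kilogram body weight in the short sketch below is a hypothetical example rather than a patient from the study.

    # Illustrative arithmetic for a 0.5 mg/kg infusion delivered over 100 minutes.
    # The 70 kg body weight is a hypothetical example, not a study patient.
    weight_kg = 70
    total_dose_mg = 0.5 * weight_kg                  # 35 mg in total
    rate_mg_per_min = total_dose_mg / 100            # spread over the 100-minute infusion
    print(total_dose_mg, round(rate_mg_per_min, 2))  # 35.0 mg total, 0.35 mg per minute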

The researchers said that their work demonstrates that when the anesthetic is administered at the lower rate, it is just as effective as when the infusions are given at higher rates. Furthermore, they monitored side effects using two different psychiatric scales — the Young Mania Rating Scale and the Brief Psychiatric Rating Scale.

“Eight of 10 patients showed at least 50 percent improvement. Five patients experienced complete remission of their depression, and four weeks after the study, two were still depression free,” the Mayo Clinic reported.

One patient experienced “brief and limited hallucinations,” they noted. The other study participants experienced no major side effects — only some drowsiness or dizziness — while receiving the infusions. The only question that remains, the researchers said, is determining which patients will respond best to the therapy.

“While patients and clinicians are excited about ketamine’s potential, we know that much more research lies ahead before we know which depressive conditions can be addressed with ketamine safely by clinicians in routine clinical practice,” said Dr. Lineberry.

Common Muscle Control Patterns Governing The Motion Of Swimming Animals Found With New Model

Georgia Institute of Technology

What do swimmers like trout, eels and sandfish lizards have in common? According to a new study, the similar timing patterns that these animals use to contract their muscles and produce undulatory swimming motions can be explained using a simple model. Scientists have now applied the new model to understand the connection between the electrical signals and body movement in the sandfish.

Most swimming creatures rely on an undulating pattern of body movement to propel themselves through fluids. Though differences in body flexibility may lead to different swimming styles, scientists have found “neuromechanical phase lags” in nearly all swimmers. These lags are characterized by a wave of muscle activation that travels faster down the body than the wave of body curvature.

A study of the sandfish lizard — which “swims” through sand — led to development of the new model, which researchers believe could also be used to study other swimming animals. Beyond assisting the study of locomotion in a wide range of animals, the findings could also help researchers design efficient swimming robots.

“A graduate student in our group, Yang Ding, who is now at the University of Southern California, was able to develop a theory that could explain the kinematics of how this animal swims as well as the timing of the nervous system control signals,” said Daniel Goldman, an associate professor in the School of Physics at the Georgia Institute of Technology. “For animals swimming in fluids using an undulating movement, there are basic physical constraints on how they must activate their muscles. We think we have uncovered an important mechanism that governs this kind of swimming.”

The research was reported June 3 in the early edition of the journal Proceedings of the National Academy of Sciences. It was sponsored by the National Science Foundation’s Physics of Living Systems program, the Micro Autonomous Systems and Technology (MAST) program of the Army Research Office, and the Burroughs Wellcome Fund.

Undulatory locomotion is a gait in which thrust is produced in the opposite direction from a traveling wave of body bending. Because it is so commonly used by animals, this mode of locomotion has been widely used for studying the neuromechanical principles of movement.

Sarah Sharpe, the paper’s second author and a graduate student in Georgia Tech’s Interdisciplinary Bioengineering Program, led laboratory experiments studying undulatory swimming in sandfish lizards. She used X-ray imaging to visualize how the animals swam through sand that was composed of tiny glass spheres.

At the same time their swimming movements were being tracked, a set of four hair-thin electrodes implanted in the lizards’ bodies was providing information on when their muscles were activated. The two information sources allowed the researchers to compare the electrical muscle activity to the lizards’ body motion.

“The lizards propagate a wave of muscle activations, contracting the muscles close to their heads first, then the muscles at the midpoint of their body, then their tail,” said Sharpe. “They send a wave of muscle contraction down their bodies, which creates a wave of curvature that allows them to swim. This wave of activation travels faster than the wave of curvature down the body, resulting in different timing relationships, known as phase differences, between muscle contraction and bending along the body.”

Sand acts like a frictional fluid as the sandfish swims through it. However, a sandfish swimming through sand is simpler to model than a fish swimming through water because the sand lacks the vortices and other complex behavior of water — and the friction of the sand eliminates inertia.

“Theoretically, it is difficult to calculate all of the forces acting on a fish or an eel swimming in a real fluid,” said Goldman. “But for a sandfish, you can calculate pretty much everything.”

The relative simplicity of the system allowed the research team — which also included Georgia Tech professor Kurt Wiesenfeld — to develop a simple model showing how the muscle activation relates to motion. The model showed that combining synchronized torques from distant points in the lizards’ bodies with local traveling torques is what creates the neuromechanical phase lag.

“This is one of the simplest, if not the simplest, models of swimming that reproduces the neuromechanical phase lag phenomenon,” Sharpe said. “All we really had to pay attention to was the external forces acting on an animal’s body. We realized that this timing relationship would emerge for any undulatory animal with distributed forces along its body. Understanding this concept can be used as the foundation to begin understanding timing patterns in all other swimmers.”

The sandfish swims using a simple single-period sinusoidal wave with constant amplitude. A key finding that facilitated the model’s development was that the sandfish’s body is extremely flexible, allowing internal forces — body stiffness — to be ignored.
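
The phase-lag pattern itself is easy to illustrate with two traveling sine waves, one for muscle activation and a slower one for body curvature; the lead of activation over curvature then grows from head to tail. The sketch below is a generic illustration of that description, not the team’s model, and its wave speeds are made-up numbers.

    # Two traveling waves along a body of unit length: a curvature wave and a
    # muscle-activation wave assumed to travel 30 percent faster. The growing
    # lead of activation over curvature toward the tail is the
    # "neuromechanical phase lag" pattern described above. The speeds here
    # are illustration values, not measurements.
    import math

    FREQ = 1.0        # undulation frequency, cycles per unit time
    V_CURVE = 1.0     # speed of the curvature wave along the body
    V_ACT = 1.3       # assumed faster speed of the activation wave

    def phase(x, t, speed):
        """Phase (radians) of a wave y = sin(2*pi*FREQ*(t - x/speed)) at position x."""
        return 2 * math.pi * FREQ * (t - x / speed)

    for x in (0.0, 0.25, 0.5, 0.75, 1.0):    # positions from head (0) to tail (1)
        lead = phase(x, 0.0, V_ACT) - phase(x, 0.0, V_CURVE)
        print(f"x = {x:.2f}: activation leads curvature by {lead:.2f} rad")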

“This animal turns out to be like a little limp noodle,” said Goldman. “Having that result in the theory makes everything else pop out.”

The model shows that the waveform used by the sandfish should allow it to swim the farthest with the least expenditure of energy. Swimming robots adopting the same waveform should therefore be able to maximize their range.

Goldman and his colleagues have been studying the sandfish, a native of the northern African desert, for more than six years.

“Sandfish are among the champions of all sand diggers, swimmers and burrowers,” said Goldman. “This lizard has provided us with an interesting entry point into swimming because its environment is surprisingly simple and behavior is simple. It turns out that this little sand-dweller may be able to tell us things about swimming more generally.”

Study: UV Light Doubles Shelf Life Of Strawberries

redOrbit Staff & Wire Reports — Your Universe Online

Just in time for the summer berry season, scientists at the US Department of Agriculture (USDA) have demonstrated a way to delay spoilage of fresh strawberries for up to several days.

The researchers showed that low irradiance ultra-violet (UV) light directed at strawberries over long exposure periods at low temperature and very high humidity — typical home refrigerator conditions — doubles the shelf life to nine days.

The team used a novel device incorporating light-emitting diodes (LEDs) that emit UV at wavelengths found in sunlight transmitted through Earth’s atmosphere. The findings are important because previous attempts to use traditional UV light sources for produce storage resulted in severe drying, and it was unknown whether long exposures to low-level UV light would be effective against rot.

LEDs are now commonplace thanks to their long life, energy efficiency and ability to span the wavelength range from near UV to infrared. However, the full UV spectrum has presented challenges for LED manufacturers — until recently.

Sensor Electronic Technology, Inc. (SETi) in Columbia, S.C., which collaborated with the USDA on the work, developed a special technology to fabricate UV LEDs across the entire UV spectrum from UVA to UVC. This flexibility allowed them to tune the emitted light to the wavelengths most effective for this application.

“UV-LEDs presented the opportunity to try low power devices that work well in the cold and can be engineered to work in small spaces such as refrigerator compartments,” said lead USDA researcher Steven Britz, who will present the findings next Thursday at the Conference on Lasers and Electro-Optics (CLEO: 2013).

Using strawberries purchased from a local supermarket, Britz’s team placed one batch in a dark refrigerator and one batch in a refrigerator exposed to UV-LEDs. The results showed the UV-treated berries had their shelf life extended twofold — nine days mold-free — over darkened berries, as judged by weight, moisture content, concentration of select phytochemicals, visible damage, and mold growth.

Based on these promising results, the researchers are working to commercialize the technology for use in home refrigerators.

“These findings are expected to have a major impact on the appliance business to extend the shelf life and preserve nutritional value of fresh produce while reducing waste and saving money for every household,” said Remis Gaska, president and CEO of SETi.

OCD Humans And CCD Dogs Share Common Brain Abnormalities, Study Says

Brett Smith for redOrbit.com – Your Universe Online
Some dogs’ bad habits could be a sign of canine compulsive disorder (CCD), and a new study from Tufts University and Harvard Medical School shows that structural brain abnormalities in Doberman pinschers with the disorder are similar to those seen in humans with obsessive compulsive disorder (OCD). The results of the study suggest a new approach to understanding both OCD and the behaviors of man’s best friend.
“While the study sample was small and further research is needed, the results further validate that dogs with CCD can provide insight and understanding into anxiety disorders that affect people,” said Nicholas Dodman, from the Cummings School of Veterinary Medicine at Tufts University and a co-author of the study that appeared in the journal Progress in Neuro-Psychopharmacology & Biological Psychiatry.
“Dogs exhibit the same behavioral characteristics, respond to the same medication, have a genetic basis to the disorder, and we now know have the same structural brain abnormalities as people with OCD,” Dodman explained.
OCD is thought to affect about 2 percent of the population, but the disorder is not well-understood and commonly goes untreated or undiagnosed for decades. The disorder is marked by repetitive behaviors or persistent thoughts that are time consuming and result in a disruption of daily routines. CCD is marked by repetitious and destructive behaviors such as tail chasing and destructive chewing.
The research team examined a sample set of 16 Dobermans, comparing MRI brain images of eight Dobermans with CCD to those of the control group. The team discovered that the CCD group had brain biomarkers consistent with those reported in humans with OCD.
“It has been very gratifying to me to use our imaging techniques developed to diagnose human brain disorders to better understand the biological basis for anxiety/compulsive disorders in dogs, which may lead to better treatments for dogs and humans with these disorders,” said co-author Marc J. Kaufman, associate professor of psychiatry at Harvard Medical School and director of the McLean Hospital Translational Imaging Laboratory.
“Canines that misbehave are often labeled as ‘bad dogs’ but it is important to detect and show the biological basis for certain behaviors,” said co-author Niwako Ogata, an assistant professor of animal behavior at Purdue University College of Veterinary Medicine.
“Evidence-based science is a much better approach to understanding a dog’s behavior.”
The study’s findings build on previous research on compulsive disorders in animals such as CCD, which affects a variety of breeds. In 2010, researchers identified a genetic marker in dogs that coincides with an increased risk of OCD.
Another recent study found a type of brain surgery that appears to be an effective treatment for people suffering from severe obsessive compulsive disorder (OCD) who have not responded to other treatments.
While the surgery shows promise, the study also reported that 2 of the 19 patients experienced permanent complications from the procedure, including paralysis on one side of the body and cognitive injury.

Cooler Temperatures Make Mosquitoes More Dangerous

Lee Rannals for redOrbit.com — Your Universe Online

Researchers wrote in the journal PLoS Neglected Tropical Diseases that weather could be influencing the transmission of West Nile virus and other mosquito-borne diseases.

The scientists said mosquitoes living in cooler temperatures have weaker immune systems, making them more susceptible to dangerous viruses and more likely to transmit them to people. They said this connection is significant in light of global climate change.

“Our data offers a plausible hypothesis for how changes in weather influence the transmission of these diseases and will likely continue to do so in the future,” said Kevin Myles, associate professor of entomology in the College of Agriculture and Life Sciences (CALS) at Virginia Tech.

The team’s work suggests that it would be unwise to focus solely on warmer temperatures when considering links between climate change and disease transmission.

“Mosquitoes like to breed and lay their eggs in dark, cool places because that means the water will last longer,” said Zach Adelman, associate professor of entomology at CALS. “They don’t lay their eggs in sunny spots because that will dry the water out in a day or two. Although this has been known for some time, we are just learning about its potential effects on the mosquito immune response. Hopefully, this information can be used to build better models that more correctly predict when we’ll have disease transmission.”

Current computer modeling of outbreaks considers meteorological variables and human population indexes but has failed to consider the effect of temperature on mosquito immunity. The team found that the mosquito’s RNA interference pathway is impaired when reared at cooler temperatures.

The rate of transmission of both West Nile virus and chikungunya virus has increased, with outbreaks occurring in unexpected places, such as the introduction of West Nile virus to New York in 1999 and of chikungunya virus to Italy and France in 2007 and 2010.

Last April, the California State Legislature declared the week of April 21 through 27, 2013, as West Nile Virus and Mosquito and Vector Control Awareness Week in California. West Nile virus is a disease transmitted by mosquitoes that can result in debilitating cases of meningitis and encephalitis, and can lead to death in humans, horses, some bird species and other wildlife.

According to the California Department of Public Health (CDPH), 479 human cases of West Nile virus were reported in 2012, and of those, 19 fatalities were reported.

The Science Behind How Meditation Reduces Anxiety

Lee Rannals for redOrbit.com — Your Universe Online
Scientists at Wake Forest Baptist Medical Center have identified the brain functions involved in how meditation reduces anxiety.
The team wrote in the journal Social Cognitive and Affective Neuroscience about how they studied 15 healthy volunteers with normal levels of everyday anxiety. They said these individuals had no previous meditation experience or anxiety disorders.
The participants took four 20-minute classes to learn a technique known as mindfulness meditation. In this form of meditation, people are taught to focus on breath and body sensations and to non-judgmentally evaluate distracting thoughts and emotions.
“Although we’ve known that meditation can reduce anxiety, we hadn’t identified the specific brain mechanisms involved in relieving anxiety in healthy individuals,” said Fadel Zeidan, Ph.D., a postdoctoral research fellow in neurobiology and anatomy at Wake Forest Baptist and lead author of the study. “In this study, we were able to see which areas of the brain were activated and which were deactivated during meditation-related anxiety relief.”
The researchers found that meditation reduced anxiety ratings by as much as 39 percent in the participants.
“This showed that just a few minutes of mindfulness meditation can help reduce normal everyday anxiety,” Zeidan said.
Zeidan and colleagues were also able to show that meditation-related anxiety relief is associated with activation of the anterior cingulate cortex and ventromedial prefrontal cortex, which are areas of the brain involved with executive-level function.
“Mindfulness is premised on sustaining attention in the present moment and controlling the way we react to daily thoughts and feelings,” Zeidan said. “Interestingly, the present findings reveal that the brain regions associated with meditation-related anxiety relief are remarkably consistent with the principles of being mindful.”
He said the results of this neuroimaging experiment complement the existing body of knowledge by showing the brain mechanisms associated with meditation-related anxiety relief in healthy people.
Scientists wrote in the journal Frontiers in Human Neuroscience in November 2012 about how meditation has lasting emotional benefits. They found that participating in an eight-week meditation training program could have measurable effects on how the brain functions, even when someone is not actively meditating. The team used two forms of meditation training and saw some differences in the response of the amygdala, which is the part of the brain known to be important for emotion.

Detecting Disease Using Smartphone Accessory

The Optical Society

New plug-in optical sensor could be used for in-the-field diagnosis of Kaposi’s sarcoma, a cancer linked to AIDS

As antiretroviral drugs that treat HIV have become more commonplace, the incidence of Kaposi’s sarcoma, a type of cancer linked to AIDS, has decreased in the United States. The disease, however, remains prevalent in sub-Saharan Africa, where poor access to medical care and lab tests only compound the problem. Now, Cornell University engineers have created a new smartphone-based system, consisting of a plug-in optical accessory and disposable microfluidic chips, for in-the-field detection of the herpes virus that causes Kaposi’s. “The accessory provides an ultraportable way to determine whether or not viral DNA is present in a sample,” says mechanical engineer David Erickson, who developed the technique along with his graduate student, biomedical engineer Matthew Mancuso. The technique could also be adapted for use in detecting a range of other conditions, from E. coli infections to hepatitis. Mancuso will describe the work at the Conference on Lasers and Electro Optics (CLEO: 2013), taking place June 9-14 in San Jose, Calif.

Unlike other methods that use smartphones for diagnostic testing, this new system is chemically based and does not use the phone’s built-in camera. Instead, gold nanoparticles are combined (or “conjugated”) with short DNA snippets that bind to Kaposi’s DNA sequences, and a solution with the combined particles is added to a microfluidic chip. In the presence of viral DNA, the particles clump together, which affects the transmission of light through the solution. This causes a color change that can be measured with an optical sensor connected to a smartphone via a micro-USB port. When little or no Kaposi’s virus DNA is present, the nanoparticle solution is a bright red; at higher concentrations, the solution turns a duller purple, providing a quick method to quantify the amount of Kaposi’s DNA.
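
The underlying readout can be reasoned about simply: aggregation shifts the gold nanoparticles’ absorption from the red toward longer wavelengths, so comparing light transmitted at two wavelengths tracks how much target DNA is present. The sketch below is a generic illustration of that kind of two-wavelength readout, not Cornell’s actual signal processing; the wavelengths and decision threshold are assumptions.

    # Generic two-wavelength readout for a gold nanoparticle aggregation assay.
    # The ~520 nm / ~650 nm bands and the decision threshold are illustrative
    # assumptions, not values from the Cornell system.
    import math

    def absorbance(transmitted, incident=1.0):
        """Absorbance A = -log10(I / I0)."""
        return -math.log10(transmitted / incident)

    def aggregation_index(i_520, i_650):
        """Ratio of absorbance in the aggregate band (~650 nm) to the
        dispersed-particle band (~520 nm); it rises as particles clump together."""
        return absorbance(i_650) / absorbance(i_520)

    def call_sample(i_520, i_650, threshold=0.6):
        idx = aggregation_index(i_520, i_650)
        verdict = "target DNA likely present" if idx > threshold else "little or no target DNA"
        return verdict, round(idx, 2)

    print(call_sample(i_520=0.40, i_650=0.80))  # dispersed particles, bright red solution
    print(call_sample(i_520=0.55, i_650=0.50))  # aggregated particles, duller purple solution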

The main advantage of the system compared to previous Kaposi’s detection methods is that users can diagnose the condition with little training. “Expert knowledge is required for almost every other means of detecting Kaposi’s sarcoma,” Mancuso says. “This system doesn’t require that level of expertise.”

Erickson and Mancuso are now collaborating with experts on Kaposi’s at New York City’s Weill Cornell Medical College to create a portable system for collecting, testing, and diagnosing samples that could be available for use in the developing world by next year. The team’s start-up company, vitaMe Technologies, is commercializing similar smartphone diagnostic technologies for domestic use.

Detecting Kaposi’s sarcoma is not the only goal, Mancuso says. “Nanoparticle assays similar to the one used in our work can target DNA from many different diseases,” such as methicillin-resistant Staphylococcus aureus (MRSA), a bacterium responsible for several difficult-to-treat infections in humans, and syphilis. The smartphone reader could also work with other color-changing reactions, such as the popular enzyme-linked immunosorbent assays (ELISA), a common tool in medicine to test for HIV, hepatitis, food allergens, and E. coli. The lab also has created smartphone accessories for use with the color-changing strips in pH and urine assays. “These accessories could form the basis of a simple, at-home, personal biofluid health monitor,” Mancuso says.

CLEO: 2013 presentation AM3M.2. “Smartphone Based Optical Detection of Kaposi’s Sarcoma Associated Herpesvirus DNA” by David Erickson is at 2 p.m. on Monday, June 10 at the Marriott San Jose.

Inkjet-printed Hybrid Quantum Dot LEDs Bring Cheaper, ‘Greener’ Lighting To Market

The Optical Society

It’s not easy going green. For home lighting applications, organic light emitting diodes (OLEDs) hold the promise of being both environmentally friendly and versatile. Though not as efficient as regular light-emitting diodes (LEDs), they offer a wider range of material choices and are more energy efficient than traditional lights. OLEDs can also be applied to flexible surfaces, which may lead to lights or television displays that can be rolled up and stowed in a pocket.

A promising line of research involves combining the OLEDs with inorganic quantum dots, tiny semiconductor crystals that emit different colors of light depending on their size. These “hybrid” OLEDs, also called quantum dot LEDs (QD-LEDs), increase the efficiency of the light-emitting devices and also increase the range of colors that can be produced. But commercially manufacturing this promising green technology is still difficult and costly.

To make OLEDs more cheaply and easily, researchers from the University of Louisville in Kentucky are developing new materials and production methods using modified quantum dots and inkjet printing. The team will discuss its work developing more commercially feasible QD-LED devices at the Conference on Lasers and Electro-Optics (CLEO: 2013) June 9-14 in San Jose, Calif.

According to Delaina Amos, a professor at the University of Louisville and principal investigator of the team’s efforts, the expense of materials and manufacturing processes has been a major barrier to using OLEDs in everyday lighting devices.

To inexpensively apply the quantum dots to their hybrid devices, the Louisville researchers use inkjet printing, popular in recent years as a way to spray quantum dots and OLED materials onto a surface with great precision. But unlike other groups experimenting with this method, Amos’ team has focused on adapting the inkjet printing technique for use in a commercial setting, in which mass production minimizes expense and translates to affordable off-the-shelf products. “We are currently working at small scale, typically 1 inch by 1 inch for the OLEDs,” Amos says. “The process can be scaled up from here, probably to 6 inches by 6 inches and larger.”

“There’s a reason you don’t see OLED lights on sale at the hardware store,” says Amos, though she adds that they do find uses in small devices such as cameras, photo frames, and cell phone displays. To bring their QD-LEDs closer to becoming market-ready as household lighting appliances, Amos and her team have been synthesizing new, less expensive and more environmentally friendly quantum dots. The team has also modified the interfaces between the quantum dots and other layers of the OLED to improve the efficiency with which electrons are transferred, allowing them to produce more efficient light in the visible spectrum.

In addition to their higher efficiency, wider range of colors, and ability to be applied to flexible surfaces, Amos’ QD-LEDs also use low-toxicity materials, making them potentially better for the environment. “Ultimately we want to have low cost, low toxicity, and the ability to make flexible devices,” Amos says. The team has recently demonstrated small working devices, and Amos adds that she hopes to have larger devices within the next several months.

CLEO: 2013 presentation CF1M.3. “Printed Hybrid Quantum Dot Light-Emitting Diodes For Lighting Applications” by Delaina A. Amos is at 9:15 a.m. on Friday, June 14 in the San Jose Convention Center.

Our Human Ancestors Had A Diet Rich In Grass

Lawrence LeBlond for redOrbit.com – Your Universe Online

Four new studies have taken a fresh look at the diets of our ancestors and have found that a change in eating behavior some 3.5 million years ago was a “game changer” for early humans. A shift away from an ape-like diet toward one that included grasses and sedges paved the way for a diet rich in grains, meats and dairy from grazing animals.

In the first of the four studies, researchers from the University of Colorado Boulder conducted high-tech tests on the tooth enamel of ancient remains. The tests indicate that four million years ago Africa’s hominids were eating like chimpanzees, consuming primarily fruits and some leaves, according to CU anthropology Professor Matt Sponheimer, the study’s lead author. Despite the fact that grasses were available, the hominids largely ignored them as a food source for some time.

“We don’t know exactly what happened,” said Sponheimer. “But we do know that after about 3.5 million years ago, some of these hominids started to eat things that they did not eat before, and it is quite possible that these changes in diet were an important step in becoming human.”

Sponheimer’s paper has been published online this week in the Proceedings of the National Academy of Sciences (PNAS), along with the three other related papers.

Prior to this ground-breaking research, scientists analyzed teeth from 87 hominids. The new paper from CU presents information on an additional 88 specimens, including five previously unanalyzed hominid species.

CARBON SIGNALS

Sponheimer, who specializes in stable isotope analysis, compared particular forms of the same chemical element in fossilized teeth. Carbon isotopes obtained from the ancient hominids can help researchers piece together the types of plants that were being eaten way back when, he noted.

The carbon signals from the ancient teeth are derived from two distinct photosynthetic pathways, he added. C3 signals are from plants like trees and bushes, and C4 signals are from grasses and sedges. The wear of the teeth also provided more information on the type of foods these hominid specimens were eating.
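
The diet percentages quoted later in the article are typically derived from a two-endmember mixing calculation: a tooth’s carbon isotope value is treated as a weighted average of a pure-C3 value and a pure-C4 value. A minimal sketch of that calculation appears below; the endmember values are commonly cited approximations for tooth enamel, not necessarily the calibration the authors used.

    # Two-endmember mixing: estimate the C4 (grass and sedge) fraction of the
    # diet from an enamel carbon isotope value. The endmember values are
    # commonly cited approximations, used here for illustration; the published
    # studies' own calibrations may differ.
    D13C_PURE_C3 = -12.0   # assumed enamel value for a 100 percent C3 diet, per mil
    D13C_PURE_C4 = 2.0     # assumed enamel value for a 100 percent C4 diet, per mil

    def c4_fraction(d13c_enamel):
        """Linear mixing: fraction of C4 resources implied by an enamel value."""
        frac = (d13c_enamel - D13C_PURE_C3) / (D13C_PURE_C4 - D13C_PURE_C3)
        return min(1.0, max(0.0, frac))   # clamp to the physically meaningful range

    for sample in (-11.0, -7.0, -2.0):
        print(f"d13C = {sample:+.1f} per mil -> about {c4_fraction(sample):.0%} C4 in the diet")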

After evolving from Australopithecus, the genus Homo likely looked to broaden its food choices. During this time, one short, upright hominid known as Paranthropus boisei from eastern Africa was moving toward a C4 type of diet. Originally dubbed the “Nutcracker Man” because of its large, flat teeth and powerful jaws, this species was later redefined, with scientists theorizing that the back teeth were actually used for grinding grasses and sedges, explained Sponheimer.

“We now have the first direct evidence that as the cheek teeth on hominids got bigger, their consumption of plants like grasses and sedges increased,” he said. “We also see niche differentiation between Homo and Paranthropus — it looks probable that Paranthropus boisei had a relatively restricted diet, while members of the genus Homo were eating a wider variety of things.”

“The genus Paranthropus went extinct about 1 million years ago, while the genus Homo that includes us obviously did not,” Sponheimer said.

Sponheimer noted that there still remain some puzzling differences in the evolutionary tree of hominids in eastern Africa and those of southern Africa. P. robustus of southern Africa was anatomically similar to its cousin P. boisei in eastern Africa, but the new analysis indicates the two species had very different carbon isotopic compositions in their teeth. P. robustus seems to have consumed a fair amount of C3 vegetation along with the evolved C4 diet.

“This has probably been one of the biggest surprises to us so far,” said Sponheimer. “We had generally assumed that the Paranthropus species were just variants on the same ecological theme, and that their diets would probably not differ more than those of two closely related monkeys in the same forest.”

“But we found that their diets differed as much isotopically as those of forest chimpanzees and savanna baboons, which could indicate their diets were about as different as primate diets can be,” he said. “Ancient fossils don’t always reveal what we think they will. The upside of this disconnect is that it can teach us a great deal, including the need for caution in making pronouncements about the diets of long-dead critters.”

Thure Cerling, a geochemist from the University of Utah and lead author of two of the four papers published online in PNAS, said: “At last, we have a look at 4 million years of the dietary evolution of humans and their ancestors.”

“For a long time, primates stuck by the old restaurants–leaves and fruits–but by 3.5 million years ago, they started exploring new diet possibilities–tropical grasses and sedges–that grazing animals discovered a long time before, about 10 million years ago,” Cerling said. That was when the African savanna began expanding.

He noted that tropical grasses provided early hominids with a new food option and there is increasing evidence that our ancestors relied on this resource; oddly enough, most primates today still do not eat grasses.

Between six and seven million years ago, grassy savannas and grassy woodlands were abundant in East Africa. But the question that remains is why our ancestors didn’t start exploiting this resource until less than four million years ago.

The isotopic method may paint a good picture of what types of vegetation were consumed, but it cannot distinguish which parts of these plants were eaten, such as the leaves, stems, seeds, or roots. It also cannot determine exactly when our ancestors began getting much of their grass indirectly, by consuming grass-eating insects or grazing animals.

Cerling said direct evidence of meat scavenging doesn’t appear until about 2.5 million years ago, and definitive evidence of hunting dates to only about 500,000 years ago. The new evidence clears up some of the mystery of what was on our ancestors’ plates, but uncertainties still remain, he added.

“We don’t know if they were pure herbivores or carnivores, if they were eating fish [which leave a tooth signal that looks like grass-eating], if they were eating insects, or if they were eating mixes of all these,” he said.

The four papers appear in the journal PNAS this week.

A paper on the teeth of hominids from Ethiopia’s Hadar-Dikika area was written by lead author Jonathan Wynn, a program director in NSF’s Division of Earth Sciences who is on leave from the University of South Florida. Other lead authors are Arizona State University’s William Kimbel and California Academy of Sciences scientist Zeresenay Alemseged.

One of Cerling’s papers, written with lead author paleoanthropologist Meave Leakey of the Turkana Basin Institute and geologist Frank Brown of the University of Utah, covers teeth from the Turkana Basin in Kenya. His other paper is on baboon diets.

Sponheimer’s research paper summarizes the other three studies.

DIETARY HISTORY

Previous research has shown that an early relative of humans, Ardipithecus ramidus (“Ardi”), from Ethiopia ate mainly C3 leaves and fruits.

Between 4.2 and 4 million years ago on the Kenyan side of the Turkana Basin, Cerling suggests Aus. anamensis subsisted on at least 90 percent leaves and fruits — the same diet that modern chimps have.

By 3.4 million years ago, Wynn describes Aus. afarensis of Ethiopia’s Awash Basin as living on a diet that averaged 22 percent C4 grasses and sedges, with individual diets ranging anywhere from zero to 69 percent C4.

The switch to C4 vegetation “documents a transformational stage in our ecological history,” said Wynn.

Many scientists previously believed Aus. afarensis had an ape-like C3 diet. It remains a mystery why Aus. afarensis expanded its menu to C4 grasses when its likely ancestor, Aus. anamensis, did not, although both inhabited savanna habitats, Wynn says.

Also, at around 3.4 million years ago, the human relative Kenyanthropus platyops moved to a highly varied diet of both C3 and C4 vegetation. The average was 40 percent grasses and sedges, but individuals varied widely, eating anywhere from 5 to 65 percent, explained Cerling.

In Cerling’s baboon study, he presented findings that two extinct Kenyan baboon species represent the only primate genus that primarily ate grasses throughout its history.

Theropithecus brumpti ate a 65 percent tropical grass-and-sedge diet when the baboons lived between four million and 2.5 million years ago, contradicting previous claims that they ate forest foods. Theropithecus oswaldi ate a 75 percent grass diet by two million years ago and a 100 percent grass diet by one million years ago. Both species went extinct, perhaps due to competition from hooved grazing animals.

Most modern baboons eat only C3 cool-season grasses.

The research was funded by the National Science Foundation (NSF), the National Research Foundation in South Africa, the Leakey Foundation, the Wenner-Gren Foundation, Arizona State University, the CU-Boulder Dean’s Fund for Excellence, and George Washington University (GWU).

New Silicon Electrode Could Make Your Device’s Battery Last A Lot Longer

Brett Smith for redOrbit.com – Your Universe Online
Scientists led by a team at Stanford University have developed a new silicon-based electrode for lithium-ion batteries that could significantly improve the performance of the popular batteries, according to a new report in Nature Communications.
In a lithium-ion battery, the electrode connects the lithium electrolyte, which is primarily responsible for power generation, with the rest of a circuit.
“Developing rechargeable lithium-ion batteries with high energy density and long cycle life is of critical importance to address the ever-increasing energy storage needs for portable electronics, electric vehicles and other technologies,” said study co-author Zhenan Bao, a professor of chemical engineering at Stanford University.
“We’ve been trying to develop silicon-based electrodes for high-capacity lithium-ion batteries for several years,” added co-author Yi Cui, an associate professor of materials science and engineering at Stanford. “Silicon has 10 times the charge storage capacity of carbon, the conventional material used in lithium-ion electrodes. The problem is that silicon expands and breaks.”
Because silicon particles can expand to as much as 400 percent of their original size when combined with lithium, the swollen particles can splinter and lose electrical contact while the battery is being charged or discharged. To overcome this problem, the Stanford team coated the electrode’s silicon nanoparticles with hydrogel, a spongy material similar to that used in contact lenses.
Tests of the electrode showed that the novel battery retained a high storage capacity through 5,000 cycles of charging and discharging.

“We attribute the exceptional electrochemical stability of the battery to the unique nanoscale architecture of the silicon-composite electrode,” Bao said.

Using a scanning electron microscope, the scientists were able to determine that the porous hydrogel contains countless empty spaces that permit the silicon nanoparticles to inflate when lithium is inserted. The nanostructures also form a network that produces a conducting pathway during charging and discharging.

“It turns out that hydrogel has binding sites that latch onto silicon particles really well and at the same time provide channels for the fast transport of electrons and lithium ions,” Cui said. “That makes a very powerful combination.”

To optimize battery performance, the engineers created the hydrogel and silicon electrodes using a technique called in situ synthesis polymerization.

“With our technique, each silicon nanoparticle is encapsulated within a conductive polymer surface coating and is connected to the hydrogel framework,” Bao said. “That improves the battery’s overall stability.”

The hydrogel initially presented another obstacle because water within the substance can cause lithium-ion batteries to burst into flames.

“We utilized the three-dimensional network property of the hydrogel in the electrode, but in the final production phase, the water was removed,” Bao said. “You don’t want water inside a lithium-ion battery.”

The Stanford scientists said they are optimistic about the new technique that they’ve used to create electrodes made of silicon and other materials.

“The electrode fabrication process used in the study is compatible with existing battery manufacturing technology,” Cui said. “Silicon and hydrogel are also inexpensive and widely available. These factors could allow high-performance silicon-composite electrodes to be scaled up for manufacturing the next generation of lithium-ion batteries. It’s a very simple approach that’s led to a very powerful result.”

Genetic Traits Of Cells Responsible For One Type Of Brain Cancer Discovered

redOrbit Staff & Wire Reports — Your Universe Online

The genetic traits of glial cells — the cells which give rise to the most common form of malignant brain cancer in humans — have been identified by a team of researchers led by University of Rochester Medical Center (URMC) neurologist Dr. Steven Goldman.

“This study identifies a core set of genes and pathways that are dysregulated during both the early and late stages of tumor progression,” Goldman, senior author of the study and the co-director of the URMC Center for Translational Neuromedicine, said in a statement Monday. “By virtue of their marked difference from normal cells, these genes appear to comprise a promising set of targets for therapeutic intervention.”

The brain tumors, which are known as gliomas, arise from those glial cells, which are found in the central nervous system. Gliomas progress in severity over time, the researchers said, and they ultimately turn into glioblastomas — highly invasive tumors that are difficult to treat and almost always fatal.

Currently, surgery, radiation therapy, and chemotherapy are used to treat the disease, but they can only delay the cancer’s progression and ultimately prove ineffective at combating the tumor. That could change thanks to the new study, which has been published in the journal Cell Reports.

“Cancer research has been transformed over the past several years by new concepts arising from stem cell biology,” URMC explained. “Scientists now appreciate that many cancers are the result of rogue stem cells or their offspring, known as progenitor cells. Traditional cancer therapies often do not prevent a recurrence of the disease since they may not effectively target and destroy the cancer-causing stem cells that lie at the heart of the tumors.”

“Gliomas are one such example,” they added. “The source of the cancer is… the glial progenitor cell. The cells, which arise from and maintain characteristics of stem cells, comprise about three percent of the cell population of the human brain. When these cells become cancerous they are transformed into glioma stem cells, essentially glial progenitor cells whose molecular machinery has gone awry, resulting in uncontrolled cell division.”

Goldman and his colleagues have been studying normal glial progenitor cells for several years. They primarily have focused their efforts on using these cells to treat multiple sclerosis and other neurological disorders. However, during the course of their work, they have gained an enhanced understanding of glial progenitor cell biology, as well as the molecular and genetic changes that transform them into cancers.

The researchers took human tissue samples that represented the three principal stages of the cancer, and were able to both pinpoint and isolate the cancer-inducing stem cells. They then compared the gene expression profiles of these cancer stem cells to those of normal glial progenitor cells. Their goal was to determine the earliest point at which the genetic changes associated with cancer formation occur, and to identify the genes that were both unique to the cancer-causing stem cells and expressed at every stage of the disease’s progression.

“Out of a pool of over 44,000 tested genes and sequences, the scientists identified a small set of genes in the cancerous glioma progenitor cells that were over-expressed at all stages of malignancy,” URMC said. “These genes formed a unique ‘signature’ that identified the tumor progenitor cells and enabled the scientists to define a corresponding set of potential therapeutic targets present throughout all stages of the cancer.”
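As a rough illustration of the kind of cross-stage filtering described here, the Python sketch below keeps only genes that are over-expressed, relative to normal glial progenitor cells, at every stage of malignancy. The stage labels, the expression ratios, the two-fold threshold, and every gene name other than SIX1 are hypothetical, chosen only to show the logic.

# Hypothetical illustration of selecting genes over-expressed at all stages.
# Values and threshold are made up; only SIX1 is a gene named in the study.
STAGES = ["low_grade", "intermediate", "glioblastoma"]
FOLD_CHANGE_THRESHOLD = 2.0  # tumor-progenitor expression / normal-progenitor expression

expression_ratios = {
    "SIX1":   {"low_grade": 4.1, "intermediate": 5.3, "glioblastoma": 6.0},
    "GENE_A": {"low_grade": 3.0, "intermediate": 1.2, "glioblastoma": 2.8},
    "GENE_B": {"low_grade": 0.7, "intermediate": 0.9, "glioblastoma": 1.1},
}

signature = [
    gene for gene, ratios in expression_ratios.items()
    if all(ratios[stage] >= FOLD_CHANGE_THRESHOLD for stage in STAGES)
]

print(signature)  # only genes elevated at every stage survive: ['SIX1']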

To test their hypothesis, Goldman and his colleagues targeted one of the genes that was highly overexpressed in the glioma progenitor cells. This gene, which is known as SIX1, is active in the early development of the nervous system but had not previously been detected in the adult brain.

SIX1 signaling had previously been linked to breast and ovarian cancer, which led the researchers to believe that it could contribute to brain cancer as well. Sure enough, when they blocked the gene’s expression, the cancer cells stopped growing and the implanted tumors actually began to shrink.

“This study gives us a blueprint to develop new therapies,” Goldman said. “We can now devise a strategy to systematically and rationally analyze — and eliminate — glioma stem and progenitor cells using compounds that may selectively target these cells, relative to the normal glial progenitors from which they derive.”

“By targeting genes like SIX1 that are expressed at all stages of glioma progression, we hope to be able to effectively treat gliomas regardless of their stage of malignancy,” he added. “And by targeting the glioma-initiating cells in particular, we hope to lessen the likelihood of recurrence of these tumors, regardless of the stage at which we initiate treatment.”

Study: Sexually Active Women Don’t Make For Friends, Mates

April Flowers for redOrbit.com — Your Universe Online

A new study from Cornell University developmental psychologists reveals that college-aged women judge promiscuous female peers more negatively than they judge more chaste women. The promiscuous women — defined as having had 20 or more sexual partners by their early 20s — are viewed as unsuitable for friendship.

The research team notes that even when the women reported liberal attitudes about casual sex or a high number of lifetime lovers for themselves, they still preferred less sexually active women as friends. The findings of this study were published in the early online edition of the Journal of Social and Personal Relationships.

The study found that men’s views were less uniform. When asked to rate potential friends on ten different friendship attributes, the men ranged from favoring the sexually permissive potential friend to favoring the non-permissive one to showing no preference for either. The men’s views were also more dependent on their own sexual behavior, the study found. When they viewed other promiscuous men as potential threats who might steal their girlfriends, the sexually permissive men preferred less sexually experienced men as friends.

Zhana Vrangalova, a Cornell graduate student in the field of human development, suggests that although cultural and societal attitudes about casual sex have loosened in recent decades, there is still a double standard shaming “slutty” women and celebrating “slutty” men. Such societal isolation, the study reports, may place promiscuous women at greater risk for poor psychological and physical health outcomes.

“For sexually permissive women, they are ostracized for being ‘easy,’ whereas men with a high number of sexual partners are viewed with a sense of accomplishment,” Vrangalova said. “What surprised us in this study is how unaccepting promiscuous women were of other promiscuous women when it came to friendships — these are the very people one would think they could turn to for support.”

Previous studies have shown that men often view promiscuous women as unsuited for long-term relationships. This attitude leaves these women outside of many social circles.

“The effect is that these women are really isolated,” Vrangalova said, noting that future research is needed to determine whom these women could befriend – perhaps straight or gay men who would be accepting of their behaviors.

The researchers surveyed 751 college students, who provided information about their past sexual experience and their views on casual sex. The participants read near-identical vignettes about a male or female peer, with the only difference being the character’s number of lifetime sexual partners — either two or 20. The researchers then asked the students to rate the potential friend on a range of friendship factors, including warmth, competence, morality, emotional stability and overall likability.

The female participants, regardless of their own promiscuity, viewed sexually permissive women more negatively on nine of ten friendship attributes, judging only their outgoingness as a favorable attribute. Promiscuous men only favored less sexually active men as suitable friends on two attributes – mate guarding and dislike of sexuality. They showed no preference, or favored the more promiscuous men, on the other eight variables. The more sexually modest men preferred the non-permissive potential friend in half the variables.

The study suggests that evolutionary concerns might be driving men and women to disapprove of their more promiscuous peers as friends — they might be seeking to guard their mates against a threat to the relationship.

Vrangalova suggested that in the case of promiscuous women rejecting other women with a high number of sexual partners, they may be seeking to distance themselves from any stigma that is attached to being friends with such women.

The study’s findings could aid parents, teachers, counselors, doctors and others who work with young people facing social isolation due to their sexual activity.

Astronomers Observe Galaxy In Act Of Dying

Lee Rannals for redOrbit.com — Your Universe Online

Astronomers announced at the 222nd American Astronomical Society meeting in Indianapolis that they have observed the first clear example of a galaxy in the act of dying.

The team says it spotted a bright dwarf galaxy, relatively close to our own Milky Way, that was “trailing fireballs.” They said that until now there had been no clear example of this transformation being observed.

“We think we’re witnessing a critical stage in the transformation of a gas-rich dwarf irregular galaxy into a gas-poor dwarf elliptical galaxy — the depletion of its lifeblood,” Jeffrey D. P. Kenney of Yale University, the principal investigator, said in a statement.

Star formation is the driving force behind a galaxy’s vitality, and many of the universe’s known galaxies are active star factories. However, due to gas depletion, many galaxies have stopped making stars. The researchers said galaxy IC 3418 inside the Virgo Cluster has now all but run out of gas.

The astronomers believe IC 3418’s distinctive fireball-dotted tail shows evidence of recent star formation that took place within the last few million years or less. However, the depletion of the core and the generation of the fireballs are probably the result of the process that is killing the galaxy, which is known as “ram pressure stripping.”

During this process, a galaxy’s motion through the hot gas that fills the space between cluster galaxies generates an enormous pressure that can force out the galaxy’s interior gas while leaving its existing stars untouched. New stars that form in the stripped gas no longer feel the ram pressure and are left trailing behind. The fireballs are a signature of active ram pressure.
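For readers who want the quantitative version of this picture, the condition usually invoked (it is not spelled out in the article) is the classic Gunn & Gott (1972) stripping criterion, which says gas is removed wherever the ram pressure exceeds the galaxy’s gravitational restoring force per unit area:

\rho_{\mathrm{ICM}}\, v^{2} \;\gtrsim\; 2\pi G\, \Sigma_{\mathrm{star}}\, \Sigma_{\mathrm{gas}}

Here \rho_{\mathrm{ICM}} is the density of the cluster gas, v is the galaxy’s speed through it, G is the gravitational constant, and \Sigma_{\mathrm{star}} and \Sigma_{\mathrm{gas}} are the surface densities of the galaxy’s stars and gas.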

“If you hold popcorn and unpopped kernels of corn in your hand and stick it out the car window as you drive, the wind caused by the car’s motion through the air will blow away the popcorn but leave the denser unpopped kernels in your hand,” Kenney said in the statement. “This is like the gas clouds in galaxies being blown out of the galaxy by the wind of cluster gas, while the denser stars remain behind.”

Kenney said IC 3418 would no longer be fertile because stars, planets and life can form only if a galaxy has gas to make them.

“It’s gratifying to find a clear example of an important process in galaxy evolution,” Kenney explained. “I enjoy digging through evidence to assemble a story about what happens to galaxies. I’ve come to think of myself as an intergalactic forensic pathologist — someone who studies the bodies of galaxies seeking evidence of traumatic events responsible for the present state of the galaxy.”

The astronomers plan to submit their paper titled “Transformation of a Virgo Cluster Dwarf Irregular Galaxy by Ram Pressure Stripping: IC 3418 and Its Fireballs” to the Astrophysical Journal.