Alternative Lithium Salt May Be Less Toxic Way To Treat Bipolar Disorder

redOrbit Staff & Wire Reports – Your Universe Online
Researchers from the University of South Florida (USF) have reportedly discovered that an alternative salt form of lithium might be a safer, less toxic way to treat bipolar disorder and other neuropsychiatric conditions.
Publishing a paper in a recent edition of RSC Advances, the journal of the Royal Society of Chemistry, the study authors discuss how oral lithium salicylate was able to produce steady lithium levels for up to 48 hours in rats – without the toxic side effects typically linked with the rapid absorption of current FDA-approved lithium carbonate.
While lithium carbonate has been tremendously effective in treating the mania associated with bipolar disorder, as well as reducing the likelihood of suicide during the depressive phases of the condition, it can also cause such side effects as diarrhea, vomiting, weight gain, hand tremors and even decreased thyroid function.
As a result of those adverse effects, patients often stop taking the medication, yet alternative treatments that are less toxic than lithium carbonate but equally effective have not been forthcoming, the researchers said. However, they report that the discovery that lithium salicylate could potentially be even more effective than lithium carbonate in treating these conditions, without the toxicity, could be an important step forward in the field of lithium-based treatments.
“Despite its narrow therapeutic window and the emergence of proprietary alternatives, U.S. FDA-approved lithium therapeutics are still regarded as the ‘gold standard’ for the treatment of the manic phase of bipolar disorder,” Dr. Adam J. Smith, a neuroscientist at the USF Health Center of Excellence for Aging and Brain Repair, said in a statement.
“Our previous research suggested that re-engineering lithium therapeutics by crystal engineering might produce better performance with reduced toxicities,” he continued, explaining that crystal engineering is the design and synthesis of molecular solid crystal structures with desired properties using intermolecular interactions.
As part of the study, Dr. Smith and his colleagues examined the impact of two previously untested lithium salts (salicylate and lactate) on laboratory rats. They found that these two substances, each of which is structurally different from lithium carbonate, demonstrated “profoundly different pharmacokinetics” – or absorption and distribution of the drug – when compared to the more widely used version of the substance.
According to Dr. Smith, this is likely the first pharmacokinetic study of lithium salicylate and lithium lactate in lab animals, and the results support previous findings suggesting that an ideal lithium preparation would both flatten the high peaks in blood levels and slow the decline in blood concentrations.
“This is exactly the pharmacokinetic profile produced by lithium salicylate in our study,” said senior author Dr. Doug Shytle, also of the USF Health Center of Excellence for Aging and Brain Repair. “Remarkably, lithium salicylate produced elevated levels of lithium in the blood and brain 48 hours after the dose, but without the sharp peaks that contribute to the toxicity problems of lithium in the currently used form.”
The 48-hour time frame represents a crucial difference between lithium salicylate and current FDA-approved lithium therapeutics, according to the study authors. If these preclinical outcomes can be duplicated in humans, patients could take doses less frequently and experience fewer problematic side effects than with conventional lithium treatments.
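The contrast between a sharp-peaked and a sustained pharmacokinetic profile can be sketched with a standard one-compartment oral-absorption (Bateman) model. The parameters below are purely hypothetical, not taken from the study; they only illustrate how slowing absorption flattens the peak while sustaining blood levels longer:

```python
import math

def concentration(t, dose, ka, ke, vd):
    """One-compartment oral-absorption (Bateman) model:
    C(t) = (dose * ka) / (vd * (ka - ke)) * (exp(-ke*t) - exp(-ka*t))."""
    return (dose * ka) / (vd * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Hypothetical parameters: a rapidly absorbed salt (high ka) versus a
# slowly absorbed one (low ka), with the same dose, elimination rate (ke)
# and volume of distribution (vd).
hours = range(49)
fast = [concentration(t, dose=100, ka=2.0, ke=0.1, vd=50) for t in hours]
slow = [concentration(t, dose=100, ka=0.2, ke=0.1, vd=50) for t in hours]

# The slow formulation peaks lower but holds a higher level at 48 hours.
print(max(fast) > max(slow), slow[48] > fast[48])
```

With these made-up numbers, the slowly absorbed form shows exactly the shape the researchers describe: a blunted peak and a slower decline.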
“Psychiatry has long struggled with the fact that, while lithium is highly effective for treating bipolar disorder, the narrow therapeutic window and side effect profile often make lithium both difficult and sometimes dangerous to work with clinically,” explained Dr. Todd Gould, an expert in the neurobiology of bipolar disorder and lithium at the University of Maryland.
“The pharmacokinetic data by Dr. Smith and colleagues suggests that lithium salts other than the commonly used lithium carbonate may have a broader therapeutic window and potentially fewer side effects,” he added. “Studies in humans will be needed to confirm safety and demonstrate that the pharmacokinetic profile observed in rats is similarly observed in humans.”

Obesity Linked With Poor Academic Performance In Girls

Lee Rannals for redOrbit.com – Your Universe Online

According to a study published in the International Journal of Obesity, teenage girls struggling with obesity could also be struggling with their academics.

Researchers from the Universities of Strathclyde, Dundee, Georgia and Bristol found that obesity in adolescent girls is linked to a lower academic attainment level throughout high school.

The team examined data from nearly 6,000 children from the Avon Longitudinal Study of Parents and Children (ALSPAC). They studied academic attainment assessed by national tests at 11, 13 and 16 years and weight status. According to the researchers, 71 percent of those in the study were a healthy weight, 13 percent were overweight and 15 percent were obese, based on body mass index.
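For reference, body mass index is weight divided by height squared; note that for children and adolescents, as in this study, weight status is classified using age- and sex-specific BMI percentiles rather than the fixed adult cutoffs. A minimal sketch of the underlying formula:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# Example: 70 kg at 1.75 m.
print(round(bmi(70, 1.75), 1))  # → 22.9
```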

Researchers took into account factors like socio-economic deprivation, mental health, IQ and age of menarche for the study. However, they found that these factors did not change the relationship discovered between obesity and poor grades.

The study found that girls who were obese at age 11 had lower academic attainment at 11, 13 and 16 years when compared to those of a healthy weight. Tests included the core subjects of English, Math and Science.

According to the findings, obese girls’ scores on core subjects were lower by an amount equivalent to receiving a D instead of a C, the average grade in the sample. The researchers said the associations between obesity and academic attainment were less clear in boys than in girls.

“Further work is needed to understand why obesity is negatively related to academic attainment, but it is clear that teenagers, parents, and policymakers in education and public health should be aware of the lifelong educational and economic impact of obesity,” John Reilly, University of Strathclyde Professor of Physical Activity and Public Health Science, said in a statement.

Dr Josie Booth, of the School of Psychology at the University of Dundee, pointed out that this study unveils a link between obesity and academic performance.

“There is a clear pattern which shows that girls who are in the obese range are performing more poorly than their counterparts in the healthy weight range throughout their teenage years,” Booth said in a statement.

Digging Up A New Mite Species

A graduate student at the Ohio State University discovered a creepy, crawly microscopic species. Yuck. The species is a type of mite that the student affectionately nicknamed “The Buckeye Dragon Mite.” The mite somewhat resembles the Chinese dragon seen in New Year celebrations. At just over half a millimeter, the mite can’t be seen by the naked eye. It was found 20 inches below the surface on campus and has unusual straight hairs along its body not found in other known members of its family. The mite also has interesting mouth features called rutella that function similarly to teeth in other mites. The rutella support a pouch-like vessel at the front of the mouth, and it’s believed that the pouch acts like a nutcracker, holding microorganisms in place while internal pincers puncture them and suck up their fluid contents.

[ Read the Article: Microscopic Creepy-Crawly Discovered By Ohio State Graduate Student ]

Pew Report Looks Into Our Digital Future

Enid Burns for redOrbit.com – Your Universe Online
This year marks the 25th anniversary of the Internet. To mark the occasion, Pew Research Center’s Internet Project and Elon University’s Imagining the Internet Center worked together to discuss what connected lives will look like in the year 2025. The report, “Digital Life in 2025,” lays out “15 things about the digital future” for Internet users to ponder, looking at both good and bad aspects of what may come.
“The world is moving rapidly towards ubiquitous connectivity that will further change how and where people associate, gather and share information, and consume media,” the paper said. To peer forward into the future, Pew Internet canvassed 2,558 experts and technology builders to come up with 15 themes.
Information will become much more available in the next 10 years. That is, “experts foresee an ambient information environment where accessing the Internet will be effortless and most people will tap into it so easily it will flow through their lives ‘like electricity,'” the report said. Connectivity will be ever present through mobile, wearable and embedded computing. All of these devices will be tied together through the Internet of Things, where connected devices will have programs to carry out tasks automatically in certain situations with limitless possibilities.
After studying responses from their sources, researchers found that experts agree on the direction technology is headed, though they differ on its ramifications. Four underlying themes emerged on which all sources agree. First, an ambient computing network will become global, immersive and invisible, growing through a proliferation of smart sensors, cameras, software, databases and massive data centers. Second, “augmented reality” will enhance real-world input through portable, wearable and implantable technologies. Third, business models will see disruption, with finance, entertainment, publishing and education most affected. Fourth, the physical and social realms will be tagged, databased and intelligently mapped.
The 15 trends that emerged from the Pew Internet survey discuss how connectivity will evolve in the next 10 years:
• Information sharing over the Internet will be so effortlessly interwoven into daily life that it will become invisible, flowing like electricity, often through machine intermediaries.
• The spread of the Internet will enhance global connectivity that fosters more planetary relationships and less ignorance.
• The Internet of Things, artificial intelligence, and big data will make people more aware of their world and their own behavior.
• Augmented reality and wearable devices will be implemented to monitor and give quick feedback on daily life, especially tied to personal health.
• Political awareness and action will be facilitated and more peaceful change and public uprisings like the Arab Spring will emerge.
• The spread of the ‘Ubernet’ will diminish the meaning of borders, and new ‘nations’ of those with shared interests may emerge and exist beyond the capacity of current nation-states to control.
• The Internet will become ‘the Internets’ as access, systems, and principles are renegotiated.
• An Internet-enabled revolution in education will spread more opportunities, with less money spent on real estate and teachers.
• Dangerous divides between haves and have-nots may expand, resulting in resentment and possible violence.
• Abuses and abusers will ‘evolve and scale.’ Human nature isn’t changing; there’s laziness, bullying, stalking, stupidity, pornography, dirty tricks, crime, and those who practice them have new capacity to make life miserable for others.
• Pressured by these changes, governments and corporations will try to assert power — and at times succeed — as they invoke security and cultural norms.
• People will continue — sometimes grudgingly — to make tradeoffs favoring convenience and perceived immediate gains over privacy; and privacy will be something only the upscale will enjoy.
• Humans and their current organizations may not respond quickly enough to challenges presented by complex networks.
• Most people are not yet noticing the profound changes today’s communications networks are already bringing about; these networks will be even more disruptive in the future.
• Foresight and accurate predictions can make a difference; ‘The best way to predict the future is to invent it.’

When Is It Safe To Drive After Hip Replacement?

Hospital for Special Surgery

Using interactive ‘video game’ to test reaction time, findings at Hospital for Special Surgery challenge traditional recommendation to wait 6 weeks after hip replacement before driving

After hip replacement surgery, many patients are anxious to resume driving, and a new study challenges the conventional wisdom that patients should wait six weeks before getting back behind the wheel. Dr. Geoffrey Westrich, director of research, Adult Reconstruction and Joint Replacement at Hospital for Special Surgery in New York City, found that patients in the study were able to return to driving four weeks after total hip replacement.

The study, titled, “A Novel Assessment of Driving Reaction Time Following THR Using a New Fully Interactive Driving Simulator,” will be presented at the annual meeting of the American Academy of Orthopaedic Surgeons in New Orleans on March 11, 2014.

“One of the most common questions patients ask after hip replacement is when they can start driving again, and this is the first study of its kind to test their reaction time after the procedure,” said Dr. Westrich, who came up with the idea for the driving simulator while watching his children play video games.

But the interactive simulator used in his study is more intricate than a Wii game. “It’s a very sophisticated machine made by a company that makes driving simulators for the automobile industry,” Dr. Westrich said.

More than 330,000 hip replacements are performed in the United States each year. People exhibit decreased reaction time after the surgery, making it unsafe to drive in the immediate post-operative period. Most doctors recommend patients wait about six weeks before they resume driving, but many don’t want to wait that long.

“Over the past five or 10 years, we’ve seen advances such as minimally invasive hip replacement and newer implants that are advantageous to patients and may improve recovery time. Our study set out to obtain good, objective data to determine if it would be safe for people to return to driving sooner,” Dr. Westrich said.

One hundred patients of three orthopedic surgeons at Hospital for Special Surgery were enrolled in the study to assess their driving reaction times using a fully interactive driving simulator with an automatic brake reaction timer from the American Automobile Association.

All of the participants had a total hip replacement on the right side, and they all took the driving test prior to having surgery. They were then randomly selected to repeat the test two, three or four weeks after hip replacement. Reaction time was measured by the computerized driving simulator.

The reaction timer, equipped with an accelerator and brake pedal, simulates driving. Patients were instructed to place their foot on the accelerator, which activated a green light, and to keep their foot on the accelerator until a Stop sign appeared. When the Stop sign popped up, they were supposed to move their foot to the brake pedal. The amount of time it took for the subject to switch from the gas to the brake pedal was measured by the machine.

The study defined a safe return to driving as a reaction time the same as or better than the patient’s own preoperative reaction time. Observing reaction times at different intervals revealed that at two and three weeks after surgery, patients had not yet returned to their respective baseline reaction times and generally were not ready to drive.
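The study’s pass/fail criterion reduces to a simple comparison against each patient’s own baseline. A minimal sketch, using hypothetical reaction times rather than data from the study:

```python
def cleared_to_drive(baseline_s, postop_s):
    """The study's criterion: post-operative brake reaction time must be
    the same as or better (i.e. no slower) than the patient's own
    pre-operative baseline."""
    return postop_s <= baseline_s

# Hypothetical reaction times (in seconds) for one patient.
baseline = 0.62
print(cleared_to_drive(baseline, 0.71))  # 2 weeks, still slower -> False
print(cleared_to_drive(baseline, 0.58))  # 4 weeks, now faster  -> True
```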

However, at four weeks following hip replacement, patients had actually improved their reaction time compared to what it was before the surgery and therefore could be cleared to drive. It was also observed that patients under the age of 70 reached an improved reaction time earlier than those over 70.

“By using a standardized, driving simulator to measure reaction times, our study will be reproducible and we can apply our model to other surgical procedures that may affect one’s ability to drive safely postoperatively,” Dr. Westrich noted. He will soon begin enrolling patients in another study to determine when it is safe to drive after total knee replacement.

Blood Test Identifies Those At Risk For Cognitive Decline, Alzheimer’s

April Flowers for redOrbit.com – Your Universe Online

A new study, led by Georgetown University, describes the discovery and validation of a blood test that can predict, with 90 percent accuracy, whether a healthy person will develop mild cognitive impairment or Alzheimer’s disease within three years.

The study, published in Nature Medicine, indicates the possibility of developing treatment strategies for Alzheimer’s at an earlier stage. Therapy at these earlier stages would be more effective at slowing or preventing the onset of symptoms. The researchers say that their study is the first known report of blood-based biomarkers for preclinical Alzheimer’s.

The new test identifies ten lipids, or fats, in the blood that predict the disease, and it could be ready for use in clinical trials in as little as two years. The researchers are hopeful for other diagnostic uses as well.

“Our novel blood test offers the potential to identify people at risk for progressive cognitive decline and can change how patients, their families and treating physicians plan for and manage the disorder,” says Howard J. Federoff, MD, PhD, professor of neurology and executive vice president for health sciences at Georgetown University Medical Center.

To date, there is no cure or effective treatment for Alzheimer’s disease. Approximately 35.6 million people suffer from the disease worldwide, and according to the World Health Organization (WHO), that number will double every 20 years. They predict 115.4 million sufferers by 2050.

There have been many efforts to develop drug therapies to slow or reverse the disease’s progression, but according to Federoff, all have failed so far. He believes that one reason for this failure is that the drugs were evaluated too late in the disease process.

“The preclinical state of the disease offers a window of opportunity for timely disease-modifying intervention,” Federoff says. “Biomarkers such as ours that define this asymptomatic period are critical for successful development and application of these therapeutics.”

The researchers recruited 525 healthy participants aged 70 and older. The participants all gave blood samples at the beginning of the study and at various points during the study. Over the five-year study period, 74 participants met the criteria for either mild Alzheimer’s disease (AD) or a condition known as amnestic mild cognitive impairment (aMCI), in which memory loss is prominent. Forty-six of this group were diagnosed upon enrollment in the study and 28 (called converters) developed aMCI or mild AD during the study.

During the study’s third year, 53 participants who developed aMCI/AD (including 18 converters) and 53 cognitively normal matched controls were selected for the lipid biomarker discovery phase of the study. The lipids were not the initial focus of the study; instead, they were an outcome.

The researchers discovered a panel of 10 lipids that appear to reveal the breakdown of neural cell membranes in study participants who develop symptoms of cognitive impairment or AD. The team validated the panel using the remaining 21 aMCI/AD participants (including 10 converters), and 20 controls. To determine if the subjects could be characterized into the correct diagnostic categories based solely on the panel of 10 lipids identified in the discovery phase, the data was analyzed blind.

“The lipid panel was able to distinguish with 90 percent accuracy these two distinct groups: cognitively normal participants who would progress to MCI or AD within two to three years, and those who would remain normal in the near future,” Federoff says.
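The article does not describe the statistical model behind the 90 percent figure. Purely as an illustration of the general idea, here is how a fixed 10-analyte panel might be collapsed into a single risk score with a logistic function; the weights, bias, threshold and example panels below are all invented for demonstration:

```python
import math

WEIGHTS = [-0.9, -0.7, -0.6, -0.5, -0.4, 0.3, 0.4, 0.5, 0.7, 0.8]  # made up
BIAS = -0.5
THRESHOLD = 0.5  # scores above this flag a participant as "at risk"

def risk_score(panel):
    """panel: 10 standardized lipid concentrations (z-scores)."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, panel))
    return 1.0 / (1.0 + math.exp(-z))  # logistic link -> score in (0, 1)

def at_risk(panel):
    return risk_score(panel) >= THRESHOLD

# Two hypothetical participants: one whose panel shows the altered levels
# associated here with membrane breakdown, one with an unremarkable panel.
declining = [-1.2, -1.0, -0.8, -1.1, -0.9, 0.6, 0.8, 0.7, 1.0, 1.1]
normal = [0.1, -0.1, 0.0, 0.2, -0.2, 0.1, 0.0, -0.1, 0.1, 0.0]
print(at_risk(declining), at_risk(normal))  # → True False
```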

The research team also investigated the possibility that the APOE4 gene, known to be a risk factor for developing AD, would contribute to accurate classifications of the groups. They found it was not a significant predictive factor, however.

“We consider our results a major step toward the commercialization of a preclinical disease biomarker test that could be useful for large-scale screening to identify at-risk individuals,” Federoff says. “We’re designing a clinical trial where we’ll use this panel to identify people at high risk for Alzheimer’s to test a therapeutic agent that might delay or prevent the emergence of the disease.”

Colors And Shapes Can Be Heard By The Blind: Study

[ Watch the Video: EyeMusic Live Demo at AIPAC ]

April Flowers for redOrbit.com – Your Universe Online

Normally, people perceive colors and shapes visually, but what if you could “hear” them? A new study from Hebrew University shows that using sensory substitution devices (SSDs), colors and shapes can now be conveyed to the brain noninvasively through other senses.

Prof. Amir Amedi, of the Edmond and Lily Safra Center for Brain Sciences and the Institute for Medical Research Israel-Canada at the Hebrew University of Jerusalem Faculty of Medicine, leads the study at the Center for Human Perception and Cognition. The center offers the blind and visually impaired tools, along with SSD training, to receive visual information about their environment and interact with it in ways otherwise unimaginable.

Yissum, the Hebrew University’s Technology Transfer Company, has patented the work of Prof. Amedi and his team.

SSDs are non-invasive sensory aids that deliver visual information to the blind through their existing senses. One example of an SSD is a miniature camera connected to a small computer (or smartphone) and stereo headphones, all worn by the user. Using a predictable algorithm, the images are converted into a “soundscape” that allows users to listen to, and then interpret, the visual information captured by the camera.
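To make the idea of a predictable image-to-sound algorithm concrete, here is an illustrative sketch in the spirit of such devices, not the actual patented EyeMusic or vOICe algorithm: image columns are scanned left to right over time, row position maps to pitch, and pixel brightness maps to loudness.

```python
def image_to_soundscape(image, base_freq=220.0, semitones_per_row=2):
    """image: list of rows, each a list of 0-255 brightness values.
    Returns one (time_slot, frequency_hz, amplitude) triple per lit pixel."""
    events = []
    n_rows = len(image)
    for col in range(len(image[0])):          # time axis: left -> right
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness > 0:
                # rows nearer the top of the image map to higher pitches
                steps = (n_rows - 1 - row) * semitones_per_row
                freq = base_freq * 2 ** (steps / 12)
                events.append((col, round(freq, 1), brightness / 255))
    return events

# A tiny 3x3 "image" with a bright diagonal running downward.
img = [[255, 0, 0],
       [0, 255, 0],
       [0, 0, 255]]
print(image_to_soundscape(img))  # a falling pitch sequence, one tone per time slot
```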

EyeMusic, an SSD available in the Apple App Store, plays pleasant musical notes to convey information about the colors, shapes and locations of objects in the world around the user.

Combining the SSD equipment with a unique training program allows the blind to achieve various complex, visual-linked abilities. Recent studies published in Restorative Neurology and Neuroscience and Scientific Reports used EyeMusic SSD with blind and blindfolded-sighted users. The users were shown to correctly perceive and interact with objects, such as recognizing different shapes and colors or reaching for a beverage.

EyeMusic has also been used to show that fast and accurate movements can be guided by the tool through visuo-motor learning. The team published two studies, in Neuron and Current Biology, demonstrating that the blind can sort sound-conveyed images into complex object categories such as faces, houses, outdoor scenes and everyday objects. The participants were also able to locate people’s positions, identify facial expressions and read letters and words.

SSDs are not widely used in the blind community, despite these encouraging behavioral demonstrations. The reasons that have prevented their adoption, however, have changed for the better over the past few years, according to an article published in Neuroscience & Biobehavioral Reviews. Due to technological advances, SSDs are much cheaper, smaller and lighter. They are also able to run on a standard smartphone. Training and performance have also been enhanced by new computerized training methods and environments.

Contrary to the widely held conception of the cortex as divided into separate vision-processing areas, auditory areas and so on, findings from the Hebrew University research over the last decade have shown that many brain areas are characterized by their computational task and can be activated by senses other than the one commonly used for that task. This holds even for people who were never exposed to the “original” sensory information at all, such as a person born blind who never saw a single photon of light.

The research team showed that congenitally blind people who learned to read, whether by touch using the Braille script or through their ears with sensory substitution devices, use the same areas of the visual cortex as sighted readers when processing this “visual information.” An example of this technique, recently published in Current Biology, revealed that blind subjects can use SSD equipment and training to “see” body shapes.

An entire network of regions in the human brain is dedicated to processing and perceiving body shapes. This network starts in the visual-processing areas of the cortex, leads to the “Extrastriate Body Area” (EBA), and connects further to multiple brain areas that decipher people’s motion in space, their feelings and their intents.

The EBA in the blind was found to be functionally connected to the whole network of body-processing found in the sighted. This finding bolsters the researchers’ new theory of the brain as a sensory-independent task machine instead of a pure sensory machine based on vision, audition or touch.

“The human brain is more flexible than we thought,” says Prof. Amedi. “These results give a lot of hope for the successful regaining of visual functions using cheap non-invasive SSDs or other invasive sight restoration approaches. They suggest that in the blind, brain areas have the potential to be ‘awakened’ to processing visual properties and tasks even after years or maybe even lifelong blindness, if the proper technologies and training approaches are used.”

Endometriosis Cause And Development Linked To Unstudied Genes

Rebekah Eliason for redOrbit.com – Your Universe Online
A study from Northwestern Medicine has led to a new theory regarding the development and cause of endometriosis. The chronically painful disease, which affects 1 in 10 women, has been linked to two previously unstudied genes.
This innovative research suggests that an integral part of endometriosis and its progression is epigenetic modification, a process that either enhances or disrupts the reading of DNA.
Matthew Dyson, research assistant professor of obstetrics and gynecology at Northwestern University Feinberg School of Medicine, along with Serdar Bulun, MD, chair of obstetrics and gynecology at Feinberg and Northwestern Memorial Hospital, were able to recognize a novel role for a family of key gene regulators found in the uterus.
“Until now, the scientific community was looking for a genetic mutation to explain endometriosis,” said Bulun, a member of the Center for Genetic Medicine and the Robert H. Lurie Comprehensive Cancer Center of Northwestern University. “This is the first conclusive demonstration that the disease develops as a result of alterations in the epigenetic landscape and not from classical genetic mutations.”
Heather C. Guidone, Surgical Program Director at The Center for Endometriosis Care explains that, “Endometriosis results when tissue similar to that which lines the uterus grows in other areas of the body. The persistent survival of these cells results in chronic pelvic pain, organ dysfunction, infertility and more. Although the cause of the disease has remained unknown on a cellular level, there have been several different models established to explain its development.”
Since endometriosis is only found in menstruating primates, it is likely that the unique evolution of uterine development and menstruation are connected with the disease. Retrograde menstruation, the movement of cells up the fallopian tubes and into the pelvis, has long been considered by scientists as a probable cause of endometriosis. Since most women experience retrograde menstruation at some point, this model fails to explain why only ten percent of women develop the disease. In addition, this theory is insufficient at explaining instances where endometriosis arises independent of menstruation.
Bulun and Dyson theorize that an epigenetic switch allows the expression of the transcription factor GATA6 instead of GATA2, resulting in progesterone resistance and, in turn, development of the disease.
“We believe an overwhelming number of these altered cells reach the lining of the abdominal cavity, survive and grow,” Bulun said. “These findings could someday lead to the first noninvasive test for endometriosis.”
Bulun hopes that one day clinicians could prevent the disease by placing teenagers predisposed to this epigenetic change on a birth control pill regimen.
In further research, Dyson intends to investigate the epigenetic fingerprint that results from the presence of GATA6 instead of GATA2 and use the information as a potential diagnostic tool since such epigenetic differences are readily detectable.
“These findings have the potential to shift how we view and treat the disease moving forward,” Bulun said. This study was published in PLoS Genetics.

Adapter Turns Your iPhone Into An ‘Eye-Phone’

Lee Rannals for redOrbit.com – Your Universe Online
Stanford University School of Medicine researchers have developed two inexpensive adapters that enable a smartphone to capture high-resolution images from the front and back of the eye.
The new technology will make it easy for patients to take a picture of the eye and share it securely with other health practitioners or store it in the patient’s electronic record.
Robert Chang, MD, one of the developers and assistant professor of ophthalmology, said the technology is like Instagram for the eye.
Publishing a paper in the Journal of Mobile Technology in Medicine, the team wrote that the technology is an opportunity to increase eye-care services as well as improve the ability to advise patients on their own care remotely.
Standard photography equipment used to examine the eye can be very costly and requires extensive training to use properly. Primary care physicians and emergency department staff also often lack this equipment.
“Adapting smartphones for the eye has the potential to revolutionize the delivery of eye care — in particular, to provide it in places where it’s less accessible,” David Myung, MD, PhD, lead author of two upcoming papers describing the development and clinical experience with the devices, said in a statement. “Whether it’s in the emergency department, where patients often have to wait a long time for a specialist, or during a primary-care physician visit, this new workflow will improve the quality of care for our patients, especially in the developing world where ophthalmologists are few and far between.”
For example, imagine a car accident victim arriving in the emergency department with an eye injury resulting in blood inside the front of their eye, said Myung.
“Normally the physician would have to describe this finding in her electronic record with words alone. Smartphones today not only have the camera resolution to supplement those words with a high-resolution photo, but also the data-transfer capability to upload that photo securely to the medical record in a matter of seconds,” Myung said in a statement.
Chang, senior author of the paper, said that ophthalmology is a highly image-oriented field. He said that with smartphone technology, inexpensive attachments help health-care staff take a picture needed for an eye consultation.
“I started entertaining the idea of a pocket-sized adapter that makes the phone do most of the heavy lifting,” said Myung. “It took some time to figure out how to mount the lens and lighting elements to the phone in an efficient yet effective way.”
After taking an image of the front of the eye, he focused on visualizing the inside lining of the back of the eye, or the retina.
“Taking a photo of the retina is harder because you need to focus light through the pupil to reach inside the eye,” said Myung.
The researchers then used optics theory to determine the ideal working distance and lighting conditions for a simple adapter that connects a conventional examination lens to a phone. They shot hundreds of photos with various iterations until they finally got it right.
The team said the initial adapters will be available for purchase for research purposes only while the team seeks guidance from the Food and Drug Administration.
“We have gotten the production cost of each type of adapter to under $90 but the goal is to make it even lower in the future,” Chang said in a statement.

NASA Looks To Robots To Refuel And Repair Satellites In Orbit

Lee Rannals for redOrbit.com – Your Universe Online
NASA is using the International Space Station (ISS) as a test bed for technologies that could refuel and repair existing satellites in orbit.
The space agency said it is preparing another round of demonstrations on the space station to test the new technology. This testing will focus on real-time relative navigation, spacecraft inspection and the replenishment of cryogens in satellites that were not initially built for in-flight service.
The experiments are part of another initiative to equip robots and humans with tools and capabilities needed for spacecraft maintenance and repair, which could be useful for extended manned missions to places like an asteroid or Mars.
The Satellite Servicing Capabilities Office (SSCO) has been operating at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, since 2009.
“With more than 400 satellites in space that could benefit from robotic servicing, we thought a refueling test was the best place to start,” Frank Cepollina, veteran leader of the five servicing missions to the Hubble Space Telescope, and associate director of SSCO, said in a statement. “We wanted to demonstrate technologies that build life-extension capabilities – and jumpstart a discussion about new ways to manage assets in space. We never planned to stop there, however. It was only the first step.”
SSCO’s Robotic Refueling Mission (RRM) and follow-up tests have demonstrated that remotely controlled robots could work through the caps and wires on a satellite fuel valve and transfer fluid into existing satellites.
NASA conducted a demonstration called the Remote Robotic Oxidizer Transfer Test (RROxiTT) last month where a robot remotely controlled from Goddard successfully transferred corrosive satellite oxidizer into a mock satellite tank located at Kennedy Space Center in Florida. Now that this test is over, NASA said SSCO is broadening its portfolio to include xenon transfer technology.
“The lessons we learned from the Robotic Refueling Mission contributed to RROxiTT’s success and gave us confidence for future demonstrations,” Benjamin Reed, deputy project manager of SSCO, said in a statement. “We continue to draw from what we learned on orbit.”
Reed said that with RROxiTT now off its checklist, SSCO is planning to move on to another technology arena.
“A core part of our work centers on filling up satellites and their instruments with fluids that could prolong their life,” Reed said. “But that is just one small part of the puzzle we’re unraveling. We’re also thinking about how those fluids will be delivered in the first place. What technologies would it take for a robotic servicer carrying fuel 22,000 miles above the Earth to rendezvous with another satellite that’s waiting for service, perhaps even tumbling in multiple axes simultaneously?”
NASA said crew members on board the space station will be installing a new RRM tool and task boards on the RRM module.
“It’s an extremely active time for SSCO,” says Reed. “But we thrive on these challenges. We’re eager to see how these servicing technologies, and the capabilities that they enable, could benefit satellite owners and operators through life extension and assembly options. To replace or repair a car 5, 10 or 15 years into its life is a decision individuals routinely make. We want to give that option to satellite owners, who previously have had only one option – decommission at end of life.”
Image Below: A robot servicer could use autonomous rendezvous and fluid transfer technologies to extend the life of orbiting satellites (depicted, artist’s concept). Credit: NASA

Pine Tree Branches Turned Into Effective Water Filtration Systems

Lawrence LeBlond for redOrbit.com – Your Universe Online

The next time you find yourself lost in the woods with no clean drinking water, the nearest pine tree may save your life. While lake or pond water may provide some short-term relief from dehydration when in the wild, these sources of water are not always clean.

This is where the pine tree comes into play. Pouring lake water through a freshly-peeled pine tree limb can effectively remove most bacteria that may exist in the water, leaving you with a clean and fresh source of H2O.

This process is so effective that MIT researchers, publishing a paper in the journal PLOS ONE on Feb. 26, found that the low-tech filtration system can produce up to four liters of clean drinking water per day.

The research team demonstrated that a small piece of sapwood can filter out 99 percent of E. coli bacteria from water. This sapwood, which contains xylem tissue that helps transport sap up through the tree, has pores that allow water to pass through but trap most bacteria from filtering through.

Rohit Karnik, an associate professor of mechanical engineering at MIT, says that this sapwood is a low-cost efficient material for filtering water and could go a long way in helping rural communities where advanced filtration systems may not be accessible.

“Today’s filtration membranes have nanoscale pores that are not something you can manufacture in a garage very easily,” said Karnik, one of the study coauthors. “The idea here is that we don’t need to fabricate a membrane, because it’s easily available. You can just take a piece of wood and make a filter out of it.”

This sapwood system also doesn’t have the drawbacks that come with other more advanced filtration systems. Chlorine-based filters work well but can be expensive; boiling water also relies on costly fuels to heat the water; membrane-based filters are also expensive and require a pump and easily become clogged.

The xylem network in pine trees consists of a system of vessels and pores, helping sap move from the roots to the crown of the tree. Each vessel wall is pockmarked with tiny pores called pit membranes, through which sap can flow from one vessel to the next as it feeds the tree. These pores also limit cavitation – a process by which air bubbles can grow and spread in the xylem, leading to tree death. The xylem’s pores actually trap bubbles, preventing them from spreading throughout the tree.

“Plants have had to figure out how to filter out bubbles but allow easy flow of sap,” Karnik noted. “It’s the same problem with water filtration where we want to filter out microbes but maintain a high flow rate. So it’s a nice coincidence that the problems are similar.”

For the study, the team collected branches of white pine and stripped off the outer bark. They cut small sections of sapwood measuring about an inch long and half-inch wide and then mounted each in plastic tubing, sealed with epoxy and secured with clamps.

The team first used water dyed with red ink particles ranging from 70 to 500 nanometers in size. Once the water had filtered through the system, the team cut open the sapwood filter lengthwise and observed that much of the red dye was contained within the very top layers of the wood. The filtered water, which passed through easily, was crystal clear and free of any red dye. This experiment demonstrated that sapwood can naturally filter out particles larger than 70 nanometers in size.

In a different experiment, the team found that sapwood is unable to filter out particles 20 nanometers in size or smaller, suggesting there is a limit to the size of particles that sapwood can naturally filter.

In a third experiment, the team used inactivated E. coli-contaminated water. After pouring the contaminated water through the sapwood filtration system, the team examined the xylem under a fluorescent microscope, noticing that the bacteria had accumulated around pit membranes within the first few millimeters of the wood. Through their calculations, they determined that the sapwood was able to filter out more than 99 percent of the E. coli from the water.

Based on these findings, Karnik believes that sapwood can filter out most types of bacteria – the smallest bacteria measure around 200 nanometers. He said it is unlikely, however, that the filtration technique can trap most viruses, as these are much smaller in size.
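The size cutoffs reported in the experiments can be summarized in a short sketch. This is purely an illustration of the study's numbers (particles larger than 70 nanometers were trapped, 20-nanometer particles passed through, and the range in between was not resolved); the function and constant names are assumptions for illustration, not code from the researchers.

```python
# Illustrative summary of the reported sapwood filtration results:
# particles above 70 nm were trapped, particles of 20 nm or smaller
# passed through, and behavior between those sizes was not determined.

TRAP_CUTOFF_NM = 70   # particles larger than this were filtered out
PASS_CUTOFF_NM = 20   # particles this size or smaller passed through

def sapwood_traps(particle_nm):
    """Return True if trapped, False if passed, None if in the untested range."""
    if particle_nm > TRAP_CUTOFF_NM:
        return True
    if particle_nm <= PASS_CUTOFF_NM:
        return False
    return None  # 20-70 nm: not resolved by the experiments

# The smallest bacteria (~200 nm) and E. coli fall well above the cutoff
# and are trapped; most viruses, being far smaller, would slip through.
```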

He said his team now plans to experiment with other types of sapwood to see if nature’s filtration system exists in other trees as well. Flowering trees typically have smaller pores than coniferous trees, suggesting they may be able to filter out smaller bacteria and possibly viruses. However, the vessels in flowering trees are much longer, which could make them impractical as filtration systems.

Another key issue is that the sapwood would need to remain damp in order to be used effectively. Once the sapwood dries, it cracks and cannot properly filter contaminants from the water.

“There’s huge variation between plants,” Karnik said in a statement. “There could be much better plants out there that are suitable for this process. Ideally, a filter would be a thin slice of wood you could use for a few days, then throw it away and replace at almost no cost. It’s orders of magnitude cheaper than the high-end membranes on the market today.”

Karnik’s research was funded by the James H. Ferry Jr. Fund for Innovation in Research Education. His coauthors include Michael Boutilier and Jongho Lee from MIT, Valerie Chambers from Fletcher-Maynard Academy in Cambridge, Mass., and Varsha Venkatesh from Jericho High School in Jericho, N.Y.

Images Below: (LEFT) A false-color electron microscope image showing E. coli bacteria (green) trapped over xylem pit membranes (red and blue) in the sapwood after filtration.(RIGHT) Researchers design a simple filter by peeling the bark off a small section of white pine, then inserting and securing it within plastic tubing. Credits: R. Karnik/M. Boutilier/J. Lee/V. Chambers/V. Venkatesh

Dentists May One Day Utilize Genetic Technique For Dental Care

Lee Rannals for redOrbit.com – Your Universe Online

Researchers from the University of Adelaide’s School of Dentistry say a visit to the dentist could eventually require a detailed look at a patient’s genes.

The team wrote in the Australian Dental Journal that one day dentists may have to look at a patient’s genes to determine which ones are being switched on and off. The researchers believe that this field of epigenetics will have a big role to play in the future of dental hygiene.

“Our genetic code, or DNA, is like an orchestra – it contains all of the elements we need to function – but the epigenetic code is essentially the conductor, telling which instruments to play or stay silent, or how to respond at any given moment,” co-author Associate Professor Toby Hughes said in a statement.

He said in terms of oral health, epigenetic factors help orchestrate healthy and unhealthy states in our mouths.

“They respond to the current local environment, such as the type and level of our oral microbes, regulating which of our genes are active. This means we could use them to determine an individual’s state of health, or even influence how their genes behave. We can’t change the underlying genetic code, but we may be able to change when genes are switched on and off,” Hughes said.

He and colleagues at the university have been studying the underlying genetic and environmental influences on dental hygiene. Hughes says that since the completion of the Human Genome Project in 2003, epigenetics has had an increasing role in biological and medical research.

“Dentistry can also greatly benefit from new research in this area,” Hughes said in a statement. “It could open up a range of opportunities for diagnosis, treatment and prevention.”

Scientists already know that the genome plays a role in dental development, as well as in a range of other oral diseases. They also understand that the oral microbiota plays a key role in the state of oral health.

“We now have the potential to develop an epigenetic profile of a patient, and use all three of these factors to provide a more personalized level of care,” says Hughes. “Other potential oral health targets for the study of epigenetics include the inflammation and immune responses that lead to periodontitis, which can cause tooth loss; and the development and progression of oral cancers.”

Eventually, the researchers say this technique could mean dentists will be able to create a “personalized medicine” approach to managing common oral diseases in patients.

“What’s most exciting is the possibility of screening for many of these potential oral health problems from an early age so that we can prevent them or reduce their impact,” Hughes said.

New Technology May Allow For ATM Transactions Using Google Glass

Peter Suciu for redOrbit.com – Your Universe Online
Researchers at the Max Planck Institute for Informatics at Saarland University in Germany have developed a prototype system that utilizes Google Glass to access ATMs.
Currently the wearable display technology from the search giant allows users to check a calendar with a glance, take a photo with a wink, or simply read text messages. Now German university researchers, among the few teams in Europe that have been able to push the possibilities of Google Glass to the next level, think it could also help with banking.
Google Glass, which is still in the prototype stage, is little more than a futuristic looking pair of glasses that has a camera and mini-computer installed. It is able to record what the wearer sees while also sending information to the user’s field of vision via a glass prism.
The technology has been in the news in recent months, notably in January when the first case of a driver being cited for using the technology was thrown out of court in California. In that particular case the judge threw out the case for lack of evidence as to the issue of speeding – not that the accused may have been wearing the glasses.
While the legality of using the technology is still being defined, the German researchers have created a method whereby Google Glass could be used at an ATM – possibly replacing an ATM card. Dominique Schröder, assistant professor of Cryptographic Algorithms at Saarland University, found that at the proper distance – about two and a half meters – Google Glass could provide a key to encrypt a one-time personal identification number (PIN).
In addition it could provide a “digital signature,” a digital counterpart of the conventional signature.
In this case, the result shows up on the screen as a black-and-white pattern, a so-called QR code, in which the PIN is hidden and visible only to the identified wearer of the glasses. Google Glass decrypts the code and displays the PIN only in the wearer’s field of vision.
“Although the process occurs in public, nobody is able to spy on the PIN,” Schröder said in a statement.
Where this differs from other encryption schemes is that the PIN can only be seen by the Google Glass wearer; Schröder added that this would not be the case if the PIN were sent to a smartphone, since someone could glance at the screen.
Moreover, this becomes a one-time code. Anyone seeing it being entered could not use it again, since the PIN is re-generated each time the customer accesses the cash machine.
The researchers further argue that an attacker also wearing Google Glass would not be able to spy on the process either. This, they contend, is because the digital signature guarantees that no assailant can intrude between the customer and the cash machine, as happens during so-called ‘skimming.’
According to the researchers, only the authorized customer can use his secret key to decrypt what was encrypted with the public key.
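The scheme described above – a one-time PIN encrypted with the customer's public key so that only the holder of the matching secret key can read it – can be sketched with textbook RSA. This is a minimal toy illustration of the idea, not the researchers' Ubic system; the tiny key, the function names, and the PIN range are all assumptions, and a real deployment would use full-size keys with proper padding.

```python
import secrets

# Toy textbook-RSA keypair (far too small for real use): n = 61 * 53 = 3233,
# with public exponent e = 17 and matching secret exponent d = 2753.
N, E, D = 3233, 17, 2753

def new_one_time_pin():
    """The cash machine generates a fresh PIN for this transaction only.

    Kept below N so the toy encryption is reversible."""
    return secrets.randbelow(N)

def encrypt_for_glass(pin):
    """The ATM encrypts the PIN with the wearer's public key; in the real
    system this ciphertext would be rendered on screen as a QR code."""
    return pow(pin, E, N)

def decrypt_on_glass(ciphertext):
    """Only the Glass, holding the secret exponent D, can recover the PIN
    and display it privately in the wearer's field of vision."""
    return pow(ciphertext, D, N)

pin = new_one_time_pin()
qr_payload = encrypt_for_glass(pin)
assert decrypt_on_glass(qr_payload) == pin  # round-trip succeeds
```

Because each transaction gets a fresh PIN, an onlooker who sees the code entered gains nothing, matching the one-time-code property described above.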
“The nice thing about a head mounted display is no one other than you can see the display – assuming it isn’t hacked and someone is viewing it remotely,” Rob Enderle, principal analyst at the Enderle Group, told redOrbit. “However, if you use a single-use PIN properly it is only good one time and it is tied to your device, so the value of that little bit of extra security is negligible.
“In addition Google has been particularly bad with privacy, which speaks to challenge questions for password resets, and Android represents what may be the biggest security exposure currently in market so any security advantage is likely more than offset with the platform’s security exposures,” Enderle added. “So, in concept, using a product like Google Glass for multi-factor security would be an improvement; in practice, given how poorly Google does security, it would be a foolish idea with Google Glass.”
“It’s another case of a solution looking for a problem,” added Roger Entner, principal analyst at Recon Analytics.
One particular issue, as noted, is whether this just creates another system that could be hacked.
“Absolutely,” Entner told redOrbit. “What happens if you lose your Google Glasses? It’s just a self-perpetuating problem.”
It is of course important to note that Google Glass isn’t the primary security method in place.
“They are using it as a secondary security measure to receive a unique code that changes each time,” said Jim McGregor, founder and principal analyst at TIRIAS Research. “However, if someone has access to your card, PIN, and Google Glass, then they should be able to get to your money. This does not eliminate the threat, but it does add another level of security against prying eyes. The best solution is still biometrics, but nothing is completely secure. The real question is: would consumers accept the added security with the added complexity or inconvenience of completing the transaction?”
Instead, Google Glass could add an additional layer of security.
“This is exactly what they are proposing. If Google Glass was cheap and as common as a mobile handset, I would say that this has potential,” McGregor told redOrbit. “However, it does demonstrate the flexibility of these new computing form factors and platforms, and the innovation around them that we will eventually see. For example, suppose that a version of Google Glass has a sensor just tracking the eye. Then you could use Google Glass for real-time augmented reality as you look around or possibly even use the image of your retina as a biometric key.”
There is also the issue of whether Google Glass could in fact put wearers at risk.
Last week two women and a man reportedly attacked a tech writer who was wearing Google Glass. While she recovered the glasses, her purse, containing her wallet and cell phone, was reportedly not recovered. It wasn’t clear if her wearing of Google Glass had prompted the attack.
While the legality is still being determined, Google recently laid out a set of social guidelines that its Glass “Explorer” community should follow, asking them to not be “glassholes.”
“Respect others and if they have questions about Glass don’t get snappy. Be polite and explain what Glass does and remember, a quick demo can go a long way. In places where cell phone cameras aren’t allowed, the same rules will apply to Glass,” Google said in an official blog post. “If you’re asked to turn your phone off, turn Glass off as well. Breaking the rules or being rude will not get businesses excited about Glass and will ruin it for other Explorers. Glass is a piece of technology, so use common sense. Water skiing, bull riding or cage fighting with Glass are probably not good ideas.”
While that could all fall into the category of common sense, this technology could also have advantages for sharing corporate data. Already a large electric company has asked the computer scientists in Saarbrücken to explore future uses for the technology.
“This could be interesting, for example, for large companies or agencies that are collecting information in one document, but do not want to show all parts to everybody,” added Mark Simkin, who was one of the developers of Ubic.
Google Glass is expected to enter the American market this year.

Discarded Hop Leaves Have Substances That Could Fight Dental Diseases

American Chemical Society

Beer drinkers know that hops are what gives the drink its bitterness and aroma. Recently, scientists reported that the part of hops that isn’t used for making beer contains healthful antioxidants and could be used to battle cavities and gum disease. In a new study in ACS’ Journal of Agricultural and Food Chemistry, they say that they’ve identified some of the substances that could be responsible for these healthful effects.

Yoshihisa Tanaka and colleagues note that their earlier research found that antioxidant polyphenols, contained in the hop leaves (called bracts) could help fight cavities and gum disease. Extracts from bracts stopped the bacteria responsible for these dental conditions from being able to stick to surfaces and prevented the release of some bacterial toxins. Every year, farmers harvest about 2,300 tons of hops in the United States, but the bracts are not used for making beer and are discarded. Thus, there is potentially a large amount of bracts that could be repurposed for dental applications. But very few of the potentially hundreds of compounds in the bracts have been reported. Tanaka’s group decided to investigate what substances in these leaves might cause those healthful effects.

Using a laboratory technique called chromatography, they found three new compounds, one already-known compound that was identified for the first time in plants and 20 already-known compounds that were found for the first time in hops. The bracts also contained substantial amounts of proanthocyanidins, which are healthful antioxidants.

Magma Chamber Of Galapagos’ Sierra Negra Volcano Imaged In 3D

April Flowers for redOrbit.com – Your Universe Online

Some of the most active volcanoes in the world can be found in the Galapagos Islands. These volcanoes are responsible for more than 50 eruptions in the last 200 years. Even with such activity, scientists knew far more about the history of finches, tortoises and iguanas on the islands than the volcanoes upon which these unusual fauna had evolved.

A new study from the University of Rochester, published in the Journal of Geophysical Research: Solid Earth, is providing researchers with a more complete picture of the “plumbing system” that feeds the Galapagos volcanoes. The findings are also illustrating a difference between the Galapagos and another Pacific Island chain—the Hawaiian Islands.

“With a better understanding of what’s beneath the volcanoes, we’ll now be able to more accurately measure underground activity,” said Cynthia Ebinger, a professor of earth and environmental sciences. “That should help us better anticipate earthquakes and eruptions, and mitigate the hazards associated with them.”

Ebinger collaborated with Mario Ruiz from the Instituto Geofisico Escuela Politecnica Nacional in Quito, Ecuador, and others to bury 15 seismometers around Sierra Negra, the largest and most active volcano in the Galapagos Islands. Using these seismometers, the team measured the velocity and direction of sound waves generated by earthquakes as they traveled beneath the volcano. Understanding that the temperature and types of material that sound waves pass through change the behavior of those waves allowed the team to construct a 3D image of the plumbing system beneath the volcano. The team used a technique similar to a CAT-scan.

Just over three miles below the surface is the beginning of a large magma chamber that lies partially within old oceanic crust that had been buried by more than five miles of eruptive rock layers. The oceanic crust appears to have a thick underplating of rock that would have formed as magma became trapped beneath the crust and cooled. This is very similar to the processes that occur under the Hawaiian Islands.

The Galapagos has something else in common with the Hawaiian chain, as well. The data collected by the team suggests the presence of a large chamber filled with crystal-mush magma. Crystal-mush magma is cooled magma that includes crystallized minerals.

In a process very similar to how the Hawaiian Islands formed, the Galapagos Islands were created from a hotspot of magma located in an oceanic plate called Nazca, about 600 miles off the coast of Ecuador. The islands were formed as magma rose from the hotspot and eventually hardened. The Nazca plate inched westward, forming new islands in the same manner, creating the present-day Galapagos Archipelago.

Ebinger discovered a major difference between the two island chains during her study.

In the Hawaiian Islands, the major volcanoes are dormant because they have moved away from the hotspot that provided their source of magma. The Galapagos volcanoes, by contrast, are still connected to the same plumbing system, Ebinger’s team found. Using satellite imagery of the volcanoes, the team noticed that as magma would sink in one volcano it would rise in another. This suggests that some of the youngest volcanoes had magma connections, even if those connections were temporary.

“Not only do we have a better understanding of the physical properties of Sierra Negra,” said Ebinger, “we have increased our knowledge of island volcano systems, in general.”

Image Below: This illustration shows the plumbing system beneath the Sierra Negra volcano. Credit: Cynthia Ebinger, University of Rochester

New Process Recycles Milk Jugs Into 3D Printer Filament

redOrbit Staff & Wire Reports – Your Universe Online

Not only is manufacturing goods using a 3D printer far cheaper than purchasing items, new research appearing in a recent edition of the Journal of Cleaner Production reveals that it can actually help preserve the environment.

The 3D printing process was very expensive when Charles W. Hull of 3D Systems Corp created the first working model in 1984, and while the costs have dropped dramatically over the past 30 years, the cost of purchasing plastic filament still needs to be factored in. The new study, however, shows how old milk jugs can reduce those expenses.

In their study, Michigan Technological University associate professor of materials science and engineering/electrical and computer engineering Joshua Pearce and his colleagues demonstrated that using milk jugs made from HDPE plastic to create the 3D printer filament actually uses less energy than conventional recycling of the beverage containers.

The milk jug was cleaned, cut into pieces and run through an office shredder. It was then sent through a device known as a RecycleBot, which turns waste plastic into 3D printer filament. When compared to urban recycling programs that collect and process plastic locally, the RecycleBot process required about three percent less power.

“Where it really shows substantial savings is in smaller towns like Houghton, where you have to transport the plastic to be collected, then again to be recycled, and a third time to be made into products,” Pearce said.

Under those circumstances, the energy savings soared to between 70 and 80 percent, and recycling your own milk jugs also uses 90 percent less energy than making virgin plastic from petroleum, the researchers noted.

In terms of cost, Pearce said that filament retails for between $36 and $50 per kilogram. By using recycled plastic, a person can produce homemade filament for 10 cents per kilogram. Even factoring in the roughly $300 cost of the RecycleBot, he said that there was “a clear incentive” to produce filament using recycled plastic containers.
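The payback arithmetic behind that “clear incentive” can be checked directly. A short sketch using only the figures reported here (retail filament at $36–50 per kilogram, homemade filament at roughly $0.10 per kilogram, a RecycleBot at about $300); the variable names are illustrative.

```python
# Figures reported in the article (all in US dollars).
retail_low, retail_high = 36.0, 50.0   # $/kg, commercial filament
homemade = 0.10                        # $/kg, filament from recycled milk jugs
recyclebot = 300.0                     # approximate up-front hardware cost

# Savings per kilogram of filament made from recycled HDPE.
savings_low = retail_low - homemade    # vs. cheapest retail filament
savings_high = retail_high - homemade  # vs. priciest retail filament

# Kilograms of filament needed before the RecycleBot pays for itself.
breakeven_most = recyclebot / savings_low    # ~8.4 kg at the low retail price
breakeven_least = recyclebot / savings_high  # ~6.0 kg at the high retail price

print(f"RecycleBot pays for itself after roughly "
      f"{breakeven_least:.1f} to {breakeven_most:.1f} kg of filament")
```

In other words, a home user recovers the hardware cost after printing well under ten kilograms of filament, which is consistent with the incentive Pearce describes.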

However, the study authors also report that the HDPE plastic used in milk jugs is not ideal as a 3D printer filament component, as it shrinks slightly as it cools. Even so, this technology has reportedly drawn interest from the Ethical Filament Foundation, an organization seeking an environmentally friendly and ethically produced alternative production method to keep up with the demands of the expanding 3D Printing market.

“In the developing world, it’s hard to get filament, and if these recyclers could make it and sell it for, say, $15 a kilogram, they’d make enough money to pull themselves out of poverty while doing the world a lot of good,” said Pearce, who was the corresponding author and worked alongside Michigan Tech colleagues Megan Kreiger, Meredith Mulder, and Alexandra Glover on the Journal of Cleaner Production paper.

Currently, 3D printing is used in the fields of architecture, construction, industrial design, engineering, medical technology and even the fashion industry. Last month, NASA announced that it would be launching several formal programs to prototype new tools for future missions using the increasingly popular new manufacturing technique.

An Open Access version of the research paper is available here.

Study Explores Cocaine And The Pleasure Principle

Julie Cohen, University of California – Santa Barbara
Researchers use animal models to demonstrate that the net result of cocaine use is a balance of both positive and negative effects

On the other side of the cocaine high is the cocaine crash, and understanding how one follows the other can provide insight into the physiological effects of drug abuse. For decades, brain research has focused on the pleasurable effects of cocaine largely by studying the dopamine pathway. But this approach has left many questions unanswered.
So the Behavioral Pharmacology Laboratory (BPL) at UC Santa Barbara decided to take a different approach by examining the motivational systems that induce an animal to seek cocaine in the first place. Their findings appear in today’s issue of The Journal of Neuroscience.
“We weren’t looking at pleasure; we were looking at the animal’s desire to seek that pleasure, which we believe is the key to understanding drug abuse,” said Aaron Ettenberg, a professor in UCSB’s Department of Psychological and Brain Sciences who established the BPL in 1982. The lab has been particularly active in the development and use of novel behavioral assays that provide a unique approach to the study of drug-behavior interactions.
The findings suggest that the same neural mechanism responsible for the negative effects of cocaine likely contributes to the animal’s decision to ingest cocaine. “Just looking at the positive is looking at only half the picture; you have to understand the negative side as well,” said Ettenberg.
“It’s not just the positive, rewarding effects of cocaine that drive this desire to seek the drug,” he said. “It’s the net reward, which takes into account the negative consequences in addition to the positive. Together the two determine the net positive output that will lead to the motivated behavior.”
Ettenberg’s team chose to study norepinephrine (also called noradrenaline), because cocaine is known to act upon this primary neurotransmitter. The researchers chose two places in the brain — the bed nucleus of the stria terminalis (BNST) and the central nucleus of the amygdala (CeA) — because both have been implicated in the aversive effects of such emotional processes as fear conditioning and general anxiety. Norepinephrine is a major transmitter in these two brain systems and plays a part in regulating anxiety.
Lead author Jennifer Wenzel chose a unique way to reproduce the results of previous work she had done at UCSB, where she earned her Ph.D. in 2013. An earlier study used reversible lesions in the BNST and CeA to block the function in these two areas and then examined their effects in a unique animal model of cocaine self-administration.
For that study, the investigators trained rats to run down a custom-built 6-foot-long runway for a daily dose of cocaine. Each day they responded more quickly than the last, demonstrating an increasing motivation to get cocaine.
“Over several trials, however, rats developed an ambivalence about entering the goal box: they rapidly approached the goal but then turned and retreated back toward the start box,” Wenzel explained. “These retreats can happen several times before rats finally enter the goal box and receive an injection of cocaine.”
This retreat behavior became more and more prevalent as testing continued and reflects the animals’ learning that negative effects (the crash) follow the positive effects (euphoria) of cocaine. Blocking the function of the BNST or the CeA resulted in a dramatic decrease in retreat behavior because the negative effects of the drug were blocked.
In the newly published paper, the researchers used drugs that selectively block the action of norepinephrine in the BNST and CeA, rather than shutting down those regions entirely. The results were similar to those in the earlier study. “If you put norepinephrine antagonists directly into the BNST or the CeA, you can prevent or dramatically attenuate the negative effects of cocaine, leaving the positive effects intact,” Ettenberg explained. “So the animals show fewer retreats in the runway.”
The study looked at acute cocaine use with only one injection a day, which is not considered a model of addiction. So the natural extension of this paper’s line of inquiry is how the positive and negative systems associated with cocaine use change when animals are exposed to multiple doses in any given day (i.e. addiction). Subsequent studies have demonstrated that as the animals become addicted to the drug, the positive consequences get reduced and negative effects get exaggerated so the net experience is less positive. To overcome the decreased positive effects, users increase the dose, which creates a behavioral spiral.
“We need to more fully understand the underlying neuronal mechanisms altered by cocaine before we can treat people,” Ettenberg said. “Once we understand how the brain systems producing the positive/euphoric and negative/anxiety effects of the drug interact, we might be able to produce treatments that address the balance between these two opposing actions, both of which serve as strong driving forces. We therefore need to understand both of these systems in order to come up with a rational treatment down the line.”

New Program Preparing Students With Autism For Life After High School

The University of North Carolina at Chapel Hill
An innovative program from UNC’s Frank Porter Graham Child Development Institute (FPG) and 6 partner universities is preparing students with autism for life after high school.
“Public high schools may be one of the last best hopes for adolescents with autism—and for their families,” said FPG director Samuel L. Odom. “Many of these students will face unemployment and few social ties after school ends.”
According to Odom, teachers and other professionals in the schools work hard to achieve beneficial results for students with autism spectrum disorders (ASD). But positive outcomes remain elusive, given the scarcity of specific programs in high schools designed to help adolescents with ASD.
To fill this gap, Odom and other scientists formed the Center on Secondary Education for Students with Autism Spectrum Disorders (CSESA).
“We developed our approach from research in several fields,” said Odom, CSESA’s principal investigator. “Because of the complex educational needs of many students with ASD, it was important to develop a comprehensive program for high schools.”
CSESA focuses on understanding emotions, developing friendships, and social problem-solving. Early results at a high school in the Raleigh-Durham, N.C., area show that student groups designed to bring together adolescents with and without ASD have helped them engage with one another more often.
“Even a simple hallway ‘hello’ between students with autism and their peers is more likely now,” said Kara Hume, CSESA’s project director and co-principal investigator.
CSESA also addresses literacy skills, which can be limited in many students with ASD. At Myers Park High School (MPHS) in Charlotte, N.C., the program helped with Christopher Stickell’s inclusion in an English class.
“Not only did my son have access to a wider world than his self-contained classroom, but the students in the English class had some of their pre-conceived notions about autism shattered,” said Lois Stickell. “Many were surprised when Chris read aloud a passage from Julius Caesar.”
“We help develop basic high school survival skills,” Hume said, adding that another cornerstone of the program is its emphasis on promoting responsibility, independence, and self-management.
According to Odom, many teens with ASD continue to live with their parents after high school. “Not surprisingly, parents worry about the future as they anticipate their child’s transition out of the public schools,” he said.
“CSESA has provided opportunities for greater collaboration and relationship building with the families who have attended ‘Transitioning Together’ sessions,” said Phyllis Alston, the exceptional children teacher for compliance at MPHS. Each week, CSESA staff and school district personnel lead these discussion groups with families.
“We became aware of resources available that without CSESA we may not have been made aware of,” said Faith Hamilton, whose teenager will be attending Central Piedmont Community College in the fall to study photography. “My son gained confidence and his grades improved this year.”
According to Odom, the CSESA program’s design includes features that help schools put it in place quickly and successfully, such as “autism teams,” which spearhead efforts within the schools. The program also uses coaching to provide feedback on new practices as teachers implement them.
Although comprehensive programs like this one may take 5 to 7 years to put into place, Odom said schools in N.C. and 5 other states began using CSESA’s approach within weeks, and he projects that they will be administering the program on their own within 2 ½ years.
“CSESA will expand to 60 more schools over the next 3 years,” he added. “We hope a lot more students with autism spectrum disorders will be able to leave high school better prepared for the challenges they’ll face.”

WHO Says We Should Halve Our Daily Sugar Intake

Lee Rannals for redOrbit.com – Your Universe Online
The World Health Organization (WHO) is recommending that people cut their sugar intake in half in order to combat the obesity epidemic.
WHO’s 2002 recommendation stated that sugars should make up less than 10 percent of total energy intake per day. However, the organization’s latest recommendation says that sugars should account for less than 5 percent of total energy intake, or about six teaspoons.
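The "about six teaspoons" figure follows from simple arithmetic. The sketch below uses our own assumed reference values, not WHO's published derivation: a 2,000 kcal reference diet, 4 kcal per gram of sugar, and roughly 4.2 grams per teaspoon of granulated sugar.

```python
# Convert "X% of daily energy intake" into grams and teaspoons of sugar.
# Assumed reference values (not from the WHO statement itself):
KCAL_PER_GRAM_SUGAR = 4      # standard energy density of carbohydrate
GRAMS_PER_TEASPOON = 4.2     # common approximation for granulated sugar

def sugar_limit(daily_kcal, fraction):
    """Return (grams, teaspoons) of sugar allowed at the given energy fraction."""
    kcal_from_sugar = daily_kcal * fraction
    grams = kcal_from_sugar / KCAL_PER_GRAM_SUGAR
    teaspoons = grams / GRAMS_PER_TEASPOON
    return grams, teaspoons

grams, tsp = sugar_limit(2000, 0.05)   # 2,000 kcal diet, new 5% limit
print(round(grams), round(tsp))        # 25 g, about 6 teaspoons
```

The same function applied with `fraction=0.10` reproduces the older 10 percent guideline, about 50 grams or 12 teaspoons on a 2,000 kcal diet.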
“Much of the sugars consumed today are ‘hidden’ in processed foods that are not usually seen as sweets. For example, 1 tablespoon of ketchup contains around 4 grams (around 1 teaspoon) of sugars. A single can of sugar-sweetened soda contains up to 40 grams (around 10 teaspoons) of sugar,” WHO said in a statement.
The new guidelines are based on analyses of published scientific studies on the consumption of sugars and how that relates to excess weight gain and tooth decay in adults and children. The proposed limits apply to all monosaccharides and disaccharides that are added to food by the manufacturer, cook or consumer. They also apply to natural sugars found in honey, syrups, fruit juices and fruit concentrates.
“Obesity now affects half a billion people in the world, and it is on the rise in all age groups and particularly in low- and middle-income countries,” Francesco Branca, WHO’s director of nutrition for health and development, told Reuters.
The organization said there is increasing concern that consumption of free sugars may both displace foods containing more nutritionally adequate calories and increase total caloric intake, which could lead to an unhealthy diet, weight gain and an increased risk of noncommunicable diseases (NCDs).
Free sugars also play a role in dental disease, the treatment of which consumes between 5 and 10 percent of health budgets in industrialized countries.
“The objective of this guideline is to provide recommendations on the consumption of free sugars to reduce the risk of NCDs in adults and children, with a particular focus on the prevention and control of weight gain and dental caries,” WHO said in a statement.
The organization said when the guidelines are finalized, program managers and policy planners should assess current intake of free sugars relative to a benchmark and develop measures to decrease intake of free sugars through public health interventions.
WHO’s announcement comes a day after Britain’s chief medical officer suggested a sugar tax be set into place in order to curb obesity rates, according to BBC News. Sally Davies told a health select committee this week that research has shown that sugar is addictive, and one day a sugar tax may need to be introduced.
“We have a generation of children who, because they’re overweight and their lack of activity, may well not live as long as my generation,” Davies told the committee. “We may need to move towards some form of sugar tax, but I hope we don’t have to.”

Computer Reads Text Written Into The Air

Karlsruhe Institute of Technology
In the future, computers and humans will cooperate more seamlessly, whether through easier access to data or through the intuitive control of programs and robots. At the CeBIT, the latest innovations in this area will be presented by the Karlsruhe Institute of Technology and the FZI Research Center for Information Technology (hall 9, stand D13). The exhibits range from gesture-controlled communication to firewalls, data management and computer-supported surgery.
Writing without Keyboard: Handwriting Recognition Based on the Hand’s Movement
Writing into the air instead of typing text messages on a mobile phone’s tiny keyboard? This can be done with a sensor wristband that records hand movements, which a computer system then translates into text. KIT’s novel airwriting system uses gestures as input and is particularly suited to mobile communication devices and so-called wearable computing applications.
The airwriting system made in Karlsruhe may be applied in future mixed-reality applications. In combination with smart glasses, i.e. glasses displaying information in the user’s field of view, the airwriting wristband can be used to input commands and texts by gesture without holding a mobile device in the hand. The prototype airwriting system is showcased at the CeBIT stand. In the course of the Future Talks lecture series (hall 9, stand F99), the developers Tanja Schultz and Christoph Amma will present the airwriting system on Thursday, March 13, 2014, from 13:00 to 13:30.
Surgeons Feel and See via Operation Robots
Within OP:Sense, KIT develops methods for future robot-supported surgery. The system focuses on supporting the surgeon and relieving strain, providing novel options for interactive control and sensor feedback. OP:Sense is a modular platform for studying novel methods for the safe and precise execution of robot-supported operations.
OP:Sense consists of two robot arms controlled by the surgeon via haptic input devices and several 3D cameras monitoring the working space around the field of operation. Based on this scene monitoring, new safety concepts are being developed for close human-robot cooperation in the operating theater. On this basis, further research is conducted, in particular in the area of situation recognition. At the CeBIT, the system will be presented live.
Enhanced Security by Combination of Several Firewalls
Firewalls provide protection against attacks from the internet. Figuratively speaking, they filter “harmful” data packets out of the incoming data flow and transmit only “benign” packets. Security gaps, however, cannot be excluded completely: sometimes the firewalls themselves cannot be trusted entirely, or built-in loopholes are exploited by malicious attackers.
In a collaboration between the Karlsruhe Institute of Technology (KIT) and the FZI Research Center for Information Technology, the Competence Center of Applied Security Technology KASTEL developed a concept for the secure combination of network firewalls: A specialized hardware module securely implements the combination of several firewalls. The security of this approach has been proven in a formal model. A working prototype as well as an illustrating model are presented at CeBIT.
Participatory Collection of Data by Smartphone
Modern smartphones and their built-in positioning and activity monitoring sensors can simplify complex data collection processes significantly. This concept is known as participatory sensing.
At the CeBIT, FZI scientists will demonstrate the potential of participatory sensing with three applications: 1) A freely configurable platform allows local governments to record danger spots and damage to local infrastructure with the help of their citizens; in Karlsruhe, the corresponding application, called KA-Feedback, is already in use. 2) Another prototype is designed for leisure parks to measure flows of visitors and directly recommend rides that are less frequented at the moment. 3) The third prototype, from Disy Informationssysteme GmbH, a spinoff of FZI and KIT, shows how ideas of participatory sensing can be merged with the acquisition of complex geodata on tablets to facilitate the field work of public administration.
More information on FZI’s real-time data processing research.
Predictive Data Analytics: Decision Models Based on Large Data Volumes
The volume of business and market data available to companies is increasing steadily. At the CeBIT, scientists of FZI will demonstrate how large data volumes can be evaluated specifically by means of predictive data analytics.
With various applications being used as examples, latest analytics solutions for decision-making will be presented. For instance, data analytics can be used to optimize computing centers or to predict key figures for business management. Data analytics can also be used to carry out targeted marketing campaigns. This idea will be presented at the stand by PriceNow, a spinoff of FZI and KIT.
More information on FZI’s research into big data and service science.
Stand of the Federal Ministry for Economic Affairs and Energy: Security in the Mobile Cloud (hall 9, stand E24)
Theft-proof processing of company data in a cloud? This is ensured by the IT modules developed by CAS Software AG, KIT, and WIBU-SYSTEMS AG under the “MimoSecco” project: the model is based on a smart separation of the three zones of use, processing, and storage, as well as on encryption and fragmentation of the data inventories. By means of a database adapter, the data are encrypted and stored in the cloud in a distributed manner. The database adapter uses a hardware token as a security module for encrypting and decrypting. Upon the query of an authorized user, only the data needed for processing are available in decrypted form, and only for a short time. Moreover, data accesses can be made dependent on the context of the user (place, time, etc.). The consortium will present the “MimoSecco” results at the stand of the Federal Ministry for Economic Affairs and Energy, which funds the project (hall 9, stand E24). Demonstrators will illustrate the process, with the data of a solar farm being used as an example.
Information on KIT’s participation in the CeBIT can also be found at: http://www.pkm.kit.edu/english/cebit2014.php
Information on FZI’s participation in the CeBIT can be found at: http://url.fzi.de/cebit

Detecting Software Errors Using Genetic Algorithms

Saarland University

This news release is available in German.

According to a recent study from the University of Cambridge, software developers spend about half of their time detecting and resolving errors. Projected onto the global software industry, the study estimates, this amounts to a bill of about 312 billion US dollars every year. “Of course, automated testing is cheaper,” explains Andreas Zeller, professor of Software Engineering at Saarland University, since a program can be run a thousand times without incurring any charges. “But where do these necessary test cases come from?” asks Zeller. “Generating them automatically is tough, but thinking of them yourself is even tougher.”

In cooperation with the computer scientists Nikolas Havrikov and Matthias Höschele, he has now developed the software system “XMLMATE.” It generates test cases automatically and uses them to test the given program code automatically. What is special about it is that the only requirement the program under test must meet is that its input be structured in a certain way, since the researchers use that structure to generate the initial set of test cases. These are fed to the genetic algorithm on which the testing is based. The algorithm works similarly to biological evolution, with the inputs acting as chromosomes: only inputs that cover a significant amount of code that has not been executed yet survive. Havrikov, who implemented XMLMATE, explains the strategy: “It is not easy to detect a real error, and the more code we are covering, the more certain we can be that further errors will not occur.” “As we use the real, existing input interface, we make sure that there are no false alarms: every error found can also happen during the execution of the program,” adds Zeller.
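The coverage-driven survival rule can be sketched in a few lines. This is an illustrative toy, not XMLMATE itself (the real tool is written in Java and evolves schema-derived XML inputs); the target program and its branch IDs here are invented for the demonstration.

```python
import random

def program_under_test(x):
    """Toy target: returns the set of branch IDs the input exercises."""
    covered = {"entry"}
    if x % 2 == 0:
        covered.add("even")
        if x > 100:
            covered.add("big-even")   # a harder-to-reach branch
    else:
        covered.add("odd")
    return covered

def evolve(generations=50, pop_size=20, seed=0):
    """Coverage-guided evolution: inputs covering new code survive and mutate."""
    rng = random.Random(seed)
    population = [rng.randint(0, 10) for _ in range(pop_size)]
    total_coverage = set()
    for _ in range(generations):
        survivors = []
        for x in population:
            newly_covered = program_under_test(x) - total_coverage
            if newly_covered:          # fitness: executes not-yet-covered code
                total_coverage |= newly_covered
                survivors.append(x)
        parents = survivors or population[:2]   # keep searching even with no gain
        # Next generation: mutated copies of the surviving inputs.
        population = [rng.choice(parents) + rng.randint(-50, 50)
                      for _ in range(pop_size)]
    return total_coverage

print(sorted(evolve()))
```

Inputs that only re-execute already-covered code are discarded, while mutations of the survivors gradually push the population toward branches (like `big-even`) that random inputs rarely reach.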

The researchers have unleashed their software on open source programs users are already working with in daily life. With their program they detected almost twice as many fatal errors as similar test methods that only work with randomly generated input. “But the best thing is that we are completely independent from the application area. With our framework, we are not only able to test computer networks, the processing of datasets, websites or operating systems, but we can also examine software for sensors in cars,” says Zeller.

The computer scientists in Saarbrücken developed XMLMATE in the Java programming language. The input for the software under test is described in the markup language XML, so the existence of an XML schema is helpful. Since XML is standardized and serves as a kind of lingua franca among input formats, most program inputs fit XMLMATE directly; if not, they can quickly be converted with the corresponding tools.

Powerful 3D Spectrograph Successfully Installed On ESO’s Very Large Telescope

[ Watch The Video: MUSE Views the Unusual Galaxy NGC 4650A ]

ESO

A new innovative instrument called MUSE (Multi Unit Spectroscopic Explorer) has been successfully installed on ESO’s Very Large Telescope (VLT) at the Paranal Observatory in northern Chile. MUSE has observed distant galaxies, bright stars and other test targets during the first period of very successful observations.

Following testing and preliminary acceptance in Europe in September 2013, MUSE was shipped to ESO’s Paranal Observatory in Chile. It was reassembled at the base camp before being carefully transported to its new home at the VLT, where it is now installed on Unit Telescope 4. MUSE is the latest of the second generation instruments for the VLT (the first two were X-shooter and KMOS and the next, SPHERE, will follow shortly).

The leader of the team and principal investigator for the instrument, Roland Bacon (Centre de Recherche Astrophysique de Lyon, France), expressed his feelings: “It has taken a lot of work by many people over many years, but we have done it! It seems strange that this seven-tonne collection of optics, mechanics and electronics is now a fantastic time machine for probing the early Universe. We are very proud of the achievement — MUSE will remain a unique instrument for years to come.”

MUSE’s science goals include delving into the early epochs of the Universe to probe the mechanisms of galaxy formation and studying both the motions of material in nearby galaxies and their chemical properties. It will have many other applications, ranging all the way from studies of the planets and satellites in the Solar System, through the properties of star-forming regions in the Milky Way and out to the distant Universe.

As a unique and powerful tool for discovery, MUSE uses 24 spectrographs to separate light into its component colors to create both images and spectra of selected regions of the sky. It creates 3D views of the Universe with a spectrum for each pixel as the third dimension. During the subsequent analysis, the astronomer can move through the data and study different views of the object at different wavelengths, just like tuning a television to different channels at different frequencies.
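The "spectrum for each pixel" idea maps naturally onto a 3D array. The sketch below uses invented toy dimensions; real MUSE cubes have roughly 300×300 spatial pixels and thousands of wavelength channels spanning about 465 to 930 nanometers.

```python
import numpy as np

# A MUSE-style datacube: two spatial axes plus one wavelength axis, so every
# spatial pixel carries a full spectrum. Toy shape and random values for demo.
ny, nx, nwave = 4, 5, 100
cube = np.random.rand(ny, nx, nwave)          # flux at (y, x, wavelength)
wavelengths = np.linspace(465, 930, nwave)    # nm, MUSE's nominal range

spectrum = cube[2, 3, :]        # the full spectrum of one spatial pixel
channel_image = cube[:, :, 40]  # a monochromatic image: one "TV channel"
white_light = cube.sum(axis=2)  # collapsing wavelength gives a broadband image

print(spectrum.shape, channel_image.shape, white_light.shape)
```

Stepping the channel index is the "tuning a television" operation the article describes: each slice along the wavelength axis is an image of the same field at a different color.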

MUSE couples the discovery potential of an imaging device with the measuring capabilities of a spectrograph, while taking advantage of the much better image sharpness provided by adaptive optics. The instrument is mounted on Unit Telescope 4 of the VLT, which is currently being converted into a fully adaptive telescope.

MUSE is the result of ten years of design and development by the MUSE consortium — headed by the Centre de Recherche Astrophysique de Lyon, France, with the partner institutes Leibniz-Institut für Astrophysik Potsdam (AIP, Germany), Institut für Astrophysik Göttingen (IAG, Germany), Institute for Astronomy ETH Zurich (Switzerland), L’Institut de Recherche en Astrophysique et Planétologie (IRAP, France), Nederlandse Onderzoekschool voor de Astronomie (NOVA, the Netherlands) and ESO.

Since the start of 2014, Bacon and the rest of the MUSE integration and commissioning team at Paranal have recorded the MUSE story in a series of blog posts. The team will present the first results from MUSE at the forthcoming 3D2014 workshop at ESO in Garching bei München, Germany.

When Lightning Strikes, Instruments On The Space Station Will See It

April Flowers for redOrbit.com – Your Universe Online

Just as you might keep a spare tire in your car, or a spare filter for your air conditioner, NASA keeps spares as well. These “spare” flight hardware units allow NASA to continue work without interruption in the event that something goes down for repair. These spare parts are kept even after the project ends, sometimes finding second lives in new areas.

A sophisticated piece of flight hardware, called a Lightning Imaging Sensor (LIS), was developed by researchers at NASA’s Marshall Space Flight Center and launched into space in 1997 as part of NASA’s Tropical Rainfall Measuring Mission (TRMM). The sensor, used to detect and locate lightning over the tropical region of the globe, undertook a three-year primary mission to return data that could be used to improve weather forecasts. LIS continues to operate aboard the TRMM satellite today.

Of course, the researchers responsible for building LIS in the 1990s built a spare unit as a precaution. That other unit is now being brought into play as well. The second LIS sensor is scheduled to launch aboard a Space Exploration Technologies (SpaceX) rocket to the International Space Station (ISS) in February 2016. LIS will be mounted to the station for a two year baseline mission as part of a U.S. Department of Defense (DoD) Space Test Program (STP)-H5 science and technology development payload.

The LIS hardware was selected by NASA to take advantage of the ISS’s high orbital inclination, which will let the sensor “look” farther toward Earth’s poles than the original LIS aboard the TRMM satellite. The sensor will have many duties once installed, including monitoring global lightning for Earth science studies and providing cross-sensor calibration and validation with other space-borne instruments and ground-based lightning networks. LIS will also supply real-time lightning data over data-sparse regions, such as oceans, to support operational weather forecasting and warning.

“Only LIS globally detects all in-cloud and cloud-to-ground lightning — what we call total lightning — during both day and night,” said Richard Blakeslee, LIS project scientist at Marshall. “As previously demonstrated by the TRMM mission, better understanding lightning and its connections to weather and related phenomena can provide unique and affordable gap-filling information to a variety of science disciplines including weather, climate, atmospheric chemistry and lightning physics.”

Without land-ocean bias, LIS measures the amount, rate and radiant energy of global lightning, providing storm-scale resolution, millisecond timing and high, uniform detection efficiency.

The LIS hardware consists of an optical imager enhanced to locate and detect lightning from thunderstorms within its 400-by-400-mile field of view. As it orbits Earth, the ISS travels at more than 17,000 mph, which will allow LIS to observe a point on Earth, or a cloud, for almost 90 seconds each time it passes overhead. This viewing duration, despite its short length, is long enough to estimate the lightning-flash rate of most storms.
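The "almost 90 seconds" figure is consistent with a back-of-the-envelope calculation. This is our arithmetic, not NASA's; it ignores Earth curvature and orbit geometry and simply divides the field-of-view width by the orbital speed.

```python
# How long a ground point stays inside a 400-mile field of view
# at roughly ISS orbital speed ("more than 17,000 mph" per the article).
FOV_MILES = 400
ISS_SPEED_MPH = 17150   # assumed value in the "more than 17,000 mph" range

dwell_seconds = FOV_MILES / ISS_SPEED_MPH * 3600
print(round(dwell_seconds))   # 84 seconds, consistent with "almost 90"
```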

More than 70 percent of all lightning occurs during daylight hours, making daytime detection the driving force behind the technical design of LIS. Seen from space, lightning looks like a pool of light on top of a thundercloud. During the day, however, sunlight reflected off the cloud tops can completely mask the lightning signal, making it challenging to detect. LIS applies special techniques that take advantage of the differences in behavior and physical characteristics between lightning and sunlight, allowing it to extract lightning strikes from the background illumination.

A real-time event processor inside the LIS electronics unit performs a final processing step by removing the remaining background signal. This enables the system to detect the lightning signatures and achieve 90-percent detection efficiency.
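The background-removal idea can be illustrated with a generic running-background detector. This is our sketch of the general technique, not the actual LIS event processor: the background estimate tracks slow changes in glare, while a brief jump well above it is flagged as a flash and excluded from the estimate.

```python
# Generic transient detection against a slowly varying background (illustrative
# only; the real LIS processor is dedicated hardware with its own algorithm).
def detect_flashes(samples, alpha=0.1, threshold=5.0):
    """Return indices of samples far above the running background estimate."""
    background = samples[0]
    events = []
    for i, value in enumerate(samples[1:], start=1):
        if value - background > threshold:
            events.append(i)   # transient: do not fold it into the background
        else:
            # Exponential moving average tracks the slow background drift.
            background = (1 - alpha) * background + alpha * value
    return events

# Steady cloud glare near 100 units with one brief "flash" at index 5.
signal = [100, 101, 99, 100, 102, 140, 100, 101]
print(detect_flashes(signal))   # [5]
```

The key design point is that flagged samples are kept out of the background average, so a flash does not raise the threshold for detecting the next one.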

The LIS team will operate the unit remotely once it is installed on the ISS. The team will assess the data returned by the unit, then disseminate it to forecasters and researchers from the Global Hydrology Resource Center, one of NASA’s Earth science data centers.

The LIS team expects that data from the unit will be applicable in many ways here on Earth. They have received strong endorsements from national and international organizations, including the National Oceanic and Atmospheric Administration (NOAA), the European Space Agency (ESA), the Japan Aerospace Exploration Agency (JAXA) and the Geostationary Operational Environmental Satellite R-Series Program (GOES-R). Operational users such as NOAA’s National Weather Service (NWS), Aviation Weather Center (AWC), Ocean Prediction Center (OPC) and Pacific Region will be interested in the LIS data for weather warning, forecasting and validation applications. Other science and application investigations will benefit from the new lightning observations as well.

LIS might also serve a purpose for the Federal Aviation Administration (FAA) from a research standpoint. The information obtained could help with validation activities of several oceanic convection ensemble model products the FAA is developing, either in real-time or archive mode, according to Randy Bass, a member of the FAA’s Aviation Weather Research Team.

“It could also be used for validation of detection of convection from other ground- and space-based sensors we will be using at the time,” said Bass. “Any data we can use for ‘ground truth’ over oceanic areas will be extremely helpful in development of better observing and forecasting products used for offshore aviation, especially as we expand our coverage throughout the Atlantic and Pacific oceans.”

This could mean better short-term forecasts of thunderstorms over offshore areas, giving pilots and air traffic controllers the ability to reroute planes around hazards—like lightning and turbulence—with more accuracy. Planes already have weather radar on board, but the range is limited. The LIS data would allow the FAA to improve its capabilities and give controllers the opportunity to see the weather activity as well.

“Measuring lightning is important for knowledge about the weather and also operationally important for aviation safety. By adding an instrument on space station, we can add observations from higher latitudes covering the 48 contiguous states,” said International Space Station Chief Scientist Julie Robinson, Ph.D. “This is a prime example of science on the International Space Station benefiting our nation.”

Image 2 (below): Discussing the Lightning Imaging Sensor engineering test unit are, from left, Bill Lopez and Jim McLeroy from the Department of Defense (DoD) Space Test Program (STP); John Davis from Marshall; Nathan Harnagel from DoD STP; and Mike Stewart from the University of Alabama in Huntsville. Credit: NASA

Light Pulses Could Help Space Veggies Produce Eye-Protecting Nutrients

redOrbit Staff & Wire Reports – Your Universe Online
Astronauts could help offset exposure to eye-damaging radiation during extended spaceflights by eating leafy vegetables grown during those voyages, according to research appearing in a recent edition of the journal Acta Astronautica.
The University of Colorado Boulder researchers responsible for the study say that exposing those vegetables to a few bright pulses of light each day as they grow onboard a spacecraft could increase the amount of eye-protecting nutrients they produce, known as carotenoids.
One of those carotenoids, zeaxanthin, is particularly helped by the process, the researchers noted in a statement Tuesday. While zeaxanthin can be ingested as a supplement, evidence suggests that human bodies are better at absorbing it and other carotenoids from whole foods, such as green leafy vegetables.
NASA is currently studying ways to grow fresh produce on deep space missions as a way to keep the crew’s morale up and improve their overall nutrition. Space gardening research conducted to date has looked primarily at how to ensure that plants receive optimal light, water and fertilizer so that they can grow as large as possible in as little time as possible. However, those conditions might not be optimal for production of essential nutrients.
“There is a trade-off,” explained co-author Barbara Demmig-Adams, a professor in the Department of Ecology and Evolutionary Biology. “When we pamper plants in the field, they produce a lot of biomass but they aren’t very nutritious.”
“If they have to fend for themselves – if they have to defend themselves against pathogens or if there’s a little bit of physical stress in the environment – plants make defense compounds that help them survive. And those are the antioxidants that we need,” she added.
Zeaxanthin is produced when plant leaves absorb more sunlight than they can use, which typically occurs when those plants are stressed, the investigators explained. For instance, a lack of water could keep the plant from using all of its absorbed sunlight for photosynthesis, forcing it to produce the carotenoid compound to safely remove excess light and prevent it from causing damage to the plant’s biochemical pathways.
Zeaxanthin plays a similar protective role in the human eye, according to Demmig-Adams, undergraduate researcher Elizabeth Lombardi, postdoctoral researcher Christopher Cohu and ecology and evolutionary biology Professor William Adams. Since the eye collects light in much the same way as a leaf, it requires similar protection.
The study authors set out to find a way to maximize both plant growth and zeaxanthin production. Using the plant species Arabidopsis as a model, they discovered that a few pulses of bright light each day encouraged production of the carotenoid to prepare for an anticipated excess amount of sunlight. At the same time, those pulses were so short that they did not interfere with the plant’s otherwise optimal growth conditions.
“When they get poked a little bit with light that’s really not a problem, they get the biomechanical machine ready, and I imagine them saying, ‘Tomorrow there may be a huge blast and we don’t want to be unprepared,’” explained Demmig-Adams. “Learning more about what plants already ‘know’ how to do and trying to manipulate them through changing their environment rather than their genes could possibly be a really fruitful area of research.”

Liver Metabolism Study Could Help Patients Awaiting Transplants

Metabolic profiling of liver cells suggests new treatments for cirrhosis patients

In a new study that could help doctors extend the lives of patients awaiting liver transplants, a Rice University-led team of researchers examined the metabolic breakdown that takes place in liver cells during late-stage cirrhosis and found clues that suggest new treatments to delay liver failure.

More than 17,000 Americans are awaiting a liver transplant, and of those, about 1,500 will die this year while still waiting, according to the American Liver Foundation. The new research, which appeared online Feb. 27 in the Journal of Hepatology, suggests new treatments that could keep some of those patients alive long enough to receive a transplant. The research was conducted by a team from Rice University, the University of Pittsburgh, Children’s Hospital of Pittsburgh, the University of Nebraska Medical Center and the University of Texas MD Anderson Cancer Center.

“There’s an old saying that ‘the beginning of health is to know the disease,'” said lead researcher Deepak Nagrath of Rice. “There’s never been a clear understanding of what causes liver cells to stop working during the final stages of cirrhosis. Our goal was to probe the metabolic processes inside liver cells in this stage of the disease to better understand what causes them to fail.”

Liver disease is a growing problem worldwide, especially in countries where fatty diets and obesity are also problems. According to the American Liver Foundation, one in 10 Americans suffers from liver disease and as many as one in four Americans is at risk, including many who suffer from “nonalcoholic fatty liver disease,” a buildup of extra fat in the organ.

Nagrath, the director of Rice’s Laboratory for Systems Biology of Human Diseases, said his group wanted to examine the role that energy metabolism played in the breakdown of hepatocyte function during cirrhosis. To do that, the group needed to examine the biochemistry of liver cells at various stages during the disease.

The first stage of liver disease, called “steatosis,” is marked by fat buildup. The next stage is fibrosis, when fibers begin to be deposited in the liver. This damages the liver cells, or hepatocytes, leading to the final stage, cirrhosis.

Nagrath said the study was made possible by a unique animal model for cirrhosis that was developed by Ira Fox and Alejandro Soto-Gutierrez at the University of Pittsburgh’s McGowan Institute for Regenerative Medicine.

“Most models cannot mimic what actually occurs in humans, but this one, which uses rats, captures all of the features, particularly the pathological features, that occur in humans,” he said.

Using hepatocyte samples collected at Pittsburgh, Nagrath’s lab conducted a detailed search for chemical and genetic clues about hepatocyte metabolism. In particular, they focused on how the cells were producing adenosine triphosphate, or ATP, the “molecular unit of currency” that all living cells use to transport chemical energy.

In healthy hepatocytes, most ATP is produced in the mitochondria, via a process known as “oxidative phosphorylation.” Nagrath said previous studies had shown that a second form of ATP production — a process known as “glycolysis” — was also activated in diseased liver cells.

“Mitochondrial production of ATP is more efficient than glycolysis, but in times of stress, when the cells need extra energy to repair themselves or respond to a crisis, they can employ both processes at the same time,” said Nagrath, assistant professor of chemical and biomolecular engineering and of bioengineering. “It’s also well-known that some forms of cancer rely almost exclusively on the glycolytic pathway, so people tend to associate glycolysis with an unhealthy or diseased state.”

In their study, Nagrath and colleagues found that the story of ATP production in liver cells was considerably more complex than previously understood.

“It’s well-known that energy production from the mitochondrial pathway goes down during cirrhosis, and many people had assumed that this was the primary driver of metabolic failure,” he said. “While we did find that mitochondrial production decreased, it was not down-regulated enough to say that it was a complete failure. It didn’t change that much. Glycolysis, on the other hand, changed a great deal.”

The study showed that in the middle stage of cirrhosis, liver cells up-regulate the glycolytic pathway to produce more energy in response to the disease. Combined with the reduced but still significant production from the mitochondrial pathway, the glycolytic input results in a large net gain in metabolic output. In the final stage of the disease, the cells are unable to sustain their glycolytic output, and net ATP production falls.

“When that happens, and the cells are no longer able to use glycolysis to maintain energy, liver failure occurs,” Nagrath said.

The researchers confirmed the clinical relevance of the findings by comparing the gene expression patterns in the rodents with the genetic profiles of 216 human patients who have cirrhosis.

Nagrath said the findings are important because there are drugs that clinicians can use to target the glucose pathway. These could potentially be used to boost glycolytic energy production and keep patients alive longer.

“This would not represent a cure for liver disease,” he said. “It would only apply to patients in the final stage of liver disease, but if such treatments did prove effective, they could extend the lives of some patients who are awaiting transplants.”


Gonorrhea Infections Start From Exposure To Seminal Fluid

Researchers have come a step closer to understanding how gonorrhea infections are transmitted. When Neisseria gonorrhoeae, the bacteria responsible for gonorrhea, are exposed to seminal plasma, the liquid part of semen containing secretions from the male genital tract, they can more easily move and start to colonize. The research, led by investigators at Northwestern University in Chicago, appears in mBio®, the online open-access journal of the American Society for Microbiology.
“Our study illustrates an aspect of biology that was previously unknown,” says lead study author Mark Anderson. “If seminal fluid facilitates motility, it could help transmit gonorrhea from person to person.”
Gonorrhea, a sexually transmitted infection, is exclusive to humans and thrives in warm, moist areas of the reproductive tract, including the cervix, uterus, and fallopian tubes in women, and in the urethra in women and men. It is estimated there are more than 100 million new cases of gonorrhea annually worldwide.
“Research characterizing the mechanisms of pathogenesis and transmission of N. gonorrhoeae is important for developing new prevention strategies, since antibiotic resistance of the organism is becoming increasingly prevalent,” says H. Steven Seifert, another author on the study.
In a series of laboratory experiments, the investigators studied the ability of N. gonorrhoeae to move through a synthetic barrier, finding that 24 times as many bacteria could pass through after being exposed to seminal plasma. Exposure to seminal plasma caused hairlike appendages on the bacterial surface, called pili, to move the cells by a process known as twitching motility. This stimulatory effect could be seen even at low concentrations of seminal plasma, and it persisted beyond the initial influx of seminal fluid.
Additional tests found that exposure to seminal plasma also enhanced the formation of bacterial microcolonies on human epithelial cells (cells that line body cavities), which can also promote the establishment of infection.
Researchers at the University of Cologne in Germany also contributed to the study, which was funded by the National Institutes of Health and DFG, the German Research Foundation.


Experts Call For Prison Health Improvements

In a new paper in the journal Health Affairs, several participants in a workshop convened by the National Research Council and Institute of Medicine unveil their recommendations to improve health care for prisoners both during incarceration and after release. From a public health standpoint, they argue, it’s shortsighted to regard prison populations as separate from the community.
The very premise of prison invites members of society to think of the people there as walled-off and removed. But more than 95 percent of prisoners will return to the community, often carrying significant health burdens and associated costs with them. In an article in the March issue of the journal Health Affairs, several experts who participated in a scientific workshop convened by the National Research Council and the Institute of Medicine recommend several steps and ideas consistent with health reform to improve care for prisoners while they are incarcerated and after they return to society.
“The general public doesn’t pay attention to what’s going on behind bars,” said lead author Dr. Josiah Rich, professor of medicine and epidemiology at Brown University and director of the Center for Prisoner Health and Human Rights at The Miriam Hospital. “But this is very important if you are concerned about the health of our population and health care costs.”
Researchers have found that about two in five prisoners have a chronic medical condition (often first diagnosed in prison) and more than seven in 10 prisoners of state systems need substance abuse treatment. In fact, the illness of addiction is what lands many people in prison in the first place.
But four in five prisoners don’t have health insurance when they leave.
“Prisons and jails are necessary for the protection of society,” Rich and his co-authors wrote. “For decades, though, the U.S. health and criminal justice systems have operated in a vicious cycle that in essence punishes illness and poverty in ways that, in turn, generate further illness and poverty.”
Within that bleak situation, however, lies opportunity because incarceration allows for diagnosis and delivery of care that, if continued in the community, would reduce the onslaught of health problems for individuals and ensuing costs for society, Rich said. The authors’ recommendations, which build on discussion from a December 2012 joint workshop in which they participated, are meant to turn that vicious cycle into a virtuous one.
The recommendations could make the difference illustrated by two scenarios, Rich said. Both begin with the imprisonment of a 28-year-old man with severe hypertension. In one case the condition is diagnosed and treated in prison. Treatment with inexpensive medications continues after release a decade later because the man has health insurance and access to a doctor who understands his medical and personal history. In the other case, either the hypertension is left untreated in prison or it’s not managed after he’s released because he has no insurance or continuity of care. A decade later he develops kidney failure and goes on dialysis, costing the health care system a lot more money.
Recommendations for prison and after
The authoring group’s primary recommendation is to find alternatives to imprisonment when possible, given that the United States incarcerated more than 2.3 million of its 313 million residents in 2012. While the group divided the rest of its recommendations between health care in prison and after release, in many cases the ideas are meant to improve integration of care between the two settings. Specifically, they make the following recommendations:
In prison:
improved oversight and accountability of prison health care, including making accreditation of prison health care mandatory and enforceable;
inclusion of prisoners in accountable care organization health plans to increase provider incentives for providing good care;
medical profession advocacy for legislation and programs that would benefit prisoner health, such as programs that improve care as prisoners transition to the community.
After release:
employment of a “risk-needs-responsivity” model to triage prisoners, based on their personal history, to the most appropriate care;
assistance for released prisoners to help them enroll in Medicaid as it expands under the Affordable Care Act. In recent research, Rich has found that this activity may already be underway and is co-author of a separate paper in Health Affairs on what’s needed to transition prisoners on Medicaid back into the community;
policies requiring electronic health records from within prison be available to community health providers;
incentives for community providers to deliver mental health care to released prisoners;
improved cultural competence among community physicians to understand the specific medical needs and risk factors of released prisoners. Transition Clinic medical homes provide a worthwhile example, the authors write.
In a separate paper in the journal, Rich and co-authors including Brown and Miriam researchers Brian Montague and Curt Beckwith also find that prisons could do more to test prisoners for HIV and ensure care after release, as the CDC recommends.
Recognizing that prisoners never stop mattering to the community from the standpoint of health could lead to better medical and economic outcomes, the experts argue.
In addition to Rich, the other authors are Redonna Chandler, Brie A. Williams, Dora Dumont, Emily A. Wang, Faye S. Taxman, Scott A. Allen, Jennifer G. Clarke, Robert B. Greifinger, Christopher Wildeman, Fred C. Osher, Steven Rosenberg, Craig Haney, Marc Mauer, and Bruce Western.
The NRC and IOM organized the workshop that served as a springboard for the article published in Health Affairs. The workshop received support from the National Institute of Justice, the John D. and Catherine T. MacArthur Foundation, and the Robert Wood Johnson Foundation.


Study Of Olfaction In Fruit-Eating Bats Could Shed Light On Our Own Sense Of Smell

Brett Smith for redOrbit.com – Your Universe Online

While scientists know that a superfamily of genes encoding olfactory receptors is responsible for our sense of smell, we still don’t know the mechanism by which odor molecules are interpreted as particular smells.

A new study published in the journal Molecular Biology and Evolution has found a distinct gene pattern in the olfactory receptors of fruit-eating bats – potentially shedding some light on the mechanism behind our own sense of smell.

The study authors said their work emphasizes the importance of looking at diversity in nature to determine genome functions and evolutionary history in mammals.

“We knew that animals that live in various ecological environments–whales, bats, cows–have evolved different suites of olfactory receptors,” explained study author Liliana Davalos, an evolutionary biologist at Stony Brook University in New York. “That suggests that the ability to smell different odors is important for survival.”

“This study provides new insights into the mechanisms that have allowed bats to diversify their diets so extensively,” said study author Simon Malcomber, a program director in the National Science Foundation’s Division of Environmental Biology.

The researchers said they wanted to know if the development of other sensory systems, transformations in diet, or the random accumulation of transformations through time drove the progression of olfaction in mammals.

“Bats offer a prime opportunity to answer this question,” Davalos said. “They’ve evolved new sensory systems such as echolocation, and various bat species eat very different foods, including insects, nectar, fruit, frogs, lizards and even blood.”

After genetically analyzing thousands of olfactory receptors from various bat species and mapping the results onto an evolutionary tree that included all of those species, the scientists found distinctive patterns of olfactory receptors among bats that have fruit as a major part of their diet.

The researchers discovered that these genetic patterns for olfaction arose twice: once among New World leaf-nosed bats, which mostly eat figs, and again among Old World fruit bats, which feed on a wide variety of fruits, including figs, guavas, bananas and mangoes.

The study team also found that while the olfactory receptors of these two groups of bats are similar, their distinctive repertoires arose in different ways in New World and Old World bats.

Davalos said this means that independent mechanisms must have affected this part of the bat genome in response to the challenge of finding fruit at night.

In a related study published in February, Duke University researchers revealed that Madagascar sucker-footed bats are primitive members of a group of bats that evolved in Africa and ultimately went on to flourish in South America. The study showed that the sucker-footed bat family is at least 36 million years older than previously known.

Today, the sucker-footed bats consist of two species, Myzopoda aurita and M. schliemanni, which live in Madagascar. Unlike other bats, sucker-footed bats don’t cling upside-down to cave ceilings or branches. Instead, they roost head-up, often in the furled leaves of the traveler’s palm, a plant in the bird-of-paradise family. Scientists previously thought the pads held the bats up by suction, but recent research has shown the bats instead rely on wet adhesion to stay upright.

Searching Twitter For The Next Big Word

Brett Smith for redOrbit.com – Your Universe Online
Unlike any other time in history, new words and phrases in the 21st century can spread with viral efficiency.
A new research project at Aston University in the United Kingdom is hoping to track the digital spread of new English terms by analyzing over one billion tweets from both the UK and the US.
“I’m very excited to begin work on this project,” said Jack Grieve, a forensic linguistics lecturer at Aston. “No previous linguistic report has had so much data to work with so we have a great opportunity to map the emergence of new words and their lexical diffusion.”
“In addition to charting the internal movement of words in the UK and US, we hope to look at how words spread across the Atlantic, between the two countries – the first study to do so using the same methods in both nations,” Grieve added.
The researchers said the somewhat spontaneous nature of Twitter interactions makes them similar to spoken conversation, which makes Twitter posts particularly valuable for studying how new terms spread.
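As a rough illustration of what tracking lexical diffusion could involve computationally, the sketch below counts how often a candidate term appears per region per week in a stream of tweets. This is a toy example under assumed data shapes, not the Aston team’s actual methodology; the function name and sample tweets are hypothetical.

```python
# Toy sketch: count weekly occurrences of a candidate term by region.
# Assumes tweets arrive as (week, region, text) tuples (a simplification).
from collections import Counter, defaultdict

def term_frequency_by_region(tweets, term):
    """Return {region: Counter({week: hits})} for the given term."""
    counts = defaultdict(Counter)
    for week, region, text in tweets:
        # Naive tokenization; a real study would normalize far more carefully.
        if term in text.lower().split():
            counts[region][week] += 1
    return counts

# Hypothetical sample data.
sample = [
    (1, "US", "that selfie tho"),
    (1, "UK", "nice weather"),
    (2, "UK", "first selfie of the year"),
    (2, "UK", "another selfie"),
]
freq = term_frequency_by_region(sample, "selfie")
# In this toy data the term shows up in the US in week 1 and in the UK in
# week 2 - the kind of region-by-time signal a diffusion study would map.
```

Comparing such per-region time series across the Atlantic is one simple way to ask whether a term surfaced in one country before the other.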
The study of new terms spreading on Twitter may have to consider the fact that these terms are being used predominantly by a relatively young age group. A study by the Pew Research Center released in November showed that those who look to the microblogging platform for daily news are becoming younger, more educated, and increasingly rely on their mobile devices.
According to the survey, 16 percent of US adults use Twitter and about half of them use it for their news. Twitter users tend to be younger and more educated “than both the population overall and Facebook news consumers,” the report added.
Pew researchers also found that 40 percent of Twitter news consumers have at least a bachelor’s degree, compared with 29 percent of the total population and 30 percent of Facebook news consumers.
After analyzing Twitter activity surrounding the opening night of the London Summer Olympics, the Newtown, Connecticut school shooting and the Supreme Court hearings on same-sex marriage, the Pew research team identified three central themes: Twitter users pass along news information as events develop; Twitter conversations about big news events evolve in sentiment and topic; and Twitter can match the sentiment of the general population.
The researchers at Aston University, in a partnership with the University of South Carolina, will also look at recent patterns of human migration to determine how the movement of peoples influences linguistic difference. The US and UK migration patterns will be determined by analyzing millions of online family trees.
“Throughout history, migration has been a key force in shaping and transforming language,” Grieve said. “Very little research, however, has looked at how more recent population mobility has shaped dialect variation today. Hopefully, we will be able to discover new and exciting findings.”
Citing examples such as ‘selfie’, ‘twerk’, ‘vom’, ‘buzzworthy’ and ‘squee’ – the UK researchers noted that new terms are being coined every year and spreading across social media.

Kids Benefit More Playing With Boxes Than With Expensive Toys

Brett Smith for redOrbit.com – Your Universe Online

Frugal parents, rejoice! Australian researchers have found that giving buckets, boxes and other makeshift toys to children helps them be more creative and active than they are in a normal playground setting.

Based on long-term research by scientists from RMIT University in Melbourne, Australia, the new study found that simple, common objects given to children during recess and lunchtime can minimize sedentary time by 50 percent, improve imagination and improve social and problem-solving skills.

“Conventional playgrounds are designed by adults – they don’t actually take into consideration how the children want to play,” said study author Brendon Hyndman, currently a physical education lecturer at the University of Western Sydney. “At a time when childhood obesity is growing and playgrounds are shrinking, we need a creative approach to stimulate physical activity among schoolchildren.”

The study, which included 120 students between the ages of 5 and 12, involved placing buckets, pipes, exercise mats, hay bales and swimming pool noodles in the play areas at Emmaus Catholic Primary School in the Australian town of Ballarat. The researchers then recorded the students’ behavior. These behaviors were compared with those of children at another school in the area which had conventional play equipment such as monkey bars and slides.

The researchers found that sedentary behavior, defined as sitting or standing around, dropped from nearly 62 percent of children to just over 30 percent when the kids were provided with the additional items. The study team also found that students who played with the common objects took 13 more steps per minute and played more vigorously than those in the conventional playground.

“These results could be applied to anywhere that children play and shift the debate on the best way to keep our children healthy,” Hyndman said.

The new study comes after a report presented in November at the American Heart Association’s Scientific Sessions 2013 found that today’s children are about 15 percent less physically fit than their parents from a cardiovascular standpoint.

Fujitsu To Implement Palm Scanning To Keep Smartphones Secure

Enid Burns for redOrbit.com – Your Universe Online
Biometrics is increasingly being used to secure mobile devices, and Fujitsu plans to put more than just a finger on security for its smartphones. In a statement, the company announced plans to deploy palm-vein scanning to address security on its mobile devices.
The palm-vein scanning feature will be used specifically in Fujitsu smartphones, SlashGear reports. The company already has experience in developing and using the technology.
Fujitsu’s largest smartphone market appears to be in Japan, where it held 11.9 percent of the smartphone market in March of 2012, according to comScore data.
Palm-vein authentication is the identification of an individual by evaluating the palm print plus vein points on a hand. Fujitsu claims the method has a high accuracy rate. Fujitsu has developed the technology under the name PalmSecure.
“The principle behind this relatively less-known biometric is almost the same with fingerprints. Palm patterns are just as unique as a fingerprint, but it is only one half of the equation. The other half involves vein points, which are scanned using near-infrared light and matched against previously recorded patterns and points. There is one key distinction here. Blood needs to be flowing through those veins for palms to match. No disembodied hands here for faking identities. Fujitsu boasts of a 0.0008 percent false positive and a 0.01 percent false rejection rate with this system,” SlashGear’s JC Torres wrote.
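To put the quoted error rates in perspective, the small sketch below computes the expected number of errors they imply over a given number of attempts. The rates come from the figures Fujitsu cites above; the attempt counts are hypothetical.

```python
# Illustrative arithmetic only: expected error counts implied by the rates
# quoted for PalmSecure (0.0008% false positive, 0.01% false rejection).
FALSE_POSITIVE_RATE = 0.0008 / 100   # 8 in 1,000,000 impostor attempts
FALSE_REJECTION_RATE = 0.01 / 100    # 1 in 10,000 genuine attempts

def expected_errors(attempts: int, rate: float) -> float:
    """Expected number of errors over `attempts` independent tries."""
    return attempts * rate

# Over one million impostor attempts, about 8 would be wrongly accepted;
# over one million genuine attempts, about 100 would be wrongly rejected.
false_accepts = expected_errors(1_000_000, FALSE_POSITIVE_RATE)
false_rejects = expected_errors(1_000_000, FALSE_REJECTION_RATE)
print(false_accepts, false_rejects)  # 8.0 100.0
```

In other words, by these figures the system is far more likely to inconvenience a legitimate user than to admit an impostor, which is the usual trade-off biometric systems tune for.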
The increasing size of most smartphone screens, combined with the shrinking size of the authentication sensors, is making it possible for the technology to be deployed on smartphones.
“We have been reducing the size of our palm vein authentication units since their initial development,” a Fujitsu spokesman said in a ComputerWorld article. “In the future, we hope to eventually have these units embedded into smartphones.”
Fujitsu first used the palm-vein authentication technology commercially in 2004, when it was used to authenticate users of ATMs at Japan’s Bank of Tokyo-Mitsubishi, ComputerWorld reports. The company also implemented in-store scanners at Suruga Bank in 2004. Since that time, development has allowed the scanners to become smaller, and they have been integrated into laptops and other devices.

A stamp-sized unit was recently released for commercial use and is embedded in a fleet of about 2,000 tablets used by the Fukuoka Financial Group, which includes Bank of Fukuoka, Kumamoto Bank and Shinwa Bank, ComputerWorld reports.

“No one has this technology, and it’s significantly more secure than fingerprint,” a Fujitsu spokesman was quoted as saying by ComputerWorld.
Fujitsu believes the hand scanners can be used to identify individuals in natural disasters when bank cards and other forms of ID might be unavailable.
While biometrics have continued to advance independently, authentication technologies have seen increased interest since Apple included a fingerprint sensor in its iPhone 5s. Since that integration, there has been growing interest in biometric authentication on smartphones and mobile devices to help deter theft of both devices and data.

Sea Shepherd, Japanese Whalers Clash For Third Time This Season

Brett Smith for redOrbit.com – Your Universe Online
In what appears to be an escalation of hostilities, the anti-whaling activist group Sea Shepherd has claimed that Japanese whalers attacked one of its vessels during a recent clash in the Southern Ocean.
According to the environmentalist group, the Japanese ships Yushin Maru and Yushin Maru 3 dragged steel cables across the bow of its vessel the Bob Barker 11 times on Sunday in an attempt to disable the craft. When the Bob Barker released two small boats to cut the steel cables, a bamboo spear was thrown at Sea Shepherd crew members, the anti-whaling group claimed.
The clash is the latest episode in the ongoing saga between the environmentalist group and Japanese ships. The Japanese claim they are simply pursuing the whales for scientific purposes, while the Sea Shepherd has asserted that the Japanese ships are simply exploiting a loophole in international treaties to conduct whaling operations.
Sunday’s clash was the third since the Japanese ships started operation earlier this year. No one was injured during the most recent incident.
“Each time we have located the Nisshin Maru (factory ship), the Sea Shepherd fleet has been attacked by the whalers in night-time ambushes,” the Bob Barker’s captain Peter Hammarstedt said, according to the AFP.
According to the Sea Shepherd, the Bob Barker’s helicopter spotted the Nisshin Maru early Sunday with a Minke whale onboard while “slabs of whale meat were also photographed on the deck, along with the severed head of a recently butchered whale.”
Last week, Hammarstedt wrote an open letter to Australian Environment Minister Greg Hunt complaining about a lack of action from Australia after the earlier assaults in the Southern Ocean. Hammarstedt said the letter went unanswered.
“They knew this attack was imminent, and yet they did nothing. Hunt’s broken promises to monitor the whaling operations are evident in the broken bodies of the whales killed today,” he said.
The Australian minister had initially called for a government ship to monitor the ships during the annual hunting season, but instead chose to conduct aerial surveillance – a move the Sea Shepherd called “pretty cowardly” and an appeasement to Japan during free trade negotiations.
Hunt’s office told the AFP that a response to Hammarstedt’s letter had been sent to Sea Shepherd and defended the use of aerial surveillance.
“For operational reasons, the use of a plane has been determined as the most effective means of monitoring activities in the Southern Ocean. The aircraft will be able to monitor activities over a large area,” a spokesman said.
Confrontations between the two groups appear to be escalating: ships from the two sides collided during a clash in early February.
In a statement to the Associated Press, Japan’s Institute of Cetacean Research (ICR), which sponsors the annual whale operations, said that protestors aboard two inflatable boats from the Bob Barker had dropped ropes in front of the bow of Yushin Maru, which became tangled in the ship’s propeller. ICR said it was the Bob Barker that then came too close to the Yushin Maru No. 3 and struck its stern, damaging the whaling ship’s hull and railing.

Adding More Fish To Your Diet Can Increase Good Cholesterol

Lee Rannals for redOrbit.com – Your Universe Online

Adding more fish to your diet could help boost good cholesterol levels, according to a new study published in PLOS ONE.

Scientists at the University of Eastern Finland found that people who increase their intake of fish to a minimum of three to four fatty fish servings per week could increase the number of HDL particles in their blood, which are believed to protect against cardiovascular diseases.

Although many studies have revealed the benefits of fish, the latest study provides new information on how fish consumption affects the size and lipid concentrations of lipoproteins, which transport lipids in the blood.

Participants eating fish had salmon, rainbow trout, herring and vendace during the study, with no added butter or cream. The fish used did not include low-fat fish such as zander and perch, which have been observed to lower blood pressure.

Researchers carried out a detailed analysis of the participants’ lipoprotein particles, using a method that allowed the team to examine a total of 14 different particle classes. The study looked not only at fish consumption but also at the effects of whole grain and bilberries. The authors said the study emphasizes that overall cholesterol levels alone do not tell the whole story.

The study included 131 participants with impaired glucose metabolism and signs of metabolic syndrome, who were randomized into three groups for a 12-week intervention period. The team said 106 subjects completed the study, and serum metabolic profiles were used to study lipoprotein subclasses and lipids as well as low-molecular-weight metabolites.

“There were no significant differences in clinical characteristics between the groups at baseline or at the end of the intervention,” the authors wrote in the journal. “Mixed model analyses revealed significant changes in lipid metabolites in the HealthyDiet group during the intervention compared to the Control group.

“According to tertiles of changes in fish intake, a greater increase of fish intake was associated with increased concentration of large HDL particles, larger average diameter of HDL particles, and increased concentrations of large HDL lipid components, even though total levels of HDL cholesterol remained stable.”
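The tertile analysis quoted above is straightforward to reproduce: participants are ranked by how much their fish intake changed and split into thirds. A minimal sketch in Python (the example numbers are invented for illustration; the study’s actual intake data are not reproduced here):

```python
def tertile_groups(changes):
    """Split participants into thirds (tertiles) ranked by the change
    in their weekly fatty-fish intake, as in the analysis quoted above."""
    ranked = sorted(changes)
    n = len(ranked)
    lo_cut, hi_cut = ranked[n // 3], ranked[2 * n // 3]
    groups = {"low": [], "middle": [], "high": []}
    for change in changes:
        if change < lo_cut:
            groups["low"].append(change)
        elif change < hi_cut:
            groups["middle"].append(change)
        else:
            groups["high"].append(change)
    return groups

# Hypothetical changes in fatty-fish servings per week for nine participants.
example = [3, 1, 4, 1.5, 5, 9, 2, 6, 7]
groups = tertile_groups(example)
```

Within each tertile, the researchers could then compare average HDL particle size and lipid concentrations, which is how the association with large HDL particles was detected even though total HDL cholesterol stayed flat.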

The team concluded that the results suggest that consumption of a diet rich in whole grain, bilberries and fatty fish adds to good cholesterol in the body. They said these changes may be related to known protective functions of HDL such as reverse cholesterol transport and could explain the known positive effects of fish consumption against atherosclerosis.

“People shouldn’t fool themselves into thinking that if their standard lipid levels are OK, there’s no need to think about the diet, as things are a lot more complicated than that. Soft vegetable fats and fish are something to prefer in any case,” Postdoctoral Researcher Maria Lankinen, first author listed on the paper, said in a statement.

Shake-Canceling Spoon Developed To Help Patients Who Suffer From Essential Tremor

[ Watch The Video: Liftware Stops Shakey Hands ]

University of Michigan Health System

For people whose hands shake uncontrollably due to a medical condition, just eating can be a frustrating and embarrassing ordeal – enough to keep them from sharing a meal with others.

But a small new study conducted at the University of Michigan Health System suggests that a new handheld electronic device can help such patients overcome the hand shakes caused by essential tremor, the most common movement disorder.

In a clinical trial involving 15 adults with moderate essential tremor, the device improved patients’ ability to hold a spoon still enough to eat with it, and to use it to scoop up mock food and bring it to their mouths.

The researchers measured the effect three ways: using a standard tremor rating, the patients’ own ratings, and digital readings of the spoon’s movement.

The results are published online in the journal Movement Disorders by a research team that includes U-M neurologist and essential tremor specialist Kelvin Chou, M.D., as well as three people from the small startup company, Lift Labs, that makes the device, called Liftware. The study was funded by a Small Business Innovation Research grant from the National Institutes of Health that the researchers applied for together.

Public-private partnership – with a Michigan difference

The technology came full circle to its test in the UMHS clinic. The company’s CEO, Anupam Pathak, Ph.D., received his doctorate from the U-M College of Engineering – where he first worked on tremor-cancelling advanced microelectronic technologies for other purposes.

The concept is called ACT, or active cancellation of tremor. It relies on tiny electronic devices that work together to sense movement in different directions in real time, and then make a quick and precise counter-motion.

Lift Labs, based in San Francisco, developed the device, which resembles an extra-large electronic toothbrush base. It can adjust rapidly to the shaking of the user’s hand, keeping a detachable spoon or other utensil steady. In other words, it shakes the spoon in exactly the opposite way that the person’s hand shakes.
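The counter-motion principle can be illustrated with a toy simulation. This is only a sketch of the general idea, not Lift Labs’ actual algorithm: the sample rate, the moving-average filter, and the signal shapes are all assumptions. It separates a slow voluntary reach from a fast tremor oscillation and drives the utensil with the inverted tremor estimate.

```python
import math
from collections import deque

FS = 200            # sample rate in Hz (assumed)
TREMOR_HZ = 6       # essential tremor typically falls in the 4-12 Hz band
WINDOW = round(FS / TREMOR_HZ)  # averaging window of one tremor period

def stabilized(samples, window=WINDOW):
    """Active-cancellation sketch: a moving average over one tremor period
    estimates the slow voluntary motion; commanding the utensil with
    (voluntary estimate - raw motion) cancels the fast tremor component."""
    buf = deque(maxlen=window)
    out = []
    for x in samples:
        buf.append(x)
        voluntary_est = sum(buf) / len(buf)
        counter = voluntary_est - x        # counter-motion from the actuators
        out.append(x + counter)            # spoon tip = hand + counter-motion
    return out

# Simulated hand trajectory: slow 0.25 Hz reach plus a 6 Hz tremor.
t = [i / FS for i in range(3 * FS)]
voluntary = [math.sin(2 * math.pi * 0.25 * ti) for ti in t]
hand = [v + 0.4 * math.sin(2 * math.pi * TREMOR_HZ * ti)
        for v, ti in zip(voluntary, t)]
spoon = stabilized(hand)
```

In this simulation the spoon tip tracks the slow reach while most of the 6 Hz wobble is removed, loosely mirroring the on/off comparison used in the clinical trial.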

But to truly test whether their prototype device could help essential tremor patients overcome their condition’s effects, the Lift Labs team turned to Chou, who with his colleagues sees hundreds of essential tremor patients a year.

UMHS offers comprehensive care for the condition as part of its Movement Disorders Center. Chou and his colleagues have experience in prescribing a range of medication to calm tremors, and evaluating which patients might benefit from advanced brain surgery to implant a device that can calm the uncontrollable nerve impulses that cause tremor.

“Only about 70 percent of patients respond to medication, and only about 10 percent qualify for surgery, which has a high and lasting success rate,” says Chou, who is an associate professor in the U-M Medical School’s departments of Neurology and Neurosurgery. “People get really frustrated by tremor, and experience embarrassment that often leads to social isolation because they’re always feeling conscious not just eating but even drinking from a cup or glass.”

The trial, Chou says, showed that the amplitude of movement due to the tremor decreased measurably, and that patients could move the spoon much more normally. Though the trial did not include patients with hand tremors caused by other movement disorders such as Parkinson’s disease, the device may be useful to such patients too, he notes.

Says Pathak, “A key aspect of Liftware is a design with empathy. We hear of people struggling every day, and decided to apply technology in a way to directly help. We hope the final product is something people can feel proud of using, and allow them to regain independence and dignity.”

How the study was done

The researchers tested the device’s impact both with the microelectronics turned on, and with them turned off so there was no correction for movement. Patients and Chou could not tell by feeling the device whether it was on or off.

All three measures – objective rating by Chou, subjective rating by patients, and digital data from the device’s connection to a computer – showed improvement for eating and transferring items when the device was turned on, compared to when it was off.

When the patients were asked to simply hold the spoon halfway between the table and their mouth, the two objective measures showed improvement when the device was on, though the patients didn’t report a significant difference themselves.

“Our data show this device has very good potential to assist those who have tremor and aren’t candidates for surgery,” he says. “Compared with other devices designed to limit tremor by weighting or constraining limbs, this approach allows movement and is easier to use.”

The study included 15 adults between the ages of 59 and 80 whose tremor caused them to spill food or drink. They had experienced tremor for anywhere from 5 years to 57 years. All of the patients stopped taking their medication temporarily before testing the Liftware device. Five of the patients had undergone deep brain stimulation, but turned off their tremor-controlling implant for the study.

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

Children Who Outgrow Allergies May Have Severe Recurrences

Lawrence LeBlond for redOrbit.com – Your Universe Online

Children who outgrow a food allergy are not always out of the woods. In some cases, an allergy to the same food can return, often becoming more severe and more persistent.

Pediatric allergy experts with the Children’s Hospital of Philadelphia (CHOP) have reached out to healthcare providers and caregivers, asking them to carefully monitor children with food allergies for early signs of eosinophilic esophagitis (EoE), a severe and painful type of allergy that has become increasingly common in recent years.

“These two types of allergy have some elements in common, but patients with EoE usually don’t go on to develop tolerance to the foods that trigger EoE,” Jonathan M. Spergel, MD, PhD, a pediatric allergist with CHOP, said in a statement.

Spergel is also director of CHOP’s Center for Pediatric Eosinophilic Disorders, one of the top programs in the nation for EoE treatment.

His research was presented today at the annual meeting of the American Academy of Allergy, Asthma & Immunology (AAAAI) in San Diego by Solrun Melkorka Maggadottir, MD, also of CHOP.

EoE involves swelling and inflammation of the esophagus, along with excessive levels of immune cells called eosinophils. EoE is known to cause weight loss, vomiting, heartburn and difficulty swallowing. This painful disorder affects people in any age group, but is usually first discovered in children with feeding difficulties.

Spergel and colleagues compared EoE with IgE-mediated food allergy, a more familiar type of food allergy that occurs when antibodies mount an exaggerated immune response against proteins in particular foods, such as nuts, eggs or milk. These foods can trigger a host of problems, including vomiting, hives and other skin reactions.

For the study, the team performed a retrospective analysis of 1,375 children seen at CHOP for EoE between 2000 and 2012. Of those, 425 could be shown to have a definite food causing their disorder – most commonly milk, egg, soy, or wheat. Of these children, 17 had gone on to develop EoE to a food after outgrowing IgE-mediated allergy to that same food.

“The pattern we found in those 17 patients suggests that the two types of food allergy have distinct pathophysiologies—they operate by different mechanisms and cause different functional changes,” said Spergel. “However, this pattern also raises the possibility that prior IgE-mediated food allergy may predispose a patient to developing EoE to the same food.”

About 10 percent of patients who undergo desensitization therapy for IgE-mediated food allergies go on to develop EoE to the same food, noted Spergel.

He added that this percentage is high enough that healthcare providers and caregivers need to consider it when treating patients with food allergies.

In desensitization therapy, a clinician exposes a patient to a minuscule amount of an allergy-producing food, then gradually increases the amount, aiming for the patient to become tolerant to that food.

This desensitization therapy has been shown to be hugely successful in some children with peanut allergies.

Researchers from the University of Cambridge reported in January that after six months of desensitization therapy, 84 percent of the children with peanut allergies in their study group were able to eat the equivalent of five peanuts per day.

Worse Fibromyalgia Symptoms Reported By Young People

 

Fibromyalgia is a chronic pain condition that causes painful tenderness and achiness in a person’s muscles and tissues. Most people develop the condition later in life, although it can appear in a person’s 20s or 30s. It causes moderate to severe chronic pain in multiple areas of the body.

The onset of fibromyalgia can be especially difficult for young people to deal with, as the condition is very painful and can be debilitating. It can also disrupt a young person’s social life, as fatigue may leave them less active and less social than they once were.

Young people who are diagnosed with fibromyalgia should do their best to maintain active social lives and to continue with as many activities as possible: staying involved in hobbies and other pursuits that bring stress relief, joy and happiness.

People with fibromyalgia can experience a great deal of pain, which can make leaving the house or getting through a full day of work difficult and daunting. They often experience fatigue, anxiety and depression as well; these are all serious symptoms, as depression and anxiety can be difficult to fight off.

Young people should have extra energy reserves, and hopefully can use that energy to maintain a social life and personal lifestyle that is fulfilling and rewarding. It is important to maintain social circles and to talk with family and friends about the condition, but also to go out with them for fun and relaxation.

Those who do not socialize for fun should find other activities that relax and interest them, activities that give them hope.  This could be an art activity, such as painting or scrapbooking.  Or it could mean making music or even listening to it.  There are many ways to relax, and each individual has a different way of chilling out and relaxing.

Young people who are diagnosed with fibromyalgia experience a variety of symptoms that lead to their early diagnosis. These symptoms can include widespread, deep, achy pain in the muscles and tissues. Tiredness and fatigue are also very common, as pain prevents a good night’s sleep.

Many people experience these symptoms differently, as some people may have more intense pain than others, and flare-ups will come and go at different times.  One thing to consider is that fibromyalgia symptoms are mostly internal, so people will not be able to see directly how much pain a person is in.

This can be a problem when explaining the pain and tiredness to friends, family or coworkers. To them, you look fine and healthy, maybe just a bit tired. But those experiencing pain and fatigue from fibromyalgia know differently: they may be in great pain and exhaustion that just isn’t outwardly apparent to others.

Some other symptoms of fibromyalgia in young people include poor circulation, with tingling or numbness in the hands and feet. Headaches and irritability are common, and fibro fog, an accompanying condition, affects memory and concentration. People may also experience issues with their bowels and urination.

Young people may be discouraged by the presence of all or some of these symptoms, as they can be disruptive to life and can cause feelings of hopelessness or loneliness. This is why young people need to maintain their social lives as much as possible, and continue to live as they would like with as much energy as they can muster.

People can manage their pain from fibromyalgia by creating a daily schedule that is not overwhelming, and contains plenty of breaks for rest and for rejuvenating activities.  Individuals can find different activities that are relaxing and fun for them.

Some people enjoy going to get a happy hour drink with friends, and others may enjoy going to a movie or a concert.  There are so many options, but it is so important that people continue to have fun and participate in activities that help them stay positive.

It can be a bummer to experience fibromyalgia at a young age; however, there is surely an opportunity to live a happy and fulfilling life with the condition. It does take a bit of planning and responsibility. People need to keep themselves from getting overtired, which means not planning too many things in a day and not being afraid to cancel plans and rest during a flare-up.

Young people with fibromyalgia can manage their pain and continue to be positive about their condition, as proper management can lead to a fulfilling life.

Giant Panda Populations Potentially Threatened By Bamboo-Eating Horses

redOrbit Staff & Wire Reports – Your Universe Online

Horses and other livestock pose a significant threat to the future of the panda, as they are beating the endangered creatures to the bamboo pandas rely on for sustenance, according to research appearing in this week’s edition of the Journal for Nature Conservation.

“Across the world, people are struggling to survive in the same areas as endangered animals, and often trouble surfaces in areas we aren’t anticipating,” said Jianguo ‘Jack’ Liu of Michigan State University (MSU). “Creating and maintaining successful conservation policy means constantly looking for breakdowns in the system. In this case, something as innocuous as a horse can be a big problem.”

Liu and his colleagues investigated how an emerging livestock population in China’s Wolong Nature Reserve was affecting the region’s giant panda population. They used empirical data from field surveys, remotely sensed images and GPS collar tracking to review the distribution of horses in panda habitat, the space-use and habitat-selection patterns of both species, and the impact of horses on the availability of bamboo.

“We discovered that the horse distribution overlapped with suitable giant panda habitat,” they wrote in their study. “Horses had smaller home ranges than pandas but both species showed similarities in habitat selection. Horses consumed considerable amounts of bamboo, and may have resulted in a decline in panda habitat use.”

The research highlights the need for new policies that address this growing threat to the endangered giant panda. While timber harvesting has for years been the primary threat to the 1,600 remaining pandas, their specific food and habitat needs (they only eat bamboo and live in regions with gentle slopes that are far away from people) mean that they can be greatly impacted by encroaching livestock populations.

MSU Center for Systems Integration and Sustainability (CSIS) doctoral student Vanessa Hull, who has been living in the Wolong Nature Reserve off and on for the past seven years, was the first to observe horses eating bamboo while tracking pandas equipped with GPS collars.

“It didn’t take particular panda expertise to know that something was amiss when we’d come upon horse-affected bamboo patches. They were in the middle of nowhere and it looked like someone had been in there with a lawn mower,” she explained. Following her observations, Hull checked around and found that some of Wolong’s farmers had begun raising horses, which were allowed to graze unattended until they were to be sold.

“It was an idea whose popularity skyrocketed,” the university said. “In 1998, only 25 horses lived in Wolong. By 2008, 350 horses lived there in 20 to 30 herds. To understand the scope of the problem, Hull and her colleagues put the same type of GPS collars they were using to track pandas on one horse in each of the four herds they studied.”

Over the course of a year, they compared the activity of the horses with three adult collared pandas in some of the same regions of the reserve, and also combined that information with habitat data. They found that horses not only dined on bamboo, but also preferred the same type of sunny, gently sloped living spaces as pandas.

“Pandas and horses eat about the same amount of bamboo, but a herd of more than 20 horses made for a feeding frenzy, decimating areas the reserve was established to protect,” the university said, adding that the horse issue has since been resolved, with the livestock banned from Wolong by the reserve’s managers.

Image 2 (below): Vanessa Hull with horses in Wolong. Credit: Michigan State University

Jargon Or Gibberish? How Springer And Other Scientific Journals Were Fooled By Computer-Generated Papers

redOrbit Staff & Wire Reports – Your Universe Online

For the average reader, the line between jargon-heavy scientific research and unintelligible gibberish is a fine one, but apparently Joe Sixpack isn’t the only one who occasionally has trouble telling the difference.

On Thursday, scientific journal publisher Springer announced that it would be removing 16 fake research papers from its archives after learning that they were essentially computer-generated nonsense. The firm said that they were tipped off by Dr. Cyril Labbé, a French researcher who published research on how to detect computer-generated papers last January in the journal Scientometrics.

According to AFP reporters Richard Ingham and Laurent Banguet, the fraudulent papers were created using SCIgen, a free program used to create pseudo-academic research. They were then submitted to computer science and engineering conferences and then printed in specialized, subscription-only publications.

Ingham and Banguet said that SCIgen allows users to produce “impressive-looking” fake research studies “stuffed with randomly-selected computer and engineering terms.” The document comes “complete with fake graphs and citations – essential features in scientific publishing” and “includes recent references to famous scientists.”

The AFP reporters also included an example of the SCIgen content: “Constant-time technology and access points have garnered great interest from both futurists and physicists in the last several years. After years of extensive research into superpages, we confirm the appropriate unification of 128-bit architectures and checksums.”

Similarly, on Friday, Jemima Lewis of The Telegraph detailed how she was able to use the program to develop a fake academic paper entitled “The Impact of Amphibious Models on Interposable Algorithms,” which featured what she referred to as a “memorable” opening line: “The exploration of lambda calculus has investigated neural networks, and current trends suggest that the construction of Byzantine fault tolerance will soon emerge.”
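Lewis’s experiment works because SCIgen is, at heart, a context-free grammar: templates with placeholder slots are recursively filled with randomly chosen jargon until only words remain. A minimal sketch of the idea follows; the rules and vocabulary here are invented for illustration and are not SCIgen’s actual grammar.

```python
import random

# Toy context-free grammar in the spirit of SCIgen. Nonterminals (SENTENCE,
# TOPIC) expand into randomly chosen phrases; everything else is a terminal.
GRAMMAR = {
    "SENTENCE": [
        "The exploration of TOPIC has investigated TOPIC.",
        "We confirm the appropriate unification of TOPIC and TOPIC.",
        "Current trends suggest that the construction of TOPIC will soon emerge.",
    ],
    "TOPIC": [
        "lambda calculus", "neural networks", "128-bit architectures",
        "checksums", "Byzantine fault tolerance", "superpages",
    ],
}

def expand(symbol, rng=random):
    """Recursively replace nonterminal placeholders until only jargon remains."""
    parts = []
    for token in rng.choice(GRAMMAR[symbol]).split():
        core = token.rstrip(".,")          # keep trailing punctuation intact
        if core in GRAMMAR:
            parts.append(expand(core, rng) + token[len(core):])
        else:
            parts.append(token)
    return " ".join(parts)

sentence = expand("SENTENCE")
```

Each run yields a grammatical but meaningless sentence, which is exactly why such output can slip past inattentive reviewers while remaining detectable by statistical tools like Dr. Labbé’s.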

“We are in the process of taking the papers down as quickly as possible. This means that they will be removed, not retracted, since they are all nonsense. A placeholder notice will be put up once the papers have been removed,” Springer said in its statement. “Furthermore, we are looking into our procedures to find the weakness that could allow something like this to happen, and we will adapt our processes to ensure it does not happen again.”

“For the moment, we are using detection programs and manpower to sift through our publications to determine if there are more SCIgen papers. We have also reached out to Dr. Labbé for advice and collaboration on how to go about this in the most effective manner,” it added. “We are confident that, for the vast majority of the materials we publish, our processes work. When flaws are detected by us, or brought to our attention by members of the scientific community, we aim to correct them transparently and as quickly as possible.”

Springer’s disclosure highlights a growing problem in the scientific publishing industry. Dr. Labbé’s work uncovered a total of 120 computer-generated papers that had been published by respected institutions in the US, Germany and China, Lewis explained. Over 100 of those papers were published by the US-based Institute of Electrical and Electronics Engineers (IEEE), The Guardian noted.

SCIgen was originally developed by MIT graduate students Jeremy Stribling, Dan Aguayo and Maxwell Krohn in 2005. The trio wanted to expose how scientific conferences would accept any type of academic paper so long as the authors were willing to pay the registration fees, the Guardian said.

The program took just a few days to complete, and the nonsensical paper it produced was sent to and accepted by a conference. While their efforts “revealed a farce that lay at the heart of science,” the UK newspaper said that SCIgen is “the hoax that keeps on giving” because Stribling, Aguayo and Krohn made the software free to download and, as Dr. Labbé’s research proves, “scientists have been using it in their droves.”

“There will always be individuals who try to undermine existing processes in order to prove a point or to benefit personally. Unfortunately, scientific publishing is not immune to fraud and mistakes, either,” Springer concluded. “The peer review system is the best system we have so far and this incident will lead to additional measures on the part of Springer to strengthen it.”

Microscopic Creepy-Crawly Discovered By Ohio State Graduate Student

[ Watch the Video: Digging Up A New Mite Species ]

April Flowers for redOrbit.com – Your Universe Online

It may resemble a worm, but it’s actually a previously undiscovered microscopic species of mite discovered on The Ohio State University campus.

Discovered by Samuel Bolton, a graduate student at Ohio State, the mite was officially named Osperalycus tenerphagus (or as it is affectionately known – the “Buckeye Dragon Mite”). Osperalycus tenerphagus is Latin for “mouth purse” and “tender feeding,” referring to its complex and highly unusual oral structure.

Rather than the mythological winged dragon, the mite resembles the snake-like Chinese dancing dragons that appear in New Year festivals. It does not, however, resemble the typical mite, which is characterized by a large round body and tough external surface. The adult O. tenerphagus is just 600 microns long (slightly over half a millimeter) and cannot be seen with the naked eye.

“It is incredibly intricate despite being the same size as some single-cell organisms,” Bolton, who is a doctoral student in evolution, ecology and organismal biology, told Ohio State’s Emily Caldwell. “That’s the fascinating thing about mites and arthropods – mites have taken the same primitive and complex form and structure that they’ve inherited and shrunken everything down. So we’re dealing with complexity at an incredibly small scale.” Bolton described his discovery online in the Journal of Natural History. O. tenerphagus is the fifth species from the worm-like family Nematalycidae to be described, and only the second in North America.

Initial examination of the mites collected from silty clay loam soil across the street from the acarology lab suggested that Bolton had discovered a novel species. Bolton collected his mites from a soil depth of about 20 inches. When he examined them under a compound microscope, he found that they had numerous straight hairs all along their bodies (known as setae) that didn’t match any of the known members of this family. The mites use these hairs to feel their way around.

Bolton was surprised to find the mites in a clay-like patch of earth as Nematalycidae are more closely linked to sandy soils. He thinks the key to finding the mites was digging 20 inches down.

Bolton was unable to learn all the details of his extraordinary find until a year later when he was able to examine the mite in a low-temperature scanning electron microscope (LT-SEM) run by the US Department of Agriculture.

Bolton used LT-SEM to capture high-resolution images of these tiny creatures. He marveled at the machinery of their mouths, which have structures called rutella that typically function much like teeth in other mites. In these mites, the rutella instead support a pouch-like vessel in the front of the mouth. Bolton believes that the pouch acts like a nutcracker, holding microorganisms in place while the internal pincers puncture them and suck up their fluid contents.

To obtain images of mites of this size and body type, cold-temperature scanning is necessary so that they aren’t crushed by the intense vacuum effect of a normal electron microscope. The research team, which included Hans Klompen, professor of evolution, ecology and organismal biology at Ohio State, Gary Bauchan of the USDA Electron and Confocal Microscope Unit and Ronald Ochoa of the USDA Systematic Entomology Laboratory, used liquid nitrogen to freeze the mites immediately upon collection. This allowed the team to obtain images of the mites just as they appeared in their natural habitat.

They found that although the mite’s movement and muscle pattern do resemble a worm’s, it is unable to alter its diameter the way a worm can.

The mites have an external surface that resembles abacus beads. They “are like miniature accordions,” Bolton said. “It’s a case of convergent evolution – they have the same basic way of moving as worms, insofar as their cuticle extends and contracts, but they also have legs and, to some extent, still use them. The worm-like motion helps them move around through tight spaces.”

The Nematalycidae family of mites are the evolutionary descendants of ancient groups of mites whose fossils date back 400 million years, to a time when the environment was arid throughout much of the world.

“They’re well adapted to living in extremely adverse environments – which makes them extremophiles. They’re also fascinating to look at, and are interesting for addressing ecological and evolutionary questions,” he said. “Because of their small size, there is very little understanding of how mites interact with their environment or other organisms.”

Bolton will continue his research by describing the mite’s complex oral structure, and he hopes to identify specifically what it uses for food.

Instrument Packages Help Researchers Get A Better Understanding Of Sharks

April Flowers for redOrbit.com – Your Universe Online
Researchers from the University of Hawaii and the University of Tokyo are gaining novel insights into how one of the most feared and least understood ocean predators swims, eats and lives — all courtesy of instruments strapped onto and ingested by sharks.

The sharks were outfitted with sophisticated sensors and video recorders to measure and see where they are going, how they are getting there, and what they do once they reach their destinations.

Using instruments ingested by sharks and other top predators, like tuna, a new project is gaining insight into these animals’ feeding habits. The instruments use electrical measurements to track ingestion and digestion of prey. These measurements help scientists understand where, when and how much sharks and other predators are eating, as well as what they are eating.

The ‘shark’s eye view’ of the ocean is providing the researchers with a greater understanding than ever before of the lives of these fish in their natural environment.

“What we are doing is really trying to fill out the detail of what their role is in the ocean,” said Carl Meyer, an assistant researcher at the Hawaii Institute of Marine Biology at the University of Hawaii at Manoa. “It is all about getting a much deeper understanding of sharks’ ecological role in the ocean, which is important to the health of the ocean and, by extension, to our own well-being.”

The team used the sensors and video recorders to capture unprecedented images of sharks of different species swimming in schools. They were also able to capture the shark species interacting with other fish and moving in repetitive loops across the sea bed. Contrary to what researchers had previously thought, the sharks used powered swimming more often than a gliding motion to move through the water. They also found that deep-sea sharks swim in slow motion compared to shallow water species.

“These instrument packages are like flight data recorders for sharks,” Meyer said. “They allow us to quantify a variety of different things that we haven’t been able to quantify before.”

“It has really drawn back the veil on what these animals do and answered some longstanding questions,” he added.

Meyer collaborated with Kim Holland, also a researcher at the Hawaii Institute of Marine Biology. The two presented their research at the 2014 Ocean Sciences Meeting, co-sponsored by the Association for the Sciences of Limnology and Oceanography, The Oceanography Society and the American Geophysical Union.

Meyer notes that sharks are at the top of the ocean food chain, making them an important part of the marine ecosystem. Gaining more knowledge of these fish will help researchers better understand the flow of energy through the ocean. Prior to this study, most shark research had observed the fish in captivity or tracked them only to see where they traveled.

The findings could help shape future conservation and resource management efforts, as well as inform public safety measures. Holland added that the instruments being used to study the feeding habits of sharks could have commercial uses, including some for aquaculture.

Image 2 (below): A sixgill shark with a combined sensor and video recorder attached to it swims through the ocean. The instruments are giving scientists a “shark’s eye” view of the ocean and revealing new findings about shark behavior, according to research being presented at the Ocean Sciences Meeting. Credit: Mark Royer/University of Hawaii

Differences In The Motor Cortex Possibly Linked To Insomnia

redOrbit Staff & Wire Reports – Your Universe Online

The movement-based part of an insomniac’s brain tends to be more active and demonstrate greater neuroplasticity than the same region in good sleepers, researchers from Johns Hopkins Medical Institution report in the March issue of the journal Sleep.

Study leader Rachel E. Salas, an assistant professor of neurology at Johns Hopkins, and her colleagues found that the motor cortex of people with chronic insomnia showed increased plasticity, being more adaptable to change than the same region in people who do not struggle to get a good night’s sleep.

In addition, the researchers found more “excitability” among neurons in this region in patients who frequently have difficulty sleeping, which suggests that insomniacs live in a constant state of elevated information processing that could interfere with their attempts to get enough slumber.

“Insomnia is not a nighttime disorder,” Dr. Salas said in a statement Friday. “It’s a 24-hour brain condition, like a light switch that is always on. Our research adds information about differences in the brain associated with it.”

They hope that their findings, which were obtained using a painless and noninvasive process known as transcranial magnetic stimulation (TMS), can lead to improved diagnosis and treatment of this common and often difficult-to-manage sleep disorder, which impacts an estimated 15 percent of all Americans.

TMS, which has been approved by the US Food and Drug Administration (FDA) to treat some individuals suffering from depression, delivers electromagnetic currents to precise locations in the brain. The process temporarily and safely disrupts the function of a targeted area (such as the mood control region of the brain).

The researchers recruited 28 adults, 18 of whom had suffered from insomnia for at least one year, and outfitted each of them with an electrode on their dominant thumbs and an accelerometer to measure the thumb’s speed and direction. Each was then given a series of 65 electrical pulses using TMS, which stimulated parts of the motor cortex as the study authors looked for involuntary thumb movements linked to the procedure.

“Subsequently, the researchers trained each participant for 30 minutes, teaching them to move their thumb in the opposite direction of the original involuntary movement. They then introduced the electrical pulses once again,” the institute said. The goal was to measure the extent to which each of the participants’ brains could be conditioned to move their thumbs involuntarily in the newly trained direction, indicating the plasticity of their motor cortexes.

Since decreased daytime memory and concentration have been linked to a lack of nighttime sleep, Dr. Salas and her associates believed that the brains of good sleepers would be more easily retrained. However, their research indicates that the opposite is true: the brains of men and women suffering from chronic insomnia were found to have far more neuroplasticity than those of their counterparts.

According to the Johns Hopkins researchers, “the origins of increased plasticity in insomniacs are unclear, and it is not known whether the increase is the cause of insomnia. It is also unknown whether this increased plasticity is beneficial, the source of the problem or part of a compensatory mechanism to address the consequences of sleep deprivation associated with chronic insomnia.”

Dr. Salas said that the increased metabolism, elevated cortisol levels and heightened anxieties described in chronic insomnia could be associated with increased plasticity in some way. There is no objective test for insomnia, nor is there a single treatment plan that is effective for all patients, but the new study shows that TMS could help diagnose the condition and potentially prove to be a way to treat it by reducing excitability.

Skin Cancer Risk May Have Driven Evolution Of Black Skin

Early humans may have evolved black skin to protect against a very high risk of dying from ultraviolet light (UV)-induced skin cancer, a new analysis concludes.
Skin cancer has usually been rejected as the most likely selective pressure for the development of black skin because of a belief that it is only rarely fatal at ages young enough to affect reproduction.
But a new paper, published in Proceedings of the Royal Society B, cites evidence that black people with albinism from parts of Africa with the highest UV radiation exposure, and where humans first evolved, almost all die of skin cancer at a young age.
The paper, by Professor Mel Greaves at The Institute of Cancer Research, London, cites studies showing that 80 per cent or more of people with albinism from African equatorial countries such as Tanzania and Nigeria develop lethal skin cancers before the age of 30.
Albinism is also linked to skin cancer in indigenous populations of other tropical countries with high, year-round UV exposure such as Panama.
Professor Greaves argues that the fact that people with albinism, which is caused by genetic changes that prevent the production of melanin, develop cancer at reproductive ages is indirect but persuasive evidence that early, pale-skinned humans were under strong evolutionary pressure to develop melanin-rich skin in order to avoid lethal skin cancer.
Genetic evidence suggests that the evolution of skin rich in eumelanin, which is brown-black in colour, occurred in early humans between 1.2 and 1.8 million years ago in the East African savannah. Early humans, having lost most of their body hair (probably to facilitate heat loss), probably had pale skin containing pheomelanin, like our nearest surviving relatives, chimpanzees. Pheomelanin, characteristic of white skin, is red-yellow and packaged into smaller stores under the skin than eumelanin, characteristic of black skin. Eumelanin provides a much more effective barrier against the DNA damage that causes skin cancers, providing almost complete protection.
Most scientists agree the development of black skin occurred in early humans primarily because of the ability of eumelanin to effectively absorb ultraviolet radiation, but they have debated exactly how this could have protected early humans against lethal diseases.
As well as affecting skin cancer risk, increased black melanin production could have given other benefits that helped individuals to pass on their genes to the next generation, such as preventing damage to sweat glands or the destruction of folate, which is important in foetal development.
While there could have been many benefits of having black skin in Africa (and retaining it in New Guinea), Professor Greaves argues that individuals with albinism and no protective benefit from melanin almost all die young from cancer.
Professor Greaves is Director of the new Centre for Evolution and Cancer at The Institute of Cancer Research (ICR). The Centre aims to gain new insights into how individual cancers evolve – the process behind the development of drug resistance, and the often extraordinary genetic diversity within single tumours – and to uncover clues in our evolutionary history that could help us understand why human cancers develop.
Professor Mel Greaves, Director of the Centre for Evolution and Cancer at The Institute of Cancer Research, London, said:
“Charles Darwin thought variation in skin colour was of no adaptive value and other investigators have dismissed cancer as a selective force in evolution. But the clinical data on people with albinism, particularly in Africa, provide a strong argument that lethal cancers may well have played a major role in early human evolution as an important factor in the development of skin rich in dark pigmentation – in eumelanin.”


A Predictive Fitness Model For Influenza

Researchers at Columbia University and the University of Cologne have created a new model to successfully predict the evolution of the influenza virus from one year to the next. This advance in our understanding of influenza suggests a new, systematic way to select influenza vaccine strains. The findings appear in Nature on Feb. 26.

The flu is one of the major infectious diseases in humans. Seasonal strains of the influenza A virus account for about half a million deaths per year. In a concerted effort, WHO and its Collaborating Centers have closely monitored the evolution of the seasonal H3N2 influenza strains for over 60 years. Based on these data, influenza strains are selected for vaccine production twice per year. Because influenza is a fast-evolving pathogen, the selection of optimal vaccines is a challenging global health issue.

In recent years, it became clear that the evolution of the flu is a complex process. Different influenza strains compete with each other; the race is about how to successfully infect humans. This prompted Marta Łuksza, of Columbia’s Biological Sciences department, and Michael Lässig, of the Institute for Theoretical Physics at the University of Cologne, to ask the question: Can we predict which of these competitors will win the race? “This was a challenge for an evolutionary biologist because there are very few systems in the wild for which quantitative predictions of their evolution are at all feasible,” says Łuksza. “It was also a computational and theoretical challenge. While traditional evolutionary thinking is about reconstruction of the past, we had to develop ideas on how to reach into the future.” Most importantly, the scientists had to find out which parts of the system can actually be predicted and which are random. In their approach, they used ideas from physics and computer science.

Łuksza and Lässig used Darwin’s principle: survival of the fittest. But what determines how fit an influenza virus is? First, they considered innovation: the virus had to keep a high rate of mutations in order to escape from human immune response. But they also included conservation: these mutations must not compromise the essential functions of a virus, such as the correct folding of its proteins. Through studying the genomes of the virus, they devised a way to predict which viral strains have the optimal combination of innovation and conservation.
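The balance of innovation and conservation described above can be illustrated with a toy scoring function in Python. Everything below is hypothetical and for illustration only: the sequences, epitope positions and weights are made up, and the published model is considerably more sophisticated than this sketch.

```python
# Toy fitness score in the spirit of the idea described above (hypothetical
# weights, sites and sequences, not the published implementation): mutations
# at antigenic "epitope" sites help the virus escape immunity (innovation),
# while mutations elsewhere tend to disrupt protein function (conservation).

def fitness_score(strain, reference, epitope_sites, benefit=1.0, cost=0.5):
    """Score a strain relative to a reference sequence."""
    score = 0.0
    for i, (a, b) in enumerate(zip(strain, reference)):
        if a != b:
            score += benefit if i in epitope_sites else -cost
    return score

reference = "MKTIIALSYI"            # hypothetical reference sequence
epitope_sites = {2, 5, 7}          # hypothetical antigenic positions

# A strain mutated at an epitope site outscores one mutated elsewhere.
escape_variant = "MKAIIALSYI"      # mutation at site 2 (an epitope site)
damaged_variant = "MKTIHALSYI"     # mutation at site 4 (a non-epitope site)
print(fitness_score(escape_variant, reference, epitope_sites))   # 1.0
print(fitness_score(damaged_variant, reference, epitope_sites))  # -0.5
```

In a model of this general shape, strains with the highest predicted fitness are the ones expected to dominate in the next season, which is what makes the score useful for vaccine-strain selection.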

While Łuksza and Lässig focused on influenza, their approach highlights a general link between evolution and its consequences for epidemiology that is relevant for many fast-evolving pathogens. In a broader context, it touches upon the fundamental question of how predictable biological evolution is. “There is clearly no general answer to this question,” says Łuksza. “But our analysis shows under what auspices limited predictions may be successful.” Further extensive tests with global influenza data would help determine whether their method would lead to improved vaccines.


Silver Nanoparticles Can Penetrate Cells And Cause Damage

University of Southern Denmark

Endocrine disrupters are not the only worrying chemicals that ordinary consumers are exposed to in everyday life. Silver nanoparticles, found in products such as dietary supplements, cosmetics and food packaging, now worry scientists as well. A new study from the University of Southern Denmark shows that nano-silver can penetrate our cells and cause damage.

Silver has an antibacterial effect, and the food and cosmetic industries therefore often coat their products with silver nanoparticles. Nano-silver can be found in drinking bottles, cosmetics, band-aids, toothbrushes, running socks, refrigerators, washing machines and food packaging, among other products.

“Silver as a metal does not pose any danger, but when you break it down to nano-sizes, the particles become small enough to penetrate a cell wall. If nano-silver enters a human cell, it can cause changes in the cell”, explain Associate Professor Frank Kjeldsen and Thiago Verano-Braga, PhD, of the Department of Biochemistry and Molecular Biology at the University of Southern Denmark.

Together with their research colleagues they have just published the results of a study of such cell damages in the journal ACS Nano.

The researchers examined human intestinal cells, as they consider these to be the most likely to come into contact with nano-silver ingested with food.

“We can confirm that nano-silver leads to the formation of harmful, so-called free radicals in cells. We can also see that there are changes in the form and amount of proteins. This worries us”, say Frank Kjeldsen and Thiago Verano-Braga.

A large number of serious diseases are characterized by the fact that there is an overproduction of free radicals in cells. This applies to cancer and neurological diseases such as Alzheimer’s and Parkinson’s.

Kjeldsen and Verano-Braga emphasize that their research was conducted on human cells in a laboratory, not on living people. They also point out that they do not know how large a dose of nano-silver a person must be exposed to before cellular changes emerge.

“We don’t know how much is needed, so we cannot conclude that nano-silver can make you sick. But we can say that we must be very cautious and worried when we see an overproduction of free radicals in human cells”, they say.

Nano-silver is also sold as a dietary supplement, promising antibacterial, anti-flu and cancer-inhibiting effects. It is also claimed to help against low blood counts and bad skin. In the EU, marketing dietary supplements and foods with claims of medical effects is not allowed, but nano-silver is easy to find and buy online.

In the wake of the University of Southern Denmark-research, the Danish Veterinary and Food Administration now warns against taking dietary supplements with nano-silver.

“The recent research strongly suggests that it can be dangerous”, says Søren Langkilde from the Danish Veterinary and Food Administration to the Danish Broadcasting Corporation (DR).

Ref: Insights into the Cellular Response Triggered by Silver Nanoparticles using Quantitative Proteomics. ACS NANO. http://dx.doi.org/10.1021/nn4050744

Image 2 (below): Thiago Verano-Braga, Ph.D., of the University of Southern Denmark. Credit: Birgitte Svennevig/University of Southern Denmark

Researchers Find No Relief From Extreme Heat During Global Warming Hiatus

redOrbit Staff & Wire Reports – Your Universe Online

Even though climate scientists claim that the rise in average global temperatures has slowed over the past 10 or 20 years, research published this week in the journal Nature Climate Change has found a continued increase in extreme hot temperatures during that time.

In the study, Dr. Lisa Alexander of the University of New South Wales (UNSW) in Australia and an international team of colleagues discovered a dramatic increase in both the number and area of extremely hot land temperatures, despite the supposed global warming hiatus.

“It quickly became clear, the so-called ‘hiatus’ in global average temperatures did not stop the rise in the number, intensity and area of extremely hot days,” Dr. Alexander explained in a statement Wednesday. “Our research has found a steep upward tendency in the temperatures and number of extremely hot days over land and the area they impact, despite the complete absence of a strong El Niño since 1998.”

She and her fellow researchers opted to examine the extreme end of the temperature spectrum, as that region is where the effects of global warming tend to have the earliest and deepest impact on society – as demonstrated by the extreme conditions experienced by inhabited areas in Australia over the past two summers.

Their efforts revealed that, on average, extremely hot events were impacting more than twice the area they did 30 years ago. Dr. Alexander’s team reviewed data on hot days dating back more than three decades, comparing the temperatures of every day of the year to the same calendar days from 1979 through 2012.

The investigators selected the hottest 10 percent of all days over that period and classified them as hot temperature extremes. They found that from 1979 through 2012, the regions that saw 10, 30 or 50 extremely hot days above that average showed the greatest increases in the number of extremely hot days and in the surface area covered by that heat. Furthermore, those trends continued through the supposed hiatus that started in 1998.
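The percentile-based definition of a hot extreme described above can be sketched in a few lines of Python. This is a simplified illustration using synthetic temperatures for a single location; the variable names and data are hypothetical, and the study’s actual analysis covers gridded global observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily maximum temperatures for one location:
# rows = years (1979-2012), columns = calendar days.
n_years, n_days = 34, 365
seasonal_cycle = 20 + 10 * np.sin(2 * np.pi * np.arange(n_days) / n_days)
temps = seasonal_cycle + rng.normal(0, 3, size=(n_years, n_days))

# For each calendar day, the threshold is the 90th percentile of that
# day's temperatures across all years, so exceedances are, by
# construction, roughly the hottest 10 percent of days.
thresholds = np.percentile(temps, 90, axis=0)

# Flag "extremely hot" days: those exceeding their own calendar-day
# threshold, which removes the seasonal cycle from the comparison.
hot_extremes = temps > thresholds

# The study then tracked how counts like this, and the land area they
# cover, changed over time.
hot_days_per_year = hot_extremes.sum(axis=1)
print(hot_days_per_year.mean())
```

Defining extremes relative to each calendar day, rather than against a single annual threshold, is what allows the comparison to pick out unusually hot days in winter as well as in summer.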

“Our analysis shows there has been no pause in the increase of warmest daily extremes over land and the most extreme of the extreme conditions are showing the largest change,” said postdoctoral climate scientist Dr. Markus Donat.

He added that the researchers also found that “regions that normally saw 50 or more excessive hot days in a year saw the greatest increases in land area impact and the frequency of hot days. In short, the hottest extremes got hotter and the events happened more often.”

The study authors said their findings indicate that, even though global annual average near-surface temperatures are widely used to gauge global warming, they do not account for all parts of the climate system. Even if global annual mean temperatures grow stagnant for a period of 10 to 20 years, it does not necessarily mean that warming has stopped – other factors, such as extreme temperatures, must also be considered.

“It is important when we take global warming into account, that we use measures that are useful in determining the impacts on our society,” explained lead researcher Sonia Seneviratne, a professor with the ETH Zurich Institute for Atmospheric and Climate Science.

“Global average temperatures are a useful measurement for researchers but it is at the extremes where we will most likely find those impacts that directly affect all of our lives,” she added. “Clearly, we are seeing more heat extremes over land more often as a result of enhanced greenhouse gas warming.”

Image 2 (below): This image shows a time series of temperature anomalies for hot extremes over land (red) and global mean temperature (black, blue). The anomalies are computed with respect to the 1979-2010 time period. The time series are based on the ERA-Interim 95th percentile of the maximum temperature over land (Txp95_Land, red) and the global (ocean + land) mean temperature (Tm_Glob) in ERA-Interim (blue) and HadCRUT4 (black). Credit: Nature Climate Change commentary by authors: No pause in the increase of hot temperature extremes

Physicians’ Stethoscopes More Contaminated Than Palms of Their Hands

A comparative analysis shows that stethoscope diaphragms are more contaminated than the physician’s own thenar eminence (group of muscles in the palm of the hand) following a physical examination.
Credit: Mayo Clinic Proceedings
[ Read the Article: Stethoscopes Can Be More Contaminated Than Doctors’ Hands ]

Internet Is Turning 25, But Not Everyone Is Celebrating

Lee Rannals for redOrbit.com – Your Universe Online

According to a new survey by the Pew Research Center, 15 percent of Internet users believe the World Wide Web has been bad for society.

The Internet will be turning 25 years old this year, but the latest survey sheds light on the fact that not everyone is as appreciative of the global cyber network as most users have been.

“Using the Web—browsing it, searching it, sharing on it—has become the main activity for hundreds of millions of people around the globe. Its birthday offers an occasion to revisit the ways it has made the internet a part of Americans’ social lives,” Pew said in a report marking the 25th anniversary of the Internet.

According to the new survey, 87 percent of American adults now use the Internet, while the Web has saturated 99 percent of American households making $75,000 per year or more. They also saw that 68 percent of adults connect to the Internet with smartphones or tablets.

Pew said that over the course of its polling, it has seen adult ownership of cell phones jump from 53 percent in 2000 to 90 percent now. Smartphone ownership has grown from 35 percent in 2011 to 58 percent now.

The survey asked respondents whether it would be hard to give up their technologies and found that those who use the Internet and smartphones feel like their devices are increasingly essential. However, respondents who use more traditional technologies like landlines and televisions are less reluctant to give up their respective devices.

This Internet obsession has grown over the last eight years, according to Pew.

The survey found that 53 percent of Internet users say it would be at minimum “very hard” to give up the Internet, while in 2006 this number was just at 38 percent. Overall, 46 percent of all adults say that the Internet would be very hard to give up.

“In addition to this enthusiasm, a notable share of Americans say the internet is essential to them,” Pew said. “Among those internet users who said it would be very hard to give up net access, most said being online was essential for job-related or other reasons.”

When considering the whole US population, about four in ten adults feel they absolutely need to have Internet access.

Lastly, Pew asked participants: “Overall, when you add up all the advantages and disadvantages of the internet, would you say the internet has mostly been a good thing or a bad thing for society?” The survey found that 15 percent of respondents say the Internet has been bad for society, while 8 percent said it has been both good and bad.

While there’s always a critic, 76 percent of US adults believe that what Sir Tim Berners-Lee created in 1989 is a good thing. Since its inception, the Internet has connected the world as never before, opening up opportunities for a new generation that previous generations couldn’t have dreamed of.

New Austrian Study Suggests Super-Earths Are Unfit For Life

John P. Millis, Ph.D. for redOrbit.com – Your Universe Online

The last two decades have brought unprecedented understanding of worlds outside of our solar system. The launch of the Kepler space telescope and other instruments has allowed astronomers to find and characterize hundreds of new planets, with thousands more candidates awaiting confirmation.

As our technology continues to improve, we will be able to isolate smaller and smaller planets, eventually identifying Earth-sized examples with regularity, and hopefully some with habitable conditions. To this point, however, nearly all of the discoveries have been of planets several times the size of Earth — many of these larger even than Jupiter. Most of these worlds are quite different than the planets in our little corner of the galaxy, so astronomers are now working to understand how they form and evolve.

A team of scientists has created a new model suggesting that super-Earths, even those in the habitable zone – the region around a star where the temperature would allow for the existence of liquid water on a planet – would be unfit for life.

Solar systems form out of clouds dominated by hydrogen and helium, and trace amounts of other elements. Over time dust and rocky material heat up and clump together forming what will eventually be planets around the emerging star. As these cores increase in size and mass, their gravitational influence accretes hydrogen gas from the dwindling cloud. Some of this gas will be ejected from the forming world by the ultraviolet light of the star.

This battle of accumulation and removal proceeds until a relative equilibrium is reached, a process that will be dependent on the masses of the planet and star, and their relative distance from each other (other factors such as the stellar brightness in ultraviolet radiation are also important).

In a new study, Dr. Helmut Lammer, from the Space Research Institute (IWF) of the Austrian Academy of Sciences, modeled the accumulation of gas by planetary cores between 0.1 and 5 times the mass of the Earth.

Planets with a density similar to Earth’s will struggle to capture gas until they surpass 0.5 times the Earth’s mass. When the forming object is similar in mass to Earth, there is a tug-of-war between the accumulation and dissipation of gas, so the atmosphere is not likely to become highly dense.

However, for the highest mass cores – the aptly named super-Earths – almost all of the gas is bound to the surface because of their stronger gravitational fields. In fact, the so-called super-Earths may be more like mini-Neptunes.

“Our results suggest that worlds like these two super-Earths may have captured the equivalent of between 100 and 1000 times the hydrogen in the Earth’s oceans, but may only lose a few percent of it over their lifetime,” reports Lammer. “With such thick atmospheres, the pressure on the surfaces will be huge, making it almost impossible for life to exist.”

Lammer’s study is published in the Monthly Notices of the Royal Astronomical Society.

Older Fathers Linked To Higher Risk Of Cognitive, Behavioral Issues In Their Offspring

Brett Smith for redOrbit.com – Your Universe Online

A large new study conducted by a team of American and Swedish researchers has found a connection between paternal age and the risk of a child developing cognitive or behavioral problems.

The finding is particularly alarming considering the recent trend of couples putting off raising a family to pursue their careers or other interests.

“We were shocked by the findings,” said study author Brian D’Onofrio, an associate professor of psychology and brain sciences at Indiana University Bloomington.

“The specific associations with paternal age were much, much larger than in previous studies,” D’Onofrio added. “In fact, we found that advancing paternal age was associated with greater risk for several problems, such as ADHD, suicide attempts and substance use problems, whereas traditional research designs suggested advancing paternal age may have diminished the rate at which these problems occur.”

Published in JAMA Psychiatry, the study was based on a massive data set: everyone born in Sweden from 1973 to 2001. The study team found that a child born to a 45-year-old father is 3.5 times more likely to develop autism, 13 times more likely to develop ADHD, twice as likely to develop a psychotic disorder, 25 times more likely to receive a diagnosis for bipolar disorder and 2.5 times more likely to exhibit suicidal behavior or a substance abuse problem when compared to a child with a 24-year-old father.

For most of these issues, the odds of development increased steadily with paternal age, indicating there is no age threshold at which childbearing suddenly becomes riskier.

The study team compared siblings in their research, an approach that holds constant many of the factors shared by children growing up in the same household. When they did this, they found that the associations with advancing paternal age were much stronger than in the general population. The team also compared first cousins to control for sibling relationships and birth order.

The authors also considered parents’ highest level of education and income, because older parents are considered to be more mature and financially secure. However, the findings were extremely consistent, as the connections between mental problems and advancing paternal age continued.

“The findings in this study are more informative than many previous studies,” D’Onofrio said. “First, we had the largest sample size for a study on paternal age. Second, we predicted numerous psychiatric and academic problems that are associated with significant impairment. Finally, we were able to estimate the association between paternal age at childbearing and these problems while comparing differentially exposed siblings, as well as cousins.”

“These approaches allowed us to control for many factors that other studies could not,” he concluded.

The conclusions of the study are particularly troubling as the average age for having a child has been escalating for both men and women over the last four decades. For men the average age is three years older than it was in 1970, according to the study researchers. They noted that the implications of this trend are yet to be understood.

“While the findings do not indicate that every child born to an older father will have these problems,” D’Onofrio said, “they add to a growing body of research indicating that advancing paternal age is associated with increased risk for serious problems. As such, the entire body of research can help to inform individuals in their personal and medical decision-making.”

Seeing Is Better Than Hearing When It Comes To Retaining Memories

Brett Smith for redOrbit.com – Your Universe Online

Having a hard time remembering someone’s name or a phone number? It turns out that seeing is better than hearing when it comes to memory.

According to a new study from the University of Iowa published in the journal PLOS ONE, people are more apt to remember something they see or touch compared to something they hear.

“As it turns out, there is merit to the Chinese proverb ‘I hear, and I forget; I see, and I remember,’” said study author James Bigelow, a graduate student at UI.

“We tend to think that the parts of our brain wired for memory are integrated,” said co-author Amy Poremba, associate professor of psychology at the university. “But our findings indicate our brain may use separate pathways to process information. Even more, our study suggests the brain may process auditory information differently than visual and tactile information, and alternative strategies – such as increased mental repetition – may be needed when trying to improve memory.”

In the new study, over 100 UI undergraduate students were given various sounds, visuals and objects to feel. The team found that volunteers were less likely to recall the sounds they had heard than the other stimuli.

In an initial experiment evaluating short-term memory, volunteers were told to pay attention to tones they heard through headphones, examine various shades of red squares, and feel subtle vibrations by gripping an aluminum bar. Each set of stimuli was separated by delays ranging from one to 32 seconds.

Although students’ memory diminished as the delays grew longer, the decrease was much greater for sounds, beginning around four to eight seconds after exposure.

Poremba compared that relatively quick time span to forgetting a phone number that you just heard.

“If someone gives you a number, and you dial it right away, you are usually fine. But do anything in between, and the odds are you will have forgotten it,” she said.

In a second trial, the researchers looked at participants’ memory by using things they might come across on a daily basis. Participants heard audio recordings of dogs barking, watched muted videos of a basketball game, and held typical objects they were not allowed to see. The scientists discovered that between an hour and a week later, volunteers were worse at recalling the sounds they had heard, but their memory for visual and tactile objects was about the same.

Both trials show that the way the mind processes and stores sound may be unlike the way it analyzes and stores other kinds of memories, the study team said.

“As teachers, we want to assume students will remember everything we say. But if you really want something to be memorable you may need to include a visual or hands-on experience, in addition to auditory information,” Poremba said.

The study team noted that tests with non-human primates have shown that they also excel at visual and tactile memory tasks while having a problem with auditory tasks. This led the researchers to theorize that humans’ weakness for remembering sounds probably has its origins in the primate brain.