Taste Receptor Gene Determines How Well We Sense Food

Brett Smith for redOrbit.com – Your Universe Online

Some people’s sense of taste is so acute that they strongly dislike foods most others enjoy, such as spicy dishes or ‘hoppy’ beers.

According to a new report published in the journal Frontiers in Integrative Neuroscience, these so-called supertasters get their increased sensitivity from a variation in the taste receptor gene TAS2R38 – not from a higher-than-normal number of taste buds, as had previously been thought.

“There is a long-held belief that if you stick out your tongue and look at the bumps on it, then you can predict how sensitive you are to strong tastes like bitterness in vegetables and strong sensations like spiciness,” said study author Nicole Garneau, head of the Department of Health Sciences at the Denver Museum of Nature & Science. “The commonly accepted theory has been that the more bumps you have, the more taste buds you have and therefore the more sensitive you are.”

For the study, the authors had more than 3,000 volunteers visiting the Denver museum’s Genetics of Taste Lab stick out their tongues so that the papillae on them could be counted. Volunteers also had their sensitivity to the bitter-tasting compounds phenylthiocarbamide and propylthiouracil measured. Of this initial group, only 394 subjects were included in the final analysis.

Study researchers then took cell swabs from this smaller group of volunteers to determine their DNA sequence at the known location of TAS2R38. The genetic analysis showed that certain variations in TAS2R38 make it more likely that someone is hypersensitive to bitter compounds. The researchers also found that the number of papillae – the bumps on the tongue that house taste buds – does not predict higher taste sensitivity.

“No matter how we looked at the data, we couldn’t replicate this long-held assumption that a high number of papillae equals supertasting,” Garneau said.

The study authors noted that some of their work was done with the help of over 130 volunteer citizen scientists who had been specially trained by Garneau and her colleagues. The study authors also called the term ‘supertaster’ antiquated and suggested using the more objective ‘hypergeusia’ instead.

“What we know and understand about how our bodies work improves greatly when we challenge central dogmas of our knowledge,” Garneau said. “This is the nature of science itself.”

“As techniques improve, so too does our ability to do science, and we find that what we accepted as truth 20, 30, or 100 years ago gets replaced with better theories as we gather new data, which advances science,” she added. “In this case, we’ve proven that with the ‘Denver Papillae Protocol’, our new method for objective analysis for papillae density, we were unable to replicate well-known studies about supertasting.”

A refined sense of taste can be very valuable – in fact, Britain’s Costa Coffee announced in 2009 that it had taken out a $14 million insurance policy on the tongue of one of its taste testers.

Image 2 (below): Citizen scientists in the Genetics of Taste Lab enroll Museum guests in the research study. Guests answer questions about themselves and mark their age and gender on the “Where Do You Fit” data tracking wall. The lab is a specialized learning area within Expedition Health, the Museum’s permanent health exhibition shown in the background. Credit: Denver Museum of Nature & Science.

Large Numbers Of Shadow Economy Entrepreneurs In Developing Countries

Imperial College London

There are large numbers of entrepreneurs in developing countries who aren’t registering their businesses with official authorities, hampering economic growth, according to new research.

Shadow entrepreneurs are individuals who manage a business that sells legitimate goods and services but do not register it, and therefore do not pay tax. They operate in a shadow economy, where business activities are carried out beyond the reach of government authorities. The shadow economy results in lost tax revenue, unfair competition for registered businesses and poor productivity – factors that hinder economic development. Because these businesses are unregistered, they also sit outside the protection of the law, leaving shadow economy entrepreneurs vulnerable to corrupt government officials.

In a study of 68 countries, Professor Erkko Autio and Dr Kun Fu from Imperial College Business School estimated that business activities conducted by informal entrepreneurs can make up more than 80 per cent of the total economic activity in developing countries. Types of businesses include unlicensed taxicab services, roadside food stalls and small landscaping operations.

This is the first time that the number of entrepreneurs operating in the shadow economy has been estimated.

The researchers found that Indonesia has the highest rate of shadow economy entrepreneurs, with a ratio of over 130 shadow economy businesses to every business that is legally registered. After Indonesia the highest rates of shadow economy entrepreneurs are found in India, the Philippines, Pakistan, Egypt and Ghana.

In contrast, the UK exhibits the lowest rate of shadow entrepreneurship among the 68 countries surveyed, with a ratio of only one shadow economy entrepreneur to some 30 legally registered businesses.
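The reported ratios translate into strikingly different shadow-economy shares of all businesses. A quick back-of-the-envelope sketch in Python, treating the article’s rounded ratios as exact for illustration:

```python
# Convert the reported shadow-to-formal business ratios into the
# shadow economy's share of all businesses.
# The ratios are the ones quoted in the article; treating them as exact
# point estimates is an illustrative assumption.
ratios = {
    "Indonesia": 130 / 1,  # ~130 shadow businesses per registered one
    "UK": 1 / 30,          # ~1 shadow entrepreneur per 30 registered
}

for country, r in ratios.items():
    share = r / (1 + r)    # shadow / (shadow + formal)
    print(f"{country}: ~{share:.1%} of businesses unregistered")
```

By this arithmetic, roughly 99 per cent of Indonesian businesses would be unregistered, versus about 3 per cent in the UK.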

The researchers also found that the quality of economic and political institutions has a substantial effect on entrepreneurs registering their businesses around the world.

Professor Erkko Autio of Imperial College Business School, co-author of the report, said:

“Understanding shadow economy entrepreneurship is incredibly important for developing countries because it is a key factor affecting economic development. We found that government policies could play a big role in helping shadow economy entrepreneurs transition to the formal economy. This is important because shadow economy entrepreneurs are less likely to innovate, accumulate capital and invest in the economy, which hampers economic growth.”

The researchers suggest that shadow entrepreneurs are highly sensitive to the quality of political and economic institutions. Where proper economic and political frameworks are in place, individuals are more likely to become ‘formal’ entrepreneurs and register their business, because doing so enables them to take advantage of laws and regulations that protect their company, such as trademarking legislation.

The researchers suggest that if India improved the quality of its democratic institutions to match Malaysia’s, it could boost its rate of formal economy entrepreneurs by up to 50 per cent, while cutting the rate of entrepreneurs working in the shadow economy by up to a third. The government would then benefit from additional revenue, such as taxes.

To create their league table, the researchers combined data from the Global Entrepreneurship Monitor (GEM) and the World Bank.

Daily 20-Minute Walks Could Help Seniors Stave Off Major Disability

April Flowers for redOrbit.com – Your Universe Online
Moderate exercise is the key to good health in nearly every situation. That is especially true for older adults, according to countless commercials and public health sources.
A new study, called the Lifestyle Interventions and Independence for Elders (LIFE) Study, found that just a 20-minute walk a day could help seniors hold off major disability and enhance the quality of their life.
LIFE is a collaboration between Yale University, the University of Florida, Southern Connecticut State University, Northwestern University, Pennington Biomedical Research Center, Stanford University, Tufts University, the University of Pittsburgh, and Wake Forest University. A total of eight field centers participated in the study, the results of which were published in a recent issue of the Journal of the American Medical Association (JAMA) and presented at the American College of Sports Medicine Conference.
Mobility is the ability to walk without assistance. For older adults, mobility is key to independent living and quality of life. Many older adults suffer a decrease in mobility, which is a risk factor for illness, hospitalization, disability and death. The LIFE study found that moderate physical activity, such as that 20-minute daily walk, helped seniors maintain their ability to walk at a rate 18 percent higher than those who did not exercise.
“The very purpose of the study is to provide definitive evidence that physical activity can truly improve the independence of older adults,” said principal investigator Marco Pahor, PhD, director of UF’s Institute on Aging.
Moderate exercise also helped prevent long-term mobility loss. The study found a 28 percent reduction in participants permanently losing the ability to walk easily.
“The fact that we had an even bigger impact on persistent disability is very good,” said co-principal investigator Jack Guralnik, PhD, a professor of epidemiology and public health at the University of Maryland School of Medicine, who also holds a faculty position at UF. “It implies that a greater percentage of the adults who had physical activity intervention recovered when they did develop mobility disability.”
For the purposes of the study, the researchers defined mobility as the ability to walk 1,300 feet, or approximately one-quarter of a mile. The research team said that although a quarter mile sounds arbitrary, for older adults it is significant.
“Four hundred meters is once around the track, or from the parking lot to the store, or two or three blocks around your neighborhood,” Guralnik said. “It’s an important distance in maintaining an independent life.”
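The distances quoted above can be reconciled with a quick unit conversion – 400 meters, 1,300 feet, and a quarter mile all agree to within about two percent. A simple illustrative sketch:

```python
# Sanity check on the study's distance benchmark: 400 meters,
# "1,300 feet" and "one-quarter of a mile" are roughly the same distance.
FEET_PER_METER = 3.28084
FEET_PER_MILE = 5280

print(400 * FEET_PER_METER)    # ~1312 feet in 400 meters
print(FEET_PER_MILE / 4)       # 1320 feet in a quarter mile
print(1300 / FEET_PER_METER)   # ~396 meters in 1,300 feet
```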
The research team recruited 1,635 men and women ages 70 to 89. The participants were mostly sedentary; they could still walk 1,300 feet in 15 minutes but were in danger of losing that ability. Pahor said that patients with low physical performance are rarely the subject of large studies because low performance predicts early death and higher rates of hospitalization and institutionalization. This is the largest randomized controlled trial on physical activity and health education ever conducted in older adults.
“These are people who are patients we see every day. This is why this study is so important: It includes a population that is typically understudied,” Pahor said.
The participants were followed for an average of 2.6 years between February 2010 and December 2013. The cohort was separated into two study groups. One group of 818 patients walked 150 minutes per week and did strength, flexibility and balance training. This group visited field centers twice a week for monitoring. The remaining 817 patients attended health education classes and performed stretching exercises.
The participants were assessed by staff members every six months. They were evaluated for their ability to walk, their body weight, blood pressure and pulse rate, along with other measurements. The participants were assigned to their group randomly, and the staff was not informed which patient was in which group.
One surprising result: the number of hospitalizations in the physically active group was slightly higher than in the education group, though the increase was not statistically significant. The team suggests this could be due to the physical activity group’s more frequent contact with research staff, which gave them more opportunities to report hospitalizations. The activity could also have triggered underlying heart trouble and other health problems, which warrants further study, according to Pahor.
“It’s quite a vulnerable and high-risk population,” Pahor said. “Both age and poor health were factors. We selected people who had low physical performance, which is a strong predictor for future morbidity, hospitalization, institutionalization and mortality.”
Before the launch of the main LIFE trial, Wendy Kohrt, PhD, professor of medicine in the division of geriatric medicine at the University of Colorado, helped review the scientific merit of the study.
“As an exercise scientist, I believe this type of research is absolutely critical to establish scientific evidence on which to make recommendations for how lifestyle can beneficially influence health status,” said Kohrt.
“There is a general belief among the public and the scientific and medical communities that we know exercise is good for you, so why do we need to do more research in this area? However, we still do not know whether certain types or doses of exercise are better than others, particularly for specific health conditions or diseases. The LIFE trial demonstrated that a modest increase in physical activity has the potential to help older adults maintain functional independence.”
A great deal of data remains to be analyzed from the study, including the effects of physical activity on the emotional health of the participants. The team plans to continue their research to determine how physical activity impacted the participants’ physiological, social and biological factors.
“We want to change how people live,” said the director of the Yale field center, Thomas Gill, MD, the Humana Foundation Professor of Geriatric Medicine, who chaired the measurement committee, which was responsible for determining the main study outcomes. “Maintaining independence for older adults is both a public health and a clinical priority, and modifying lifestyle is an important approach to maintaining independence.”
Gill added, “Years from now, LIFE will be considered a landmark study, one that has informed policies to keep older persons independent in the community.”

Visual Effects Of E-Cigarette Use Carry Over To Regular Smokers

April Flowers for redOrbit.com – Your Universe Online

If you are a smoker, you know that watching someone else light up can give you the urge to smoke as well. Smoking seems to be as much a psychological and social addiction as it is a physical one. How do electronic cigarettes (e-cigarettes) fit into this pattern?

In the first study to investigate the behavioral effects of exposure to e-cigarettes in a controlled setting, a team of researchers from the University of Chicago has found that observing e-cigarette use increases the urge to smoke in young adult smokers who use regular, combustible cigarettes. The findings, published online in Tobacco Control, reveal that the elevated desire to smoke is just as intense when observing e-cigarette use as when observing combustible cigarette use.

“E-cigarette use has increased dramatically over the past few years, so observations and passive exposure will no doubt increase as well,” said Andrea King, PhD, professor of psychiatry and behavioral neuroscience at the University of Chicago, in a recent statement. “It’s important to note that there could be effects of being in the company of an e-cigarette user, particularly for young smokers. For example, it’s possible that seeing e-cigarette use may promote more smoking behavior and less quitting.”

Electronic cigarettes work by delivering nicotine via a heated solution of compounds and flavorings. The resulting vapor, which closely resembles the smoke of regular cigarettes, is inhaled by users. Previous studies have examined the health effects of e-cigarette vapor; until now, however, none had investigated the visual effects.

King’s team recruited 60 young adult smokers who were told that their responses to a variety of societal interactions were being assessed. Each participant was paired with an actor who would smoke either a regular or e-cigarette during a conversation. The participants’ urge to smoke was measured at multiple points before and after this interaction.

Participants’ desire to smoke both regular and e-cigarettes increased significantly. The team found that the urge to smoke a regular combustible cigarette was just as strong after observing e-cigarette use as after observing combustible use. The desire to smoke an e-cigarette, on the other hand, did not increase after observing regular cigarette use. As a control mimicking hand-to-mouth behavior, the actors also drank from a bottle of water while conversing with the participants; in this scenario, no discernible increase in desire for either regular or e-cigarette use was found.

“Whether participants were exposed to someone smoking a combustible or an e-cigarette, the urge to smoke a combustible cigarette was just as high in either condition,” King said. “We know from past research that seeing regular cigarette use is a potent cue for someone to want to smoke. We did not know if seeing e-cigarette use would produce the same effect. But that is exactly what we found. When we re-tested participants 20 minutes after exposure, the desire to smoke remained elevated.”

The researchers say that, given rising e-cigarette sales nationwide, further study is needed – both of the health ramifications for users and of the passive, secondary effects on observers.

“This study was our first investigation, and there are still many unanswered questions. We don’t know about the effects on a non-smoker or a person who has quit smoking or if responses are different for the various e-cigarette brands,” she said. “But if the results do generalize and we show this in other groups, it’s important to consider policy going forward in terms of reducing harm for both users and observers of e-cigarettes.”

Using Wikipedia To Self-Diagnose Medical Conditions Is Apparently Not A Good Idea

redOrbit Staff & Wire Reports – Your Universe Online
Wikipedia may be the most popular general reference website on the Internet, but it is far from the most accurate, as evidenced by a new study that discovered inaccuracies in 90 percent of the health-related entries it examined.
In the study, lead investigator Dr. Robert Hasty of Campbell University and his colleagues reviewed the collaboratively edited, free access online encyclopedia’s articles on 10 of the most costly medical conditions, including coronary artery disease, lung cancer, osteoarthritis, chronic obstructive pulmonary disease, hypertension, and diabetes mellitus.
The study authors, whose work appears in a recent edition of the Journal of the American Osteopathic Association, randomly assigned a pair of investigators to independently review each article, identifying implications and assertions passed off as fact in those entries. The reviewers then conducted a literature search to find out whether or not there was evidence to support those claims.
According to Alice Philipson of The Telegraph, the researchers found multiple articles that contained factual errors, largely because of the fact that anyone can create and edit entries. For example, the hypertension article claims that for the condition to be correctly diagnosed, high blood pressure readings have to be obtained on three separate occasions – erroneous information that could delay treatment and put patients at risk.
“While Wikipedia is a convenient tool for conducting research, from a public health standpoint patients should not use it as a primary resource because those articles do not go through the same peer-review process as medical journals,” Dr. Hasty, whose team reviewed the 10 Wikipedia articles on April 25, 2012 and found misinformation in all but one of them, told BBC News health reporter Pippa Stephens on Tuesday.
Wikipedia, which launched in 2001, contains over 31 million entries (including at least 20,000 health-related entries) in 285 languages, according to The Daily Mail’s Sophie Freeman. The study authors added that Wikipedia is the sixth most visited website in terms of global traffic, and that between 47 and 70 percent of physicians and medical students have admitted to using it as a reference.
They also note that Wikipedia “has several mechanisms in place to deal with unverifiable information and vandalism,” and that “most instances of vandalism only exist for a few days after being identified, with half of the corrections being posted less than 3 minutes after being identified.” In fact, they cite research which found that corrections were made almost immediately in 42 percent of those incidents.
Furthermore, they pointed out several limitations of their study, including that it did not address errors of omission – they did not check whether the entries were missing essential information about a topic. They also said their findings would have been stronger had more than two reviewers analyzed each article, and that using physicians-in-training rather than content experts as reviewers could have introduced bias.
Even so, Dr. Hasty and his co-authors cautioned that “health care professionals, trainees, and patients should use caution when using Wikipedia to answer questions regarding patient care. Our findings reinforce the idea that physicians and medical students who currently use Wikipedia as a medical reference should be discouraged from doing so because of the potential for errors.”
Stevie Benton of Wikimedia UK told Stephens that there were a “number of initiatives” being enacted to help improve the quality of the articles, “especially in relation to health and medicine.” He said that the online encyclopedia was teaming up with Cancer Research UK, having all cancer-related articles reviewed by clinical researchers and making sure that each of them are accurate and contain the latest information on the disease.
“However, it is crucial that anybody with concerns over their health contacts their GP as a first point of call,” he added. “Wikipedia, like any encyclopedia, should not take the place of a qualified medical practitioner.”

The Debate Continues: New Study Claims Diet Drinks May Actually Help With Weight Loss

Brett Smith for redOrbit.com – Your Universe Online
Critics of so-called ‘diet’ beverages have called them everything from slightly misleading products to liquid poison, but a new study published in the journal Obesity has found that drinking diet beverages can help a person lose weight – more so, in fact, than drinking water.
“This study clearly demonstrates that diet beverages can in fact help people lose weight, directly countering myths in recent years that suggest the opposite effect – weight gain,” said study author James O. Hill, executive director of the University of Colorado’s Anschutz Health and Wellness Center.
“In fact, those who drank diet beverages lost more weight and reported feeling significantly less hungry than those who drank water alone,” Hill added. “This reinforces that if you’re trying to shed pounds, you can enjoy diet beverages.”
The 12-week study, which included more than 300 participants, directly compared the impact of either water or diet beverages on weight loss within the context of a behavioral weight loss regimen. Study volunteers were randomly put into one of two groups: individuals who were able to drink diet beverages, including diet soft drinks and flavored water, or a control group that could drink only water. Aside from beverage options, both groups followed a similar diet and exercise program for the length of the study.
The study team found that volunteers in the diet beverages group lost an average of 13 pounds – 44 percent more than the control group. Participants in the diet beverage group also reported feeling more satiated, showed cholesterol-related improvements and had lower levels of triglycerides. Both groups had smaller waist sizes and lower blood pressure after the 12-week program.
Nearly two-thirds of the diet beverage group lost at least five percent of their body weight, as opposed to 43 percent of the control group. The researchers noted that losing just five percent of body weight has been linked to a lower risk of heart disease, high blood pressure and type 2 diabetes.
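The percentages quoted above imply some simple arithmetic, sketched below; the 180-pound body weight is an assumed example, not a figure from the study:

```python
# Arithmetic implied by the article's figures (illustrative only):
# if the diet-beverage group's 13-pound average loss was 44 percent
# more than the control group's, the water-only group lost about 9 lbs.
diet_group_loss = 13.0                 # pounds, from the article
control_loss = diet_group_loss / 1.44  # ~9 pounds
print(round(control_loss, 1))          # → 9.0

# The "five percent of body weight" threshold for an assumed
# 180-pound person works out to 9 pounds.
print(180 * 0.05)                      # → 9.0
```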
“There’s so much misinformation about diet beverages that isn’t based on studies designed to test cause and effect, especially on the Internet,” said study author John C. Peters, the chief strategy officer of the CU Health and Wellness Center. “This research allows dieters to feel confident that low- and no-calorie sweetened beverages can play an important and helpful role as part of an effective and comprehensive weight loss strategy.”
The results of the new study contradict a study published last year in a Cell Press journal that concluded diet beverages and other non-caloric, artificially sweetened foods and drinks do not help with weight loss.
The study reviewed data from several recent studies and found a greater risk of type 2 diabetes, high blood pressure, heart disease and metabolic syndrome as a result of regularly consuming artificially sweetened beverages. Some of the studies in the review found that individuals who consume artificially sweetened beverages had twice the risk of developing metabolic syndrome compared to those who avoided them.
Another recent study also found troublingly high levels of a carcinogenic brown food coloring in Pepsi One, a diet soda.

Eco-Atkins Diet May Help Lower Heart Disease Risk As Well As Promote Weight Loss

Brett Smith for redOrbit.com – Your Universe Online

Most low-carbohydrate diets geared toward weight loss emphasize eating chicken or beef, but a new study has shown that a low-carb diet without animal proteins can not only lead to weight loss – it can also reduce the risk of heart disease by 10 percent over the course of a decade.

The study, published in BMJ Open, looked at the impact of the diet often referred to as Eco-Atkins on cardiovascular risk factors and body weight. The researchers noted that more conventional low-carb diets include animal fats, which can raise cholesterol to unhealthy levels.

“We killed two birds with one stone – or, rather, with one diet,” said study author Dr. David Jenkins, director of the Clinical Nutrition and Risk Modification Centre of St. Michael’s Hospital in Toronto, in a recent statement. “We designed a diet that combined both vegan and low-carb elements to get the weight loss and cholesterol-lowering benefits of both.”

The study team enrolled 39 overweight men and postmenopausal women in either the Eco-Atkins diet or a conventional low-carb diet for six months. Volunteers received menu plans specifying food items and amounts. Rather than prescribing fixed meals, the menus served as a reference guide, and volunteers were given a list of acceptable food alternatives. With this list of interchangeable items, volunteers could adapt the diet to their own tastes – which helped motivate adherence, the study team said.

Volunteers were asked to eat only 60 percent of their calculated caloric needs – the quantity of calories needed to sustain their current weight.

Eco-Atkins volunteers tried to get 26 percent of their calories from carbohydrates, 31 percent from proteins and 43 percent from fat such as vegetable oils and nuts. This group ate high-fiber foods such as oats and barley and low-starch vegetables such as okra and eggplant for their carbohydrates. Proteins came from gluten, soy, nuts and cereals.
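As an illustration of how those percentages translate into a daily menu, the sketch below assumes a hypothetical maintenance requirement of 2,000 kcal per day and the standard 4/4/9 kcal-per-gram values for carbohydrate, protein and fat (neither figure comes from the study itself):

```python
# Illustrative Eco-Atkins targets for an assumed 2,000 kcal/day
# maintenance requirement; participants ate 60% of their needs.
maintenance_kcal = 2000
intake_kcal = 0.60 * maintenance_kcal   # 1,200 kcal/day

macro_shares = {"carbohydrate": 0.26, "protein": 0.31, "fat": 0.43}
kcal_per_gram = {"carbohydrate": 4, "protein": 4, "fat": 9}

for macro, share in macro_shares.items():
    kcal = share * intake_kcal
    grams = kcal / kcal_per_gram[macro]
    print(f"{macro}: {kcal:.0f} kcal ≈ {grams:.0f} g")
```

For these assumptions the daily targets come out to roughly 78 g of carbohydrate, 93 g of protein and 57 g of fat.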

All 23 participants who completed the six-month study lost weight. However, participants on the vegan diet cut their cholesterol levels by 10 percent and lost an average of four more pounds than those on the conventional diet.

“We could expect similar results in the real world because study participants selected their own diets and were able to adjust to their needs and preferences,” Jenkins said.

Controlling Fruit Fly Movements With A Mind-Altering Device

Alan McStravick for www.redorbit.com – Your Universe Online

A joint collaboration between the Vienna University of Technology and US researchers has resulted in a novel technique to control Drosophila melanogaster, perhaps better known as the fruit fly, via thermogenetic means. The control the researchers exert can ultimately be analyzed at the neural level within the brains of the insects.

Much of the work was conducted at the Information Management and Preservation Lab within the Department of Software Technology and Interactive Systems at VUT by Andrew Straw and his team. In the course of their study, they developed FlyMAD (the Fly Mind Altering Device), which targets either light or heat at a specific body region of a fly in motion, triggering a response. FlyMAD has allowed Straw to zero in on two specific neuronal cell types involved in the courtship behavior of the fruit fly, because it provides far better temporal resolution than previous techniques used in this field.

Until FlyMAD, it was impossible to control the activity of specific neurons in moving flies – unfortunate, because the fruit fly is an ideal experimental system for analyzing circuit function in the brain. For FlyMAD to work, the team used genetically modified, temperature-sensitive flies.

The system consists essentially of an enclosed box in which the flies are housed. A video camera tracks several flies at once, capturing their motion. The flies are then subjected to targeted irradiation that allows the researchers to alter neural pathways in the brain of the fly.

The infrared light used by FlyMAD raises the temperature of the targeted body region of the fly to 30 degrees Celsius (approximately 86 degrees Fahrenheit), inducing a change of behavior not observed in animals kept at the control temperature. This behavior (and the neuronal processes behind it), though occurring in mere fractions of a second, can be observed and recorded by the FlyMAD device. This targeting approach is one of the aspects that makes FlyMAD unique; comparable optogenetic methods had previously been restricted largely to research involving mice.

As noted above, Straw and colleagues were primarily interested in the courtship behaviors of the fruit fly. Once the efficacy of FlyMAD was established – they were able to control flies and make them “moonwalk” – the team turned their focus to specific neurons previously linked to courtship behaviors, particularly the courtship song.

The team’s experiment explored how one type of neuron was tied almost exclusively to the act of courtship, while the other was important for the act of singing. To determine this, the team focused a laser beam at a specific region on male fruit flies, causing them to attempt to mate with a ball of wax while vibrating their wings to produce their mating song.

While this experimentation relied on either light or heat, Straw claims future studies will combine the two factors. This, he says, would allow the activation or repression of different genetic elements in one fly.

“FlyMAD offers the fantastic opportunity to address many of our questions,” stated Straw. “We could, for example, analyze how single neurons function in a cascade within the neuronal circuit.”

The results of this study, which could potentially yield new insight into the mammalian brain, were published online on May 25 in the journal Nature Methods.

How Trigger Point Injections Help Fibromyalgia

Fibromyalgia is a condition that causes chronic muscle pain and inflammation. This pain can be debilitating, as it can range from mild to moderate to severe. Those who have fibromyalgia experience a variety of symptoms and related health issues.

Symptoms of fibromyalgia include pain in the neck, shoulders, hips, and other areas.  Painful muscles may prevent people from regular activity and can cause depression and anxiety from increased stress levels.

People with fibromyalgia have a great deal of chronic pain that can persist for months or years.  Because of this chronic pain, people tire easily and may have a hard time getting through the day’s activities without a nap or several rest periods.

This pain can lead to depression, isolation, and loneliness, as people may feel too exhausted to go out, to work, or to keep up with social events and other engagements.  Lack of sleep due to pain is a large reason why many people with fibromyalgia have fatigue and low energy.

There are many alternative therapies for people with fibromyalgia.  There are drug treatments, but many people do not want to take pharmaceutical drugs, or find that these drugs do not improve their condition significantly.  Rather than take drugs, people are seeking out alternative methods of treatment.

Many people are turning to massage, acupuncture, and trigger point therapies in order to ease their pain.  Massage is often done by a professional in a relaxing setting that not only eases muscle tension, but also reduces stress and offers relaxation.  At home massage can also be effective in alleviating muscle stiffness and pain.

Acupuncture is a traditional Chinese medicine treatment that involves relaxing muscles and increasing the flow of qi, the body’s energy.  Acupuncture has been studied for its ability to reduce stress and to ease the pain of fibromyalgia.

Trigger point therapies are designed to treat specifically the painful trigger points associated with fibromyalgia.  Low level laser therapy is a non-invasive way to apply energy to deep muscle tissue and painful areas.  Stretching with heat is another way to target tight muscles and ease pain: heat can be applied to a specific area before it is stretched, so that muscles are loose and more flexible.  Manual therapies, which include myofascial release and trigger point massage, can be conducted by massage therapists or physical therapists.  Lidoderm patches contain lidocaine and are available by prescription.

Trigger point injections (TPI) are done by introducing a needle carrying an anesthetic (such as lidocaine) or a corticosteroid into the center of the trigger point in order to release the tightness and pain.  After the injection, pain is relieved because the medication numbs the trigger point and interrupts its pain signals.  Injections are given in a doctor’s office and are a quick and relatively painless process.  This treatment can be done on several locations of the body during one visit.

How trigger point injections help fibromyalgia

TPI is used to treat different muscle groups including arms, legs, lower back and neck.  TPI is often used to treat fibromyalgia and tension headaches.  It is also used to help treat myofascial pain syndrome.

Not all trigger points need to be injected, as some will respond instead to physical therapy, massage, or stretching.  But for chronic trigger points, TPI is an option that can alleviate pain and pressure.

Trigger point injections are ideal for patients who have active trigger points and are suffering from fibromyalgia or other muscle pains.

There are many ways to treat fibromyalgia and ease pain, and pharmaceutical drugs are only one way.  People can try alternative therapies such as massage, acupuncture, exercise, trigger point therapies, and other methods.  It is important for people with fibromyalgia to consider alternative therapies, because different treatments have proven to be more effective for different people.

Treating fibromyalgia is often a matter of making lifestyle changes, including setting a more regular schedule for the day and not over-planning or over-committing.  It is important for people to manage their stress levels and to have methods in place to reduce stress and anxiety.  Often, people will take a walk, listen to music, or watch a movie to reduce their stress levels.  This can be very effective in managing chronic pain and in elevating mood and spirits.

Treating fibromyalgia through therapies such as trigger point therapy is also effective for many people.  This is a minimally invasive method that can assist in pain management.

Trying different treatment methods is important for those with fibromyalgia, as many different treatments will prove to be effective for different people.

Apple Calls For Samsung Product Ban And Retrial In California Lawsuit

Peter Suciu for redOrbit.com – Your Universe Online

On Friday Apple fired another salvo in its continuing court battle with South Korean consumer electronics giant Samsung. The California-based company filed a motion to ban Samsung devices that “infringe” on its patents, but Apple also requested a total retrial of the case at the U.S. district court in San Jose.

Even as Samsung and Apple had reached a level of détente, in March it was reported that Apple sought about $2 billion in damages from Samsung for selling handsets and tablet devices that Apple claimed violated five of its mobile software patents. Samsung then fired back that Apple had violated two of its patents.

Apple Insider, which broke the news of the latest court filings, reported that Apple is requesting a complete retrial of the damages case, which ended in a $119.6 million award – far less than the $2.2 billion that Apple had sought.

Apple Insider also reported that Apple is asserting that any continued sale of Samsung’s infringing products could cause Apple “irreparable harm that cannot be remedied with monetary damages.” However, Judge Lucy Koh, who presides over Apple v Samsung, had denied an injunction in the first California case, noting that Apple’s evidence did not sufficiently prove irreparable harm.

According to Re/Code‘s Dawn Chmielewski, the Cupertino technology company had won its $119.6 million verdict against Samsung after a San Jose jury found that the South Korean tech maker had infringed on Apple’s patents for “data detection/linking,” a feature that could dial a phone number found in an email – and for the “slide to unlock” feature that allowed users to gain access to the device.

This could truly be a case of giants battling giants for the top spot. As redOrbit reported on Friday, Counterpoint’s Market Monitor quarterly tracking program found Apple and Samsung to be the top smartphone brands. The two companies accounted for two-thirds of all Q1 smartphone shipments, and a combined 78 percent of all LTE smartphone shipments.

In other words it could be hard to see irreparable harm for either company. As Venture Beat reported, “there’s little chance Apple’s request for a retrial would be granted. Apple claims a retrial is warranted based on prejudicial claims by Samsung to the jury, and it requests the chance to prove willful infringement of more of its patents by Samsung.”

Samsung already fired back, with spokesperson Adam Yates telling Bloomberg News, “After the jury rejected Apple’s grossly exaggerated damages claim, Apple is once again leaning on the court to push other smartphones out of the market. If granted, this would stifle fair competition and limit choice for American consumers.”

Earlier this month, Apple and Google had agreed to drop all patent-related lawsuits against one another. In a joint statement released just over a week ago the companies sought to bring an end to litigation that began when Motorola Mobility first accused the iOS maker of patent infringement in 2010.

Apple countersued and so began a nearly constant back and forth, one that intensified after Google purchased Motorola Mobility for $12.5 billion in 2012. With that case resolved, perhaps Apple is now poised to focus its attention on Samsung again.

The History Of Drone Technology

Drones, also known as unmanned aerial vehicles (UAV), are pilotless aircraft capable of flight either by remote control or through the use of on-board computers. Other names for these types of aircraft are remotely piloted vehicle (RPV), remotely piloted aircraft (RPA), and remotely operated aircraft (ROA).

Drones are commonly used by the military, but are also being implemented in search and rescue operations and being utilized in other civil applications, such as policing and firefighting. The technology is also allowing for hobbyists and other enthusiasts to become avid drone operators, albeit on a relatively smaller scale.

A drone is capable of controlled, sustained level flight and is powered by a jet, reciprocating, or electric engine. UAVs differ from cruise missiles in that drones are recovered after a mission is complete while a cruise missile impacts its target. Military UAVs may carry and fire munitions, while a cruise missile is a munition.

The concept of unmanned aerial flight is not a new one. The idea first came to light on August 22, 1849, when Austria attacked the Italian city of Venice with unmanned balloons loaded with explosives. Some balloons were launched from the Austrian ship Vulcano. While some balloons reached their intended targets, most were caught in shifting winds and blown back over Austrian lines.

The system was under development for months and an account of the country’s plan appeared in an article in a Vienna newspaper at the time:

“Venice is to be bombarded by balloons, as the lagunes prevent the approaching of artillery. Five balloons, each twenty-three feet in diameter, are in construction at Treviso. In a favorable wind the balloons will be launched and directed as near to Venice as possible, and on their being brought to vertical positions over the town, they will be fired by electro magnetism by means of a long isolated copper wire with a large galvanic battery placed on a building. The bomb falls perpendicularly, and explodes on reaching the ground.”

While these early drones do not meet today’s definition of a UAV, the concept proved durable: once winged aircraft had been invented, it would soon be implemented once again.

WORLD WAR I

The first pilotless aircraft were developed during and shortly after World War I. The first was the “Aerial Target,” developed in 1916. It was intended to take down Zeppelins, but never flew. Shortly afterward, the Hewitt-Sperry Automatic Airplane (the “flying bomb”) made its maiden flight, demonstrating the concept of unmanned aircraft. This UAV was intended for use as an aerial torpedo, an early version of modern cruise missiles. Control of these aircraft was achieved using gyroscopes.

In November 1917, the Automatic Airplane was demonstrated for the US Army. Upon the success of this demonstration, the Army commissioned a project to build an aerial torpedo, which became known as the Kettering Bug and flew in 1918. While the technology was a success, the war ended before the Bug could be fully developed and deployed in combat.

Several successors were developed during the period after WWI and prior to WWII. These included the Larynx, tested by the Royal Navy between 1927 and 1929; the radio-controlled Fairey “Queen” developed by the British in 1931; and the British follow-up UAV “DH.82B Queen Bee” in 1935. Also following on the earlier work by the Army, the US Navy continued to advance UAV technology, experimenting with radio-controlled aircraft. In 1936, the term “drone” was first coined, as the head of the Navy’s research group used it to describe radio-controlled aerial targets.

During the technology rush of WWII, drones were used both as training tools for antiaircraft gunners and for aerial attack missions. Nazi Germany also produced and used various UAVs during the course of the war. After the war, jet engines were applied to drones, the first being the Teledyne Ryan Firebee I of 1951. By 1955, Beechcraft had developed its Model 1001 for the US Navy. These UAVs were little more than remote-controlled airplanes until the Vietnam era.

MODERN ERA

The birth of US UAVs began in 1959, when the US Air Force, concerned about losing pilots over hostile territory, began planning for unmanned flights. After the Soviet Union shot down a secret U-2 aircraft in 1960, the highly classified UAV program was launched under the code name “Red Wagon.” Modern-era UAVs got their first use during the August 2 and August 4, 1964 clashes in the Tonkin Gulf between the US and North Vietnamese navies, and they went on to see heavy use throughout the Vietnam War.

After Chinese photographs surfaced of downed US unmanned aircraft during and after the Vietnam War, the official US Air Force response was “no comment.” By 1973, however, the US military officially confirmed that it had been utilizing UAV technology in Vietnam, stating that during the war more than 3,435 UAV missions were flown, of which about 554 were lost in combat.
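Those mission figures imply a striking attrition rate. As a back-of-the-envelope check, this short Python sketch uses only the two numbers quoted above (3,435 missions flown, about 554 UAVs lost):

```python
# Vietnam-era UAV figures as quoted above.
missions_flown = 3435
uavs_lost = 554

# Implied combat loss rate: lost airframes per mission flown.
loss_rate = uavs_lost / missions_flown

print(f"{loss_rate:.1%}")  # roughly 16%, about one loss per six missions
```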

During the 1973 Yom Kippur War, Israel developed the first UAV with real-time surveillance, after Soviet-made surface-to-air missiles used by Egypt and Syria dealt heavy damage to Israel’s fighter jets. The images and radar decoying provided by these UAVs helped Israel to neutralize Syria’s air defenses at the start of the 1982 Lebanon War, resulting in no pilots lost. By 1987, Israel had developed proof-of-concept capabilities in tailless, stealth-based UAVs with three-dimensional thrust vectoring flight control and jet steering for the first time.

Interest in UAV technology grew during the 1980s and 1990s, and UAVs were used during the Persian Gulf War in 1991 as they became cheaper and more capable fighting machines. While most drones of the earlier years were primarily surveillance aircraft, some carried munitions. The General Atomics MQ-1 Predator, which carried the AGM-114 Hellfire air-to-ground missile, was known as an unmanned combat aerial vehicle (UCAV).

POST 9/11

While most UAVs were utilized by the military, the technology was commissioned by the CIA after the September 11, 2001 terrorist attacks. Intelligence gathering operations began in 2004, with CIA-operated UAVs primarily flown over Afghanistan, Pakistan, Yemen, and Somalia. The CIA’s first UAV program was called the Eagle Program.

As of 2008, the USAF employed 5,331 UAVs, twice the number of its manned planes. Of these, the Predators have been the most notable. Unlike other UAVs, the Predator was armed with Hellfire missiles. Predators were used during the hunt for Osama Bin Laden and have demonstrated the capability of pointing lasers at targets for pinpoint accuracy. The overall success of the Predator missions is apparent: from June 2005 to June 2006 alone, Predators carried out 2,073 successful missions in 242 separate raids.

While the Predator is remotely operated via satellites from more than 7,500 miles away, the Global Hawk operates virtually autonomously. Once the user pushes a button to command takeoff, the only interaction between the ground and the UAV is directional instructions via GPS. A Global Hawk can take off from San Francisco, fly across the US, and map out the entire state of Maine before having to return.

In February 2013, it was reported that UAVs were used by at least 50 countries, several of which, including Iran, Israel and China, have built their own.

Recently, UAVs have become increasingly popular in the commercial and private market. Amazon.com, the largest online retailer, said in December 2013 that it was developing drone technology to one day deliver packages autonomously.

Drones are also being developed for hobbyists and other enthusiasts. In reality, these types of aircraft have been common since the 1930s, when Reginald Denny mass-produced the first radio-controlled aircraft for the hobby market. While RC airplanes remained popular through the decades, recent technology is now making them smaller, more powerful and more useful – some adding cameras and GPS trackers, as well as making them more affordable for everyday enthusiasts.

Image Credit: Thinkstock.com

Emergency Rooms See A Rise In Non-Prescription Xanax Visits

April Flowers for redOrbit.com – Your Universe Online

The Substance Abuse and Mental Health Services Administration (SAMHSA) has released a report showing that abuse of Xanax, a brand name for alprazolam, is rising in the US, as is the number of emergency room visits associated with the drug.

“We’re seeing growth in the number of people who are getting into trouble with these drugs,” says Pete Delany, director of SAMHSA’s Center for Behavioral Health Statistics and Quality, according to USA Today. “Patients really need to be educated that if these drugs are misused, they can be really, really dangerous.”

Between 2005 and 2010, emergency department visits following non-medical use of Xanax more than doubled, from 57,419 to 124,902. In 2011, this number remained stable at 129,744 visits.

Xanax is a Schedule IV drug, according to the Food and Drug Administration (FDA), putting it low on what the government considers to be “potential for abuse.” The numbers on the SAMHSA report, which are based on data from SAMHSA’s 2011 Drug Abuse Warning Network (DAWN), seem to disagree.

According to the researchers, using alprazolam in a non-medical manner can result in physical dependence, causing withdrawal symptoms such as tremors and seizures. Combining it with central nervous system depressants such as alcohol or narcotic pain relievers can dangerously enhance the effects of those drugs and result in depressed breathing and heart rate. At worst, the combination can result in unconsciousness and death.

A massive 81 percent of those abusing Xanax used it in combination with some other prescription drug or alcohol. Close to two-thirds of this group used alprazolam with another prescription drug, and one-third of these used it with a prescription pain killer such as oxycodone.

For comparison, in 2011 emergency departments saw 1,200,000 visits related to the non-medical use of prescription drugs. Around 10 percent of those were for Xanax alone.

Also in 2011, Xanax was the single most commonly prescribed psychiatric medicine. In 2012, the drug was the 13th most commonly sold medication overall. Alprazolam is also sold as Xanax XR and Niravam.

“When used as directed, alprazolam is safe and effective, but misuse can result in serious health consequences,” said SAMHSA Administrator Pamela S. Hyde in a recent statement. “This report highlights the need to educate people about the dangers of misusing or sharing prescription medications and the importance of properly disposing of unused medication.”

The Drug Enforcement Administration (DEA) has created a National Prescription Drug Take-Back Day initiative. SAMHSA supports this program, which provides a safe, convenient, and responsible way for people to dispose of both prescription and over-the-counter drugs at designated locations. The program also provides public education on drug misuse and abuse.

Life Expectancy For Those With Mental Illness Is Lower Than That Of Heavy Smokers

April Flowers for redOrbit.com – Your Universe Online

A great deal of research and public messaging has gone into understanding exactly how smoking changes your life expectancy. The outcomes from heavy smoking are bleak, and most people are well aware of these facts. Recently, a team of scientists from Oxford University decided to compare the mortality rates associated with heavy smoking with those of some of the more common mental illnesses, and the results were surprising.

On average, the study found that mental illness reduces a person’s life expectancy by 10 to 20 years. This is equivalent to, or worse than, the loss of years attributed to heavy smoking. The authors of the study, published in World Psychiatry, hope to galvanize governments and social services into giving mental health the same public health priority that they give smoking.

The researchers say that an estimated 25 percent of people in the UK will experience some sort of mental health issue in the course of a year. The number of smokers is similar, with 21 percent of men and 19 percent of women identified as smokers.

To conduct their analysis, the research team collected 20 of the best systematic reviews of clinical studies reporting on mortality rates for a wide range of diagnoses: mental health problems, substance and alcohol abuse, dementia, autistic spectrum disorders, learning disability and childhood behavioral disorders. In total, these studies included over 1.7 million individuals and over 250,000 deaths.

A second collection of studies and reviews was examined, this time covering those reporting life expectancy and risk of dying by suicide. The results were then compared to the mortality risks of heavy smoking.

The researchers found that, on average, heavy smokers lose eight to 10 years of life expectancy.

The reduction in average life expectancy for mental illnesses varies. For example, people with bipolar disorder lose between nine and 20 years. Schizophrenia sufferers see a reduction of 10 to 20 years, drug and alcohol abuse garners between nine and 24 years, and sufferers of recurrent depression lose seven to 11 years.

Every diagnosis covered in the studies showed an increase in mortality risk. Even though the risk varied greatly between diagnoses, many of them were equal to or greater than that of heavy smoking.

According to a statement by Dr Seena Fazel of the Department of Psychiatry at Oxford University: “We found that many mental health diagnoses are associated with a drop in life expectancy as great as that associated with smoking 20 or more cigarettes a day.”

“There are likely to be many reasons for this. High-risk behaviors are common in psychiatric patients, especially drug and alcohol abuse, and they are more likely to die by suicide. The stigma surrounding mental health may mean people aren’t treated as well for physical health problems when they do see a doctor.”

Fazel, who is also an honorary consultant in forensic psychiatry, believes that one challenge is the tendency of doctors and patients to separate mental illness and physical illness. “Many causes of mental health problems also have physical consequences, and mental illness worsens the prognosis of a range of physical illnesses, especially heart disease, diabetes and cancer. Unfortunately, people with serious mental illnesses may not access healthcare effectively,” says Dr Fazel.

Dr Fazel is certain, however, that “All of this can be changed. There are effective drug and psychological treatments for mental health problems. We can improve mental health and social care provision. That means making sure people have straightforward access to health care and appropriate jobs and meaningful daytime activities. It’ll be challenging, but it can be done.”

He notes, “Beyond that, psychiatrists have a particular responsibility as doctors to ensure that the physical health of their patients is not neglected. De-medicalization of psychiatric services mitigates against that.”

Fazel feels that his results should be a wakeup call for governments and clinicians. He adds, “What we do need is for researchers, care providers and governments to make mental health a much higher priority for research and innovation. Smoking is recognized as a huge public health problem. There are effective ways to target smoking, and with political will and funding, rates of smoking-related deaths have started to decline. We now need a similar effort in mental health.”

Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, which funded the study, said, “People with mental health problems are among the most vulnerable in society. This work emphasizes how crucial it is that they have access to appropriate healthcare and advice, which is not always the case. We now have strong evidence that mental illness is just as threatening to life expectancy as other public health threats such as smoking.”

Parasite-Blocking Antigen Discovery May Lead To Malaria Vaccine

Lawrence LeBlond for redOrbit.com – Your Universe Online

Malaria remains a dangerous disease in the developing world, killing more than 627,000 people a year, according to the World Health Organization (WHO). Because most deaths associated with malaria occur in children in Sub-Saharan Africa, finding a vaccine against the deadly parasite is all the more important.

New research, by a team from Rhode Island Hospital (RIH), has uncovered a protein, or antigen, that is essential for malaria-causing parasites to escape from red blood cells. This protein also generates antibodies that can hinder the ability of malaria parasites to multiply, which may protect against severe malaria infection.

This antigen, known as PfSEA-1, was associated with reduced parasite levels among children and adults in malaria-endemic areas. An investigational vaccine was developed using the antigen, and mice exposed to it experienced lower malaria parasite levels.

This discovery could be a critical addition to the limited number of antigens currently being used in candidate malaria vaccines. The study findings, published in the journal Science, result from a collaboration of scientists from the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, and NIAID grantees.

“This research really began in 2002 when our colleagues from the National Institutes of Health, led by Patrick Duffy and Michal Fried, enrolled a birth cohort of children in Tanzania,” said lead author Jonathan Kurtis, MD, PhD, director of the Center for International Health Research at RIH. “Six years ago we began using these samples to identify novel vaccine candidates and now it’s coming full circle. While a portion of this research was conducted in mice, the actual vaccine discovery experiments were performed using human samples, thus we believe the results will effectively translate to humans.”

In the mouse experiments, the team conducted five independent vaccine trials in which mice were vaccinated with the novel PfSEA-1 (Schizont Egress Antigen-1). All mice were then injected with malaria parasites. In all five experiments, the vaccinated mice had lower levels of parasites and survived much longer than the unvaccinated group.

“When my post-doctoral fellow Dipak Raj discovered that antibodies to this protein, PfSEA-1, effectively trapped the malaria-causing parasite within the red blood cells, it was truly a moment of discovery,” Kurtis said. “Many researchers are trying to find ways to develop a malaria vaccine by preventing the parasite from entering the red blood cell, and here we found a way to block it from leaving the cell once it has entered. If it’s trapped in the red blood cell, it can’t go anywhere… it can’t do any further damage.”

Kurtis and colleagues then measured antibodies to PfSEA-1 in the entire Tanzanian birth cohort of 785 children. They found that among children with antibodies to PfSEA-1, there were zero cases of severe malaria.

The team then went back to a serum bank they had collected from 140 children in Kenya in 1997. They found that the children in this group who had antibodies to PfSEA-1 had 50 percent lower parasitemia during a high transmission season than those without the antibodies.

“Our findings support PfSEA-1 as a potential vaccine candidate, and we are confident that by partnering with our colleagues at the National Institutes of Health and other researchers focused on vaccines to prevent the parasites from entering red blood cells, we can approach the parasite from all angles, which could help us develop a truly effective vaccine to prevent this infectious disease that kills millions of children every year,” Kurtis said in a statement.

The researchers said the results were encouraging, but stressed that more research was required before human trials could begin.

“I am cautious. I’ve seen nothing so far in our data that would cause us to lose enthusiasm. However, it still needs to get through a monkey study and the next phase of human trials,” Kurtis told BBC News.

This study adds to the growing body of work being explored in the race to find a malaria vaccine.

The RTS,S vaccine, developed by GlaxoSmithKline, is one of the more advanced vaccine candidates being pushed for regulatory approval after Phase III clinical trials showed that it almost halved the number of malaria cases in young children and reduced the number of cases in infants by 25 percent.

“The identification of new targets on malaria parasites to support malaria vaccine development is a necessary and important endeavor,” Dr Ashley Birkett, director of the PATH Malaria Vaccine Initiative, said in a comment following the latest study.

“While these initial results are promising with respect to prevention of severe malaria, a lot more data would be needed before this could be considered a leading vaccine approach – either alone or in combination with other antigens,” he said in an interview with the BBC.

Groundbreaking Earth-To-Moon Broadband Uplink To Be Detailed At CLEO: 2014

redOrbit Staff & Wire Reports – Your Universe Online

Most of us have gotten used to being connected no matter where we go, but if people ever start living or working on the moon or some far-off asteroid, how will they check their email or post killer selfies on Facebook? Thankfully, researchers from the Massachusetts Institute of Technology (MIT) are close to a solution.

Working with NASA officials last fall, a team from MIT’s Lincoln Laboratory demonstrated for the first time a data communication technology that can provide people living in space with the same type of broadband connectivity that those of us on Earth enjoy on a daily basis. According to the researchers, the technology would allow them to transfer large amounts of data and even stream video in HD.

Details about the technology will be unveiled Monday, June 9 at the annual Conference on Lasers and Electro-Optics (CLEO) in San Jose, California. During their presentation, the team will also be providing the first comprehensive overview of the on-orbit performance of the Lunar Laser Communication Demonstration (LLCD).

The LLCD is a laser-based communication uplink system between the Earth and the moon, and last year, the MIT team made history by surpassing the previous record transmission speed by a factor of 4,800. During a 30-day mission that concluded last December, NASA reported that LLCD reached data download and upload speeds to the moon at 622 megabits per second (Mbps) and 20 Mbps, respectively.

“For example, LLCD demonstrated error-free communications during broad daylight, including operating when the moon was to within three degrees of the sun as seen from Earth,” the US space agency said. “LLCD also demonstrated error-free communications when the moon was low on the horizon, less than 4 degrees, as seen from the ground station, which also demonstrated that wind and atmospheric turbulence did not significantly impact the system. LLCD was even able to communicate through thin clouds, an unexpected bonus.”

According to Mark Stevens of the MIT Lincoln Laboratory, the CLEO 2014 presentation will mark the first time that the team demonstrates both the implementation overview, as well as the actual performance of the network. He added that the on-orbit performance was close to predictions, making them confident they have a good grasp of the underlying physics behind the technology.

“Communicating at high data rates from Earth to the moon with laser beams is challenging because of the 400,000-kilometer distance spreading out the light beam,” he said in an Optical Society statement. “It’s doubly difficult going through the atmosphere, because turbulence can bend light – causing rapid fading or dropouts of the signal at the receiver.”

The demonstration uses multiple techniques to overcome signal fading over such a great distance, making it possible to achieve error-free performance across a wide range of optically challenging atmospheric conditions, in both bright sunlight and darkness.

A New Mexico-based ground terminal uses four separate telescopes, each about six inches in diameter, to send the uplink signal to the moon. Each telescope is fed by its own laser transmitter, which sends data coded as invisible infrared light pulses. The four transmitters together radiate a total of 40 watts of power.

According to Stevens, multiple telescopes are used because each one transmits light through a different column of air, and each column experiences different atmospheric bending effects. This increases the chance that at least one of the laser beams will reach the receiver, which is mounted on the Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft currently orbiting the moon.

The receiver collects the light using a slightly narrower telescope, then focuses it into an optical fiber similar to those used in Earth-based fiber optic networks. Afterwards, the signal is amplified about 30,000 times, and the pulses of light are converted into electrical pulses by a photodetector.

Those pulses are then converted a second time into the data bit patterns that carry the actual message. Of the 40 watts sent by the transmitter, less than one-billionth of a watt actually reaches the satellite. Even so, that’s approximately 10 times the signal necessary to achieve error-free communication, according to Stevens.
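Those transmit and receive figures imply an enormous end-to-end loss, which is easiest to appreciate in decibels. The 40 watts, the one-billionth-of-a-watt received power, and the 10x margin are from the article; expressing them in dB is simply a convenient framing:

```python
import math

# Link-budget arithmetic from the article's figures: 40 W transmitted,
# under one-billionth of a watt received, and roughly 10x the power
# needed for error-free operation.

tx_watts = 40.0
rx_watts = 1e-9                  # upper bound on received power
required_watts = rx_watts / 10   # received is ~10x what's needed

# End-to-end loss across the 400,000 km path, in decibels:
loss_db = 10 * math.log10(tx_watts / rx_watts)

# Link margin: how much extra fading the link can absorb:
margin_db = 10 * math.log10(rx_watts / required_watts)

print(round(loss_db, 1))  # 106.0 dB of spreading/pointing/atmospheric loss
print(round(margin_db))   # 10 dB of margin
```

That roughly 10 dB margin is what lets the system ride out the turbulence-induced fading and thin clouds described above.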

The CLEO: 2014 presentation will also detail how the large margins in received signal level will permit the system to operate through partly transparent thin clouds in the Earth’s atmosphere. Stevens said that the team successfully “demonstrated tolerance to medium-size cloud attenuations, as well as large atmospheric-turbulence-induced signal power variations, or fading, allowing error-free performance even with very small signal margins.”

House Vote Limits NSA Data Collection, But Experts Say It Won’t Stop Government Snooping

Peter Suciu for redOrbit.com – Your Universe Online

On Thursday the United States House of Representatives voted overwhelmingly to limit the National Security Agency’s (NSA) mass collection of telephone records. In a 303 to 121 vote, the House approved scaled-back legislation that was seen as sending a clear message that both parties no longer favor the NSA’s power to collect bulk surveillance data.

Last year the House was much more divided than this year, but anger over the leaks from Edward Snowden helped bring the two sides together.

House Republicans, House Democrats and the White House worked out a deal that apparently satisfied few, but would nonetheless limit government snooping. The measure would end the NSA’s practice of gathering the bulk data, but would leave such records in the custody of telephone companies – and that data could be searched at the NSA’s request.

This so-called “USA Freedom Act” would also allow the NSA to collect an individual’s phone records if investigators could convince the secret Foreign Intelligence Surveillance Court that there is reasonable suspicion a person was involved with terrorism.

“This part of the legislation is something to think about,” said Jim Purtilo, associate professor in the computer science department at the University of Maryland. “I compare this with my students to the six degrees of separation from Kevin Bacon. How many degrees are you from a police suspicion? Most people don’t know how many degrees they are from someone under suspicion.”

Still, House leaders said the measure could reduce some government snooping.

“People are a lot more comfortable with a government that is not storing all this metadata,” said Speaker John A. Boehner of Ohio, as reported by the New York Times. He further praised the bill, which “makes it clear there will be no access to this data without a court decision and the standards for that decision are higher than they were.”

However, it is unlikely government surveillance will completely stop.

“It is going to continue,” Purtilo told redOrbit. “I can’t believe for a second that this is going to be the end of it. First of all, we got into this situation because officials creatively interpreted the rules so they could get as much data as they could.”

“The only middle ground that seems to have been reached is for the current House to pass a bill that says let the next Congress handle it,” added Purtilo. “At the same time, people are being told behind the scenes to creatively interpret the data.”

The question then becomes how concerned should individuals be about the snooping?

“This is part of the changing world,” telecommunications industry analyst Jeff Kagan told redOrbit. “There is really an avalanche of data that pours in daily. But this didn’t exist in the 1990s; it only started to pop up in recent years, and as that happened we started to talk about it as if it were an invasion of privacy.”

“It keeps getting worse as it shines a light on it but doesn’t stop it,” Kagan added. “Gathering data is getting easy. Today a cash register is a computer terminal and every sale is instantly registered. Bit by bit in every area they know more about us. It is a flood of information. Corporations are just gathering the data now, but at some point there could be technology that could allow them to use it. The government could also use it to keep an eye on everyone.”

The next question is when does this go too far?

“In many ways we’ve already crossed that line a long time ago,” said Kagan. “It is unstoppable and it is regrettable, but it is one of the downsides of the technological revolution. However, not everyone thinks there is a downside. Some love it, some hate it but we’re not going to stop it. Companies like they can market to people and the government can watch us.”

Of course it should be stressed that the government would argue it isn’t gathering the data for sinister reasons, but rather to keep the public safe. Just as there are degrees of how far the government is allowed, or should be allowed, to go, there are degrees of separation between those doing wrong and those who are completely innocent, the experts said.

“These officials doing this surveillance fought for this legislation for a reason,” said Purtilo. “The biggest reason is because it works. There are degrees of working, but the metadata is more than just a phone call. It actually paints a picture of someone, so it is extremely helpful for law enforcement.”

“That data isn’t just in a silo waiting to be checked for your connections to some nefarious organization. It is actually joined to other data that is out there,” added Purtilo. “The data creates a tapestry, and the threads that hold it together are the metadata. What people are arguing about is what is used from the tapestry. It paints a tremendous picture. It goes to the saying, ‘Tell me who your friends are and I will tell you who you are.’”

The House vote may also be telling.

“This had a pretty strong house vote,” Purtilo agreed. “This suggests to me there was a compromise in the works.”

One-third Of All Brain Aneurysms Rupture Regardless Of Size

[ Watch The Video: Microneurosurgical Clipping Of An Unruptured Intracranial Aneurysm ]

University of Helsinki

The lifetime risk of rupture of a brain aneurysm depends heavily on the patient’s overall load of risk factors. However, a recent study by researchers from the University of Helsinki and Helsinki University Central Hospital demonstrated that the size of an aneurysm has little bearing on its risk of rupture.

This is a unique study in that it monitored aneurysm patients over their entire lifetimes, whereas typical follow-up studies last only one to five years. The study is also exceptionally broad in scope; Dr. Seppo Juvela points out that the only other place where a study of similar scope has been conducted is Japan.

“It is unlikely that another similar, non-selected lifetime follow-up study on aneurysm patients will ever be conducted again,” he states.

Current care practices are based largely on the results of previous, shorter studies. Such studies have shown that the size of the aneurysm is the most significant factor predicting its risk for rupture. Consequently, small (<7mm) aneurysms have often been left untreated, even though such aneurysms have also been known to rupture and cause brain hemorrhages.

The new study established that approximately one third of all aneurysms and up to one fourth of small aneurysms will rupture during a patient’s lifetime. The risk of rupture is particularly high for female smokers with brain aneurysms of seven millimeters or more in diameter. What surprised the researchers most was that the size of an aneurysm had little impact on its risk for rupture, particularly for men, despite a previously presumed correlation. In addition, the risk of rupture among non-smoking men was exceptionally low.

“This is not to say that aneurysms in non-smoking men never rupture, but that the risk is much lower than we previously thought. This means treating every unruptured aneurysm may be unnecessary if one is discovered in a non-smoking man with low blood pressure,” Juvela clarifies.

But why have previous studies not reached these same results if they are so obvious?

“It is difficult to conduct reliable epidemiological research in brain aneurysms. The past 10 years have seen a distortion in the field due to a very limited group of researchers determining the direction for research. Now the situation is clearly changing, and clinically reasonable, population-based studies using non-selected data are on the rise again,” states Docent Miikka Korja of the HUCS neurosurgery clinic.

Finland has a strong tradition of studying the prevalence, risk factors and care of brain aneurysms, and the Helsinki University Central Hospital is one of the world’s leading units to provide treatment for brain aneurysms. Major studies in the field published by Finnish researchers include the world’s most extensive twin study on the hereditability of subarachnoid hemorrhage, the largest follow-up study on subarachnoid hemorrhages among diabetics, the most extensive study on the life expectancy of subarachnoid hemorrhage survivors and a study on the risk factors for subarachnoid hemorrhages using the most extensive population data.

How Does Common Obesity Gene Contribute To Weight Gain?

Columbia University Medical Center

Researchers have discovered how a gene commonly linked to obesity—FTO—contributes to weight gain. The study shows that variations in FTO indirectly affect the function of the primary cilium, a little-understood hair-like appendage on brain and other cells. Specific abnormalities of cilium molecules, in turn, increase body weight, in some instances, by affecting the function of receptors for leptin, a hormone that suppresses appetite. The findings, made in mice, suggest that it might be possible to modify obesity through interventions that alter the function of the cilium, according to scientists at Columbia University Medical Center (CUMC).

“If our findings are confirmed, they could explain how common genetic variants in the gene FTO affect human body weight and lead to obesity,” said study leader Rudolph L. Leibel, MD, the Christopher J. Murphy Memorial Professor of Diabetes Research, professor of pediatrics and medicine, and co-director of the Naomi Berrie Diabetes Center at CUMC. “The better we can understand the molecular machinery of obesity, the better we will be able to manipulate these mechanisms and help people lose weight.”

The study was published on May 6 in the online edition of Cell Metabolism.

Since 2007, researchers have known that common variants in the fat mass and obesity-associated protein gene, also known as FTO, are strongly associated with increased body weight in adults. But it was not understood how alterations in FTO might contribute to obesity. “Studies have shown that knocking out FTO in mice doesn’t necessarily lead to obesity, and not all humans with FTO variants are obese,” said Dr. Leibel. “Something else is going on at this location that we were missing.”

In experiments with mice, the CUMC team observed that as FTO expression increased or decreased, so did the expression of a nearby gene, RPGRIP1L. RPGRIP1L is known to play a role in regulating the primary cilium. “Aberrations in the cilium have been implicated in rare forms of obesity,” said Dr. Leibel. “But it wasn’t clear how this structure might be involved in garden-variety obesity.”

Dr. Leibel and his colleague, George Stratigopoulos, PhD, associate research scientist, hypothesized that common FTO variations in noncoding regions of the gene do not change its primary function, which is to produce an enzyme that modifies DNA and RNA. Instead, they suspected that FTO variations indirectly affect the expression of RPGRIP1L. “When Dr. Stratigopoulos analyzed the sequence of FTO’s intron—its noncoding, or nonprotein-producing, portion—we found that it serves as a binding site for a protein called CUX1,” said Dr. Leibel. “CUX1 is a transcription factor that modifies the expression of RPGRIP1L.”

Next, Dr. Stratigopoulos set out to determine whether RPGRIP1L plays a role in obesity. He created mice lacking one of their two RPGRIP1L genes, in effect reducing but not eliminating the gene’s function. (Mice that lack both copies of the gene have several serious defects that would obscure any effects on food intake.) Mice with one copy of RPGRIP1L had a higher food intake, gained significantly more weight, and had a higher percentage of body fat than controls.

In a subsequent experiment, the CUMC team found that RPGRIP1L-deficient mice had impaired leptin signaling. “The receptors didn’t convene properly on the cell surface around the base of the cilium,” said Dr. Leibel. “RPGRIP1L appears to play a role in getting leptin receptors to form clusters, where they are more efficient in signaling.”

“Overall,” said Dr. Leibel, “our findings open a window onto the possible role of the primary cilium in common forms of obesity.”

The CUMC team is now conducting studies to learn more about the various components of the FTO-RPGRIP1L pathway, which ciliary proteins are affected by changes in this pathway, and how these proteins mediate the actions of leptin receptors.

Home-Based Walking Program Eases Clogged Leg Arteries

American Heart Association

Study Highlights:

- A home-based exercise program helped people with clogged leg arteries walk farther and faster.
- Supervised exercise for PAD (peripheral artery disease) is not usually covered by insurance and is inaccessible for many people with this painful condition.
- Physicians should recommend walking even if their patients don’t have access to a supervised exercise program.

A home-based exercise program helped people with clogged leg arteries walk farther and faster, according to new research in the Journal of the American Heart Association. The program was still delivering benefits 12 months after participants started it.

Previously, studies have shown that supervised exercise can improve walking and lessen the symptoms of peripheral artery disease (PAD), but this is the first to document the long-term benefits of a home-based walking program.

“The problem with supervised exercise is that it takes many visits to a cardiac rehabilitation center or other exercise facility, and it is not covered by Medicare,” said Mary McGrae McDermott, M.D., lead author and the Jeremiah Stamler professor of medicine at the Northwestern University Feinberg School of Medicine in Chicago. “Our results should encourage physicians to recommend walking even if their patients do not have access to a supervised-exercise program.”

The study compared walking ability in patients and controls a year after the start of a six-month program that encouraged home-based walking. For the first six months, 81 patients participated in weekly meetings that provided support and skills training to help them adhere to the home exercise program. They also received phone calls encouraging continued walking during months 7-12.

Eighty-seven controls participated for a year in weekly educational meetings and received phone contact on unrelated health topics such as managing hypertension, cancer screening and vaccinations.

At 12 months, participants in the home-based program had increased the distance they could walk in six minutes from 355.4 to 381.9 meters, an improvement of about 87 feet. In contrast, the distance covered by the controls fell slightly, from 353.1 to 345.6 meters.
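As a sanity check on those numbers (the distances come from the study; the meters-to-feet factor is the standard conversion), the arithmetic works out as reported:

```python
# Verifying the reported six-minute-walk changes. Distances are from
# the study; 3.28084 is the standard feet-per-meter conversion.

M_TO_FT = 3.28084

exercise_gain_m = 381.9 - 355.4    # home-exercise group
control_change_m = 345.6 - 353.1   # control group

print(round(exercise_gain_m, 1))         # 26.5 meters gained
print(round(exercise_gain_m * M_TO_FT))  # 87 feet, as reported
print(round(control_change_m, 1))        # -7.5 meters for controls
```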

According to McDermott, walking exercise is the most effective non-invasive treatment for PAD, but a program must take into account that walking may cause cramp-like pain in leg muscles that don’t get sufficient oxygen. By alternating walking and rest, patients can build up the amount of time they can walk before pain occurs.

In the home program, patients were instructed to try to walk at least five days a week, building up to 50 minutes per session. When leg pain occurred, they were to stop and rest until their legs were comfortable again, and then resume walking.

“The results emphasize the importance of recognizing and treating PAD, a common condition that often remains undiagnosed and can become life-threatening as it restricts circulation to the legs, arms, feet, and kidneys,” McDermott said. “Patients with PAD are also at heightened risk for heart attack and stroke.”

“Don’t think walking problems are a normal part of aging. If you have leg pain, weakness, tingling or other difficulty walking, report it to your doctor and ask about the possibility you may have PAD. Diagnosing PAD is important because therapies can improve your health.”

Co-authors are Jack M. Guralnik, M.D., Ph.D.; Michael H. Criqui, M.D., M.P.H.; Luigi Ferrucci, M.D., Ph.D.; Lihui Zhao, Ph.D.; Kiang Liu, Ph.D.; Kathryn Domanchuk, B.S.; Bonnie Spring, Ph.D.; Lu Tian, Sc.D.; Melina Kibbe, M.D.; Yihua Liao, M.S.; Donald Lloyd-Jones, M.D.; and W. Jack Rejeski, Ph.D. Author disclosures are on the manuscript.

The National Heart, Lung, and Blood Institute and National Institute on Aging supported the study.

Breast Cancer Diagnosis Often Leads Women To Have An Unnecessary Double Mastectomy

Brett Smith for redOrbit.com – Your Universe Online

One year ago last week, Angelina Jolie announced in a New York Times op-ed that she had undergone a preventative double mastectomy – but now a new study indicates that surgically removing both breasts may be unnecessary for women who are diagnosed with cancer in a single breast.

Published in the journal JAMA Surgery, the new study found that approximately 70 percent of women who have both breasts removed after a breast cancer diagnosis in a single breast do so despite having a low risk of developing cancer in the healthy breast.

The results of the study are particularly concerning given reports that female breast cancer patients are increasingly choosing to undergo the preventative measure – called contralateral prophylactic mastectomy (CPM).

“Women appear to be using worry over cancer recurrence to choose contralateral prophylactic mastectomy. This does not make sense, because having a non-affected breast removed will not reduce the risk of recurrence in the affected breast,” said study author Sarah Hawley, an associate professor of internal medicine at the University of Michigan’s Medical School.

In the study, researchers examined more than 1,400 women who had been treated for breast cancer and who had not had a recurrence. They discovered that 8 percent of women had both breasts removed and 18 percent thought about it.

Overall, around three-quarters of patients said they were very concerned about their cancer recurring. Those who decided to have both breasts removed were considerably more likely to say they were worried about recurrence. However, a diagnosis of cancer in one breast does not boost the odds of breast cancer affecting the other breast for most women.

The study also looked into accepted clinical indications for double mastectomy, such as a patient’s family history of breast and ovarian cancer and the outcomes of any genetic testing.

Women with a family history of breast or ovarian cancer, or who test positive for mutations in the BRCA1 or BRCA2 genes, may be asked to consider having both breasts removed, because they are at an elevated risk of a new cancer developing in the other breast. This group represents approximately 10 percent of all women diagnosed with breast cancer. Women without these indications will probably not develop another cancer in the healthy breast.

The study found that among women undergoing double mastectomy, almost 70 percent had neither a family history nor a positive genetic test. Angelina Jolie did report testing positive for a mutation in the BRCA1 gene.

“For women who do not have a strong family history or a genetic finding, we would argue it’s probably not appropriate to get the unaffected breast removed,” Hawley said.

“Decision making surrounding early breast cancer, with respect to CPM in particular, provides an opportunity to encourage a supportive, shared decision-making approach,” wrote Shoshana M. Rosenberg and Dr. Ann H. Partridge of the Dana-Farber Cancer Institute, Boston in an editorial accompanying the study.

“Not only should pros and cons of different treatment options be communicated, but there needs to be consideration of the patient’s personal circumstances and perceptions, all the while addressing anxiety and concerns about breast cancer recurrence and new primary disease (and the distinction between the two),” the editorial continued. “Finding balance around this issue, like the decision process itself, should be a goal shared by patients and clinicians alike.”

SpaceX Goes From Suing Air Force To Potentially Launching Its Satellites

Lawrence LeBlond for redOrbit.com – Your Universe Online

Late last month Elon Musk announced he was displeased that the US Air Force shut his company, Space Exploration Technologies (SpaceX), out of the competition market for government launches. He was so annoyed by it that he filed a legal complaint against United Launch Alliance’s (ULA) monopoly over military launches.

Less than a week after that filing, a federal court judge issued a temporary injunction against ULA’s acquisition of rocket engines from Russia. While that ban was lifted a week later, after government documents cleared up some confusion over where the rockets actually came from, Russian officials subsequently moved to bar American companies from purchasing the country’s rocket engines, including the RD-180 that ULA buys for US Air Force launches.

Initially, ULA officials said they had enough engines in stock to continue launching government satellites for at least two years. But now it looks like the Air Force is not taking any chances and is moving forward with its plans to open competition to SpaceX.

The Air Force is working as fast as it can to certify SpaceX as a viable launch provider for the nation’s military and intelligence satellites, according to General William Shelton, who heads the Air Force Space Command.

Gen. Shelton told Reuters that SpaceX was likely to achieve certification by December or January, but the process could not be accelerated given the complex issues that still need to be addressed.

“It’s very difficult to pick up the pace on that,” Shelton said after a speech at a space conference hosted by the Space Foundation. In addition to certifying SpaceX’s three launches, the Air Force was also looking at the firm’s financial and auditing systems and manufacturing processes, he noted.

Shelton maintained that the Air Force remains committed to increasing competition in the rocket launch market and is pressing forward to lower ULA’s launch costs.

THE COST OF BUSINESS

When ULA was launched in 2006 – a joint venture between Boeing and Lockheed Martin – the newly formed company was supposed to save taxpayers $100-$150 million per year in rocket launches, according to SpaceX.

However, SpaceX maintained that instead of saving money, rocket launch costs skyrocketed. Musk maintained that since 2006, vehicle costs have climbed from about $100 million per vehicle to more than $400 million per vehicle, making the ULA launch vehicles the most expensive rockets in not just the US, but the entire world.

Musk said that his company could save the government $1 billion a year by moving to SpaceX and its rockets.

Gen. Shelton said that SpaceX could possibly compete for some launches before it becomes certified, with awards contingent on approval. He said the lawsuit was a surprise, given that the military was already dedicating $60 million and 100 people to the certification process for SpaceX.

“Generally,” he said to Reuters, “the person you’re going to do business with, you don’t sue them.”

Eric Fanning, undersecretary of the Air Force, said during a recent conference that the Air Force is committed to increasing competition and determined to drive down the cost of the existing ULA rockets. He added that the Air Force is also reassessing its reliance on the Russian-made RD-180 engines used in ULA’s Atlas rockets, due to the recent tensions between Russia and the US – namely Russia’s annexation of Crimea.

Fanning said the Air Force is also looking at a number of long-term options, including developing an alternative engine, bringing in new entrants, or increasing use of ULA’s Delta rockets, which do not rely on Russian engines.

Gen. Shelton said he was aware of the recent threats made by some Russian officials that the RD-180 rocket engines would be taken off the market for US buyers, but according to Reuters, he noted that no official position has been conveyed by Russia as of yet.

If the RD-180 shipments did come to a halt, sources familiar with the report noted that it would have a significant impact on the US military launch program. In the short-term, there is no viable mitigation strategy to replace the Russian rocket engines. In the long-term, the Air Force could do well to boost funding to develop a new US engine by 2017, the sources recommended.

Gen. Shelton said he favored the idea of a US rocket engine to reduce reliance on foreign suppliers. But, he added, it would cost more than a billion dollars, could take five years or longer to complete, and it remains unclear where that funding would come from.

He cautioned, however, against reading too much into the Russian comments on the rocket engines, noting that ties between the Russian engine builder NPO Energomash and ULA were proceeding as “business as usual,” according to Reuters.

HARMFUL ACTIONS

While the Air Force remains committed to certifying SpaceX, some feel the recent actions by its founder and CEO Elon Musk may have done more harm than good.

“If recent news reports are accurate, it affirms that SpaceX’s irresponsible actions have created unnecessary distractions, threatened U.S. military satellite operations, and undermined our future relationship with the International Space Station,” Jessica Rye, spokeswoman for ULA, said in an email to Bloomberg last week.

Musk’s actions may have hurt SpaceX’s relationship with the Department of Defense and NASA, according to James Thurber, director of the Center for Congressional and Presidential Studies at American University in Washington.

“When you start pushing big boys around, people may not get mad, but they’ll get even,” Thurber told Bloomberg. “Doors will close, and it will make it much more difficult for him in the advocacy world.”

SpaceX, despite its hardball tactics, may not be too worried about making a few enemies. It doesn’t rely on others to build its rockets, its engines or its spacecraft, and doesn’t need lucrative partnership deals to stay afloat, noted Jeff Foust, a senior analyst at Futron Corp.

“Among the big aerospace companies, the company you’re competing against today you might be partnering with tomorrow,” he told Bloomberg. “SpaceX isn’t like that. They’re not jeopardizing any conceivable business relationship that they might have.”

Still, Musk wanted a chance to compete with ULA for government satellite launches and, from his perspective, it looked as if the Air Force was shutting him and his company out. Musk maintained that the complaint was filed only as a way of getting his foot back in the door to compete for those military launches – a chance that, according to Gen. Shelton’s comments, was never lost.

As part of his goal to bring competition to the marketplace, Musk is spending more than a million dollars for Washington influence. In his fight for military launches, he has found allies among US lawmakers who also want to see competition open up in the military launch market.

“It’s very clear they made a commitment to increase competition, and then the Air Force reversed itself,” Arizona Senator John McCain, the top Republican on the Homeland Security and Governmental Affairs investigations subcommittee, said in an interview with Bloomberg. “It just doesn’t seem right to me.”

When asked by Bloomberg whether the Air Force might retaliate against Musk following his lawsuit, McCain said that one of his jobs is “to make sure they don’t.”

Rich Biodiversity Of Species Makes Annual Top Ten List Of Discoveries

Lawrence LeBlond for redOrbit.com – Your Universe Online

Each year, an international committee of taxonomists and related experts compiles a list of the top 10 species of the year. In 2013, the list included such species as the Lesula monkey, lightning roaches and the lyre sponge. Not to be outdone, 2014’s list includes some pretty impressive species as well.

The list is compiled annually by the SUNY College of Environmental Science and Forestry’s (ESF) International Institute for Species Exploration (IISE). The top 10 species list is derived from all species that are discovered and/or named in the previous year – in 2013, roughly 18,000 new species of animal and plant were named. The list was released today (May 22) to coincide with the birthday of Carolus Linnaeus (May 23), an 18th century Swedish botanist who is considered the father of modern taxonomy.

This year’s list includes a carnivorous mammal, a strange reptile, several tiny creatures, a brightly colored fungus, and a large plant.

The annual list, which was established in 2008, calls attention to discoveries that are made even as many species are going extinct faster than they are being identified.

“The majority of people are unaware of the dimensions of the biodiversity crisis,” said Dr. Quentin Wheeler, founding director of the IISE and ESF president.

“The top 10 is designed to bring attention to the unsung heroes addressing the biodiversity crisis by working to complete an inventory of earth’s plants, animals and microbes. Each year a small, dedicated community of taxonomists and curators substantively improve our understanding of the diversity of life and the wondrous ways in which species have adapted for survival,” Wheeler said in a statement.

“One of the most inspiring facts about the top 10 species of 2014 is that not all of the ‘big’ species are already known or documented,” said Dr. Antonio Valdecasas, chair of the selection committee and a biologist and research zoologist with Museo Nacional de Ciencias Naturales in Madrid, Spain. “One species of mammal and one tree species confirm that the species waiting to be discovered are not only on the microscopic scale.”

The popular consensus among scientists is that some 10 million species are still awaiting discovery, five times the number already known to science. If bacteria and the microbes called archaea are counted, that number rises to 50 million.

“We have not increased our rate of species discovery and description at all since before World War II… It’s pretty much a steady state of 17,000 to 18,000 species a year. Given the technological advances in recent decades, I find that really inexcusable. We could easily be working an order of magnitude faster,” Wheeler told National Geographic.

And it doesn’t help that many species are going extinct before they can be described. Human encroachment on natural habitats, deforestation, pollution, and climate change are all forcing species biodiversity out the door.

A recent study in the journal Science forecasts that if the current extinction rates continue, Earth will reach mass extinction status – defined as the loss of 75 percent of species – within 300 years, NatGeo reports, noting that the last time that happened was 65 million years ago when the dinosaurs were wiped out.

“Some people say, ‘Well, the Earth recovered from that one and is quite diverse and pleasant today,'” said Wheeler. “And that’s true. The problem is it took tens of millions of years—and it wouldn’t have been a very nice place to live during those tens of millions of years.”

Without further ado, we bring you the top 10 species of 2014, which ESF-IISE notes were not numbered, but rather ordered alphabetically by genus.

Bassaricyon neblina

Up first is the olinguito, a carnivorous, tree-dwelling mammal found in the cloud forests of the Andes Mountains in Colombia and Ecuador. This small mammal – weighing about 4.5 pounds – resembles a cross between a slinky cat and a wide-eyed teddy bear. It is an arboreal carnivore in the family Procyonidae, which also includes the raccoon. Most notably, the olinguito is the first carnivorous mammal described in the Western Hemisphere in 35 years. Because it depends largely on cloud forest habitat, it is likely threatened by deforestation.

Dracaena kaweesakii

Second on the list is Kaweesak’s Dragon Tree, also known as the Mother of Dragons. It is somewhat surprising that this 40-foot-tall tree, native to Thailand, went unnoticed for so long. It has beautiful sword-shaped leaves with white edges and cream-colored flowers with bright orange filaments. The tree is found in the limestone mountains of the Loei and Lop Buri Provinces of Thailand and in nearby Burma. It has been assigned endangered status because only about 2,500 specimens are known, and they grow only on limestone that is being extracted for the manufacture of concrete.

Edwardsiella andrillae

Up next is a small creature known as the ANDRILL anemone. This small sea creature was discovered in the most unlikely of places – under a glacier on the Ross Ice Shelf in Antarctica. It is not yet understood how this species withstands the extreme and harsh conditions of its habitat, and it is the first species of sea anemone reported to live in ice. The creature was discovered when the Antarctic Geological Drilling Program (ANDRILL) sent a remotely operated submersible through holes drilled in the ice shelf. The submersible captured images of the small critters – each less than an inch long. These pale yellow anemones burrow into the ice shelf and leave their 20-plus tentacles dangling in the frigid seawater below.

Liropus minisculus

A skeleton shrimp is next on the top 10 list. This see-through crustacean was discovered in a cave on Santa Catalina Island, off the coast of Southern California. A distant relative of the shrimp commonly consumed by seafood lovers, it represents the first genus of skeleton shrimp reported from the northeast Pacific. Its eerie, translucent appearance makes it resemble a bony structure. Males are about an eighth of an inch long; females are smaller, at less than a tenth of an inch.

Penicillium vanoranjei

We next move on to the Orange Penicillium, a species of fungus isolated from soil in Tunisia and distinguished by the bright orange color it displays when growing in colonies. The species was named as a tribute to the Dutch royal family, and more specifically to His Royal Highness the Prince of Orange. The fungus produces a sheet-like extracellular matrix that may serve as protection during times of drought.

Saltuarius eximius

A striking reptile makes its way onto the 2014 top 10 list. Called the leaf-tailed gecko, this hard-to-spot reptile was discovered last year in the isolated rain forests of the Melville Range in Australia. The species is a master of camouflage: its extremely wide tail and mottled coloration help it blend in with the leaf litter and rocky debris of its natural habitat. It has longer limbs, a more slender body and larger eyes than other species in the genus Saltuarius. Native to rain forests and rocky habitats, the species is largely nocturnal, waiting for prey to approach while sitting on the vertical surfaces of its habitat. Because surveys of the surrounding area revealed no other specimens, it is assumed to be a very rare species.

Spiculosiphon oceana

This amoeboid protist is a giant in the world of single-celled organisms. Discovered in underwater caves 30 miles off the southeast coast of Spain, the one-celled amoeba can grow to two inches in length. This Mediterranean foram – part of a distinct group of amoeboids – gathers pieces of silica spicules (sponge fragments) from its surroundings and uses them like Lego blocks to construct a shell. While not a sponge itself, it looks and feeds like one, extending arms known as pseudopods outside its shell to feed on invertebrates that get trapped in the spiny structure. Interestingly, the amoeboid was discovered in the same caves where carnivorous sponges were first found.

Tersicoccus phoenicis

Last year brought the first evidence of contamination in the clean rooms where spacecraft are assembled. The contamination comes in the form of clean-room microbes, first discovered at a spacecraft facility in Florida and subsequently at another in French Guiana, more than 2,500 miles away. These microbes are a serious problem because we do not want to send anything into space that could contaminate the planets, such as Mars, that these spacecraft visit. Clean rooms are frequently sterilized to keep microbes out, but some microbes, it seems, are resistant to extreme dryness; wide ranges of pH, temperature and salt concentration; and exposure to UV light and hydrogen peroxide.

[ Watch the Video: Super Microbes Discovered In Two Spacecraft Clean Rooms ]

Tinkerbella nana

Included in the 2014 top 10 list is a tiny species of fairyfly called the Tinkerbell fairyfly. Fairyflies got their name from their tiny size and delicately fringed wings; they belong to the parasitoid wasp family Mymaridae. This species gets its name from Peter Pan’s fairy sidekick Tinkerbell – not surprising, since it is only about one-hundredth of an inch long (250 micrometers). The new species, joining the nearly 1,400 already known in its family, was collected by sweeping vegetation in secondary growth forest at the La Selva Biological Station in Costa Rica. Although its host is not yet known, fairyflies are known for attacking the eggs of other insects, and this species presumably has a life span of no more than a few days.

Zospeum tholussum

Last, but not least, on the list is the domed land snail, which has a ghostly appearance. This terrestrial snail lives in complete darkness 3,000 feet below ground in the Lukina Jama-Trojama caves of western Croatia and lacks eyes, which are unnecessary in its habitat. The snail also lacks shell pigmentation, giving it a ghostly, translucent appearance. Only a single living specimen was collected, in a large cavern among rocks near a stream of running water, though many empty shells were also found. As with most snails, this ghost snail is a slow-goer, moving along at just a few millimeters per week. These small snails, measuring less than one-tenth of an inch, travel more easily in water currents or by hitching a ride with other cave inhabitants, such as bats or crickets.

CRITICAL INVENTORY

“I have been participating in the top 10 since its beginning in 2008, and I am always surprised by the constant number of species discovered in all the organic kingdoms,” Valdecasas said. “It makes selecting the species challenging and demanding, but at the same time, inspiring. We are very far from having exhausted the knowledge of the biodiversity on Earth.”

Wheeler offered three reasons why an inventory of the world’s plants and animals is critical:

  • Without a baseline of what exists, humans will not know if something disappears, moves in response to climate change or invades new habitats. “As long as we remain ignorant of the vast majority of species, we unnecessarily limit our effectiveness at conservation goals.”
  • Billions of years of natural selection have driven plants and animals to solve the same survival problems that humans face. “By studying the millions of ways in which organisms have met challenges, we open a great library of possibilities for meeting our own needs more sustainably.”
  • Simple curiosity is a factor. “If we want to understand what it means to be human the answer is buried deep in evolutionary history. We are a modified version of our ancestors, and they of theirs … all the way back to the first species on Earth. With the loss of every species, we lose one chapter in our own story that we’ll never get back.”

Wheeler said he hopes the list draws attention to the urgent need to complete an inventory on all of the species on the planet.

“Advances in technology and communication mean that the centuries-old dream of knowing all species is within our reach. The benefits of learning our world’s species are incalculable and the single most important step we can take in preparation for an uncertain environmental future,” he explained.

In conclusion, Valdecasas conjured the image of a human getting a one-way ticket to Mars. At some point after arriving on the desolate, arid landscape, the space traveler would start pining for the extremely rich biodiversity that only the Earth provides, as well as the sights, smells and sounds of nature.

“Nothing, nothing could ever compensate for that,” he said. “Now, think how fortunate we are to have at hand such a universe.”

Image 2 (below): The Cape Melville leaf-tailed gecko, a spectacular new species from remote northern Australia. Credit: Conrad Hoskin

Eating A Healthy Diet Can Help Improve Lung Function In COPD Patients

April Flowers for redOrbit.com – Your Universe Online

An international team of researchers has discovered a direct link between eating fish, fruit and dairy products and improved lung function in patients with chronic obstructive pulmonary disease (COPD). The study, which is being presented at the American Thoracic Society (ATS) 2014 International Conference, specifically examined COPD patients’ lung function within 24 hours of their consuming fish, cheese, grapefruit and bananas.

“Diet is a potentially modifiable risk factor in the development and progression of many diseases, and there is evidence that diet plays a role in both the development and clinical features of COPD,” said Corinne Hanson, PhD, assistant professor of Medical Nutrition at the University of Nebraska Medical Center. “This study aimed to evaluate that association.”

The research team used data collected as part of the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study, which was designed to track the progression of COPD and identify biomarkers associated with the disease. The team analyzed limited diet records for 2,167 ECLIPSE subjects. The participants provided dietary intake information eight times over a three-year period, each time reporting the amount of a specific food they had consumed during the past 24 hours.

Standard lung function measurements for the same group were also analyzed. These assessments included the six-minute walk test (SMWT), St. George’s Respiratory Questionnaire (SGRQ) scores and inflammatory biomarkers. The team adjusted their findings for age, sex, body mass index (BMI) and smoking.

According to the results, people who reported recently consuming fish, grapefruit, bananas or cheese showed improvement in lung function, less emphysema, improved scores on the SMWT, improved scores on the SGRQ, and a decrease in certain inflammatory biomarkers associated with poor lung function including white blood cells and C-reactive protein.

“This study demonstrates the nearly immediate effects a healthy diet can have on lung function in a large and well-characterized population of COPD patients,” Hanson said. “It also demonstrates the potential need for dietary and nutritional counseling in patients who have COPD.”

Hanson believes that the link between diet as a modifiable risk factor in COPD and the results of the new study deserves further investigation.

Delivering Drugs To The Brain Via Nasal Spray

University of Southern Denmark
When the doctor gives us medicine, it is often in the form of a pill. But when it comes to brain diseases, pills are actually an extremely inefficient way to deliver drugs to the brain, and according to researchers from the University of Southern Denmark, we need to find new and more efficient ways of transporting drugs there. Spraying the patient’s nose could be one such way.

Every time we have an infection or a headache and take a pill, we get far more drug than our body actually needs. The reason is that only a fraction of the drug in a pill reaches the right places in the body; the rest never reaches its destination and may cause unwelcome side effects before it is flushed out of the body. This kind of major overdosing is especially pronounced when doctors treat brain diseases, because drugs do not easily enter the brain.

“People with brain diseases are often given huge amounts of unnecessary drugs. During a long life, or if you have a chronic disease, this may become problematic for your health,” says Massimiliano Di Cagno, assistant professor at the Department of Physics, Chemistry and Pharmacy, University of Southern Denmark.

He is concerned with finding more efficient ways of delivering drugs to the brain. He and his colleagues at the University of Southern Denmark and Aalborg University have turned their attention to the nose – specifically the nasal wall and the mucosa that covers it.

As cocaine use demonstrates, substances can be absorbed extremely quickly and directly through the nose. Many medical substances, however, need help to be transported through the nasal wall and onward to the relevant places in the brain.

Researchers have long struggled with this challenge and have come up with various transport vehicles that are very good at carrying active ingredients through the nasal wall into the brain. The problem with these vehicles, though, is that they cannot release their cargo of drugs once they have reached the inside of the brain. The drugs stay locked inside the strong vehicles.

“If the drugs cannot get out of their vehicles, they are no help to the patient. So we needed to develop a vehicle that does not lock the drug in,” explains Massimiliano Di Cagno.

The vehicles for drug delivery through the nose are typically made of so-called polymers. A polymer is a large molecule composed of many repeating units of one or more types of atoms or groups of atoms bound to each other. Polymers can be natural or synthetic, simple or complex.
Direct track to the brain

Massimiliano Di Cagno and his colleagues tested a natural sugar polymer, and they now report that this particular polymer is not only capable of carrying drugs through the nasal wall but also – and most importantly – of releasing the drug where it is needed.

“This is an important breakthrough, which will bring us closer to delivering brain drugs by nasal spray,” says Massimiliano Di Cagno.

With this discovery, two of the three major challenges in nasal delivery of brain drugs have been met:

“We have solved the problem of getting the drug through the nose, and we have solved the problem of getting the drug released once it has entered the brain. Now there is a third major challenge left: To secure a steady supply of drugs over a long period. This is especially important if you are a chronic patient and need drug delivery every hour or so,” says Massimiliano Di Cagno.

When a patient sprays a solution of active drugs into the nasal cavity, the solution hits the nasal wall and travels from there through the wall to the relevant places in the brain.

“But gravity also rules inside the nose cavity and therefore the spray solution will start to run down as soon as it has been sprayed up the nose. We need it to cling to the nasal wall for a long time, so we need to invent some kind of glue that will help the solution stick to the nasal wall and not run down and out of the nose within minutes,” says Massimiliano Di Cagno.

The work is a collaboration between Massimiliano Di Cagno, Professor Annette Bauer-Brandl and Associate Professor Judith Kuntsche of the Department of Physics, Chemistry and Pharmacy at the University of Southern Denmark, and Assistant Professor Thorbjorn Terndrup Nielsen and Associate Professor Kim Lambertsen Larsen of the Department of Chemistry and Environmental Engineering at Aalborg University. The team’s work is published in the International Journal of Pharmaceutics.

Paper-Based Diagnostics, Made With A Scrapbooking Tool, Could Curb Hepatitis C Pandemic

American Chemical Society

To the relief of patients diagnosed with hepatitis C, the U.S. Food and Drug Administration approved two new treatments late last year, and a few more are on the way. Now scientists are tackling another side of the problem: identifying the millions more who have the virus but don’t know it – and unwittingly pass it on. A report in the ACS journal Analytical Chemistry describes a novel, scrapbook-inspired test that does just that.

Xuan Mu, Zhi Zheng and colleagues point out that the hepatitis C virus (HCV), a blood-borne pathogen that can cause liver cirrhosis, cancer and even death, kills more people in the U.S. than HIV. It also infects an estimated 150 million people around the world. Although diagnostic tests exist, they require an initial screening and then a costly second test for confirmation. The extra office visits, money and time required for a definitive diagnosis mean that many people simply can’t or won’t follow up. To make diagnosis more accessible, the researchers took advantage of recently developed, inexpensive paper-based medical technologies and applied them to HCV screening.

Taking a page from the popular scrapbooking pastime, the scientists used a flower-shaped metal paper cutter to punch out shapes from special paper for their diagnostic test. The method solves the problem of patterning the paper – made of nitrocellulose, a highly flammable substance – without using heat. The researchers add antigens, antibodies and other chemicals to the paper to test patient samples. With one flower-shaped paper, they can conduct both HCV tests on a sample simultaneously, in just minutes instead of hours.

Upcoming Camelopardalid Meteor Shower Could Rival The Perseids

[ Watch the Video: ScienceCasts – NASA On The Lookout For A New Meteor Shower ]

Brett Smith for redOrbit.com – Your Universe Online

May isn’t exactly known for its meteor showers. In fact, this month’s Camelopardalid meteor shower, caused by dust from periodic comet 209P/LINEAR, has technically never even been seen before. However, astronomers have predicted that May 2014 could see a Camelopardalid meteor shower that rivals the year’s biggest display – the Perseids of August.

This year could bring such a historic display, according to Bill Cooke, head of NASA’s Meteoroid Environment Office, who plans to head out and see the shower with his own two eyes instead of simply reviewing images captured by the space agency’s nationwide network of fireball cameras.

“There could be a new meteor shower, and I want to see it with my own eyes,” Cooke said. “Some forecasters have predicted more than 200 meteors per hour.”

When a comet crosses Earth’s orbit as it circles the Sun, it leaves behind streams of debris for the Earth to plow through around the same time every year. The relatively faint comet behind the Camelopardalids, Comet 209P/LINEAR, was discovered in 2004 and was found to orbit the Sun once every five years.

In 2012, meteor experts Esko Lyytinen of Finland and Peter Jenniskens at NASA Ames Research Center said Earth was due to have an encounter with debris from Comet 209P/LINEAR as streams of dust released by the comet during the 1800s would intersect with Earth’s orbit on May 24, 2014. The end result, they explained, might be a major meteor episode.

The consensus among astronomers seems to be that the Camelopardalids will indeed be visible this year. However, experts aren’t certain if the meteor shower will be a major event or a major fail. The amount of activity will depend on how much debris was ejected more than 100 years ago.

“We have no idea what the comet was doing in the 1800s,” Cooke said, adding, “there could be a great meteor shower—or a complete dud.”

To see just how much activity there will be, NASA suggested watching the skies between 2 and 4 a.m. EDT on May 24. That’s when models project Earth will encounter most of the comet’s debris. North American observers are ideally situated because peak activity should occur during nighttime hours, while the radiant – the point in the sky from which the fireballs will appear to originate – is high in the sky.

“We expect these meteors to radiate from a point in Camelopardalis, also known as ‘the giraffe’, a faint constellation near the North Star,” Cooke said.  “It will be up all night long for anyone who wishes to watch throughout the night.”

NASA warned that major fireball outbursts could take place before or after that early morning window. Since this could be the first time anyone has seen the Camelopardalids, skywatchers should be ready for anything, the space agency said.

If the Camelopardalids do turn out to be a major letdown, NASA said, the crescent Moon and Venus will converge closely the next morning as a bit of consolation for amateur astronomers.

Image 2 (below): Map of projected peak viewing for 2014 May Camelopardalids meteor shower. Credit: NASA/MSFC/Danielle Moser

Many Healthy Americans Harbor Human Papilloma Viruses

April Flowers for redOrbit.com – Your Universe Online

There are currently 148 known strains of the human papillomavirus (HPV), and a new study led by NYU Langone Medical Center reveals that 69 percent of otherwise healthy US adults are infected with one or more strains.

The findings, presented at the 2014 American Society for Microbiology Meeting, are derived from what is believed to be the largest and most detailed genetic analysis of its kind. In the 103 people whose DNA was publicly available through a government database, the researchers found 109 strains of HPV. Only four participants carried either of the two HPV strains known to cause most cases of cervical cancer, some throat cancers, and genital warts.

The majority of the 109 strains appear, so far, to be harmless and can remain dormant in the body for years. The overwhelming presence of HPV suggests a delicate balancing act in the body, where many viral strains keep one another in check and prevent any single strain from spreading uncontrollably. Researchers are increasingly aware that HPV can also spread through skin-to-skin contact; even so, it remains the most common sexually transmitted infection (STI) in the US – so common, in fact, that if estimates are correct, nearly all men and women contract some strain of it during their lives.

“Our study offers initial and broad evidence of a seemingly ‘normal’ HPV viral biome in people that does not necessarily cause disease and that could very well mimic the highly varied bacterial environment in the body, or microbiome, which is key to maintaining good health,” says NYU Langone pathologist and associate professor Zhiheng Pei, who presented the findings this week.

NYU Langone research scientist Yingfei Ma, PhD, said “the HPV ‘community’ in healthy people is surprisingly more vast and complex than previously thought, and much further monitoring and research is needed to determine how the various non-cancer-causing HPV genotypes interact with the cancer-causing strains, such as genotypes 16 and 18, and what causes these strains to trigger cancer.”

During the two-year study, the research team analyzed data obtained from the National Institutes of Health (NIH) Human Microbiome Project. The project gathers information on the effect microorganisms have on human health.

The researchers used a technique called shotgun sequencing, which deciphers the genetic code of long strands of DNA in random fragments, to assemble a comprehensive DNA analysis of the samples. Shotgun sequencing allowed the team to sort through the vast amounts of genetic material contained in 748 tissue swabs collected from the major organs (skin, mouth, vagina, and gut) of healthy participants ages 18 to 80.

Until the harm or benefit of the many HPV strains is understood, Dr. Pei cautions, people should not be overly concerned. He suggests that anyone worried consult a clinician or an infectious disease specialist to understand any potential threat before seeking treatment. Pei also recommends getting vaccinated against types 16 and 18 to prevent cervical cancer until a broader anti-HPV vaccine becomes available to target other harmful effects.

Additional key findings from the study include:

• Skin infections accounted for 61 percent, vaginal infections 41 percent, oral infections 30 percent, and gut infections 17 percent.

• Seventy-one participants were infected. Of these, 42 had HPV in only one organ (59 percent), 22 had it in two organs (31 percent), and seven had it in three (10 percent). None had HPV infections in all four organs tested.

• The most varied strains of HPV were found in the skin samples (80 types total, with 40 that were only found in skin). The second highest number was found in vaginal tissue (43 types of HPV, with 20 strains exclusive to the organ), followed by mouth tissue (33 types, of which five were exclusively oral in origin), and gut tissue (six types, all of which were found in other organs).

The findings illustrate the weaknesses of current clinical tests for HPV, which are designed to recognize only around a dozen of the viral types most closely tied to cervical cancer. To more accurately assess people’s true HPV infection status, Pei says, broader detection methods are necessary.

Dr. Ma says that the research team intends to continue their investigations by determining which non-cancer-causing strains of HPV might play a role in cancers of the cervix, mouth and skin, as well as developing more comprehensive diagnostic testing kits to detect all strains of HPV.

Infertility Could Be Linked To High Cholesterol In Would-Be Parents

redOrbit Staff & Wire Reports – Your Universe Online

Couples who are having trouble conceiving might want to have their cholesterol levels checked, according to new research published online Tuesday by The Journal of Clinical Endocrinology and Metabolism.

According to the study authors, high cholesterol could impair fertility: couples in which both partners had elevated levels took the longest time to conceive. Couples in which the woman, but not the man, had high cholesterol also took longer to conceive than couples in which both partners’ cholesterol levels fell within the acceptable range.

“We’ve long known that high cholesterol levels increase the risk for heart disease,” explained first author Dr. Enrique Schisterman, chief of the Epidemiology Branch at the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “In addition to safeguarding their health, our results suggest that couples wishing to achieve pregnancy could improve their chances by first ensuring that their cholesterol levels are in an acceptable range.”

Cholesterol, which is a waxy fat-like substance found within all of the body’s cells, is used to produce vitamin D, hormones and several other types of substances, the researchers explained. Approximately 71 million Americans over the age of 18, or one-third of all adults in the US, have elevated low-density lipoprotein (LDL) or “bad cholesterol” levels, according to the Centers for Disease Control and Prevention (CDC).

The population-based prospective cohort study looked at the pregnancy rate of 501 heterosexual couples from four counties in Michigan and 12 counties in Texas between 2005 and 2009. Each of the couples was involved in the Longitudinal Investigation of Fertility and the Environment (LIFE) study, which examined the link between fertility and exposure to environmental chemicals and lifestyle, and all of them were actively trying to conceive.

Over the course of the 12-month study, 347 of the couples became pregnant, while 54 did not conceive and 100 others withdrew from the study (including some who changed their mind about having a child). Dr. Schisterman and his colleagues measured the cholesterol levels of each potential mother and father by testing a blood sample that was taken at the outset of the study.

Rather than measuring HDL cholesterol, LDL cholesterol and triglycerides separately, the investigators opted to measure the total and free amounts of cholesterol in the blood. They found that couples in which one or both partners had high cholesterol took significantly longer to conceive. Blood cholesterol may be linked to fertility, they believe, because it is used to manufacture the sex hormones testosterone and estrogen.

“Couples in which both the prospective mother and father had high cholesterol levels took the longest time to conceive a child,” said Dr. Schisterman, who was joined on the study by experts from NICHD, Emory University and the University of Buffalo. “Our study also found couples in which the woman had high cholesterol and the man did not took longer to become pregnant than couples where both partners had cholesterol levels in the normal range.”

Time Warner, Comcast Top The Heap Of Most Hated US Companies

Peter Suciu for redOrbit.com – Your Universe Online
Some might say it is a match made in heaven. While the merger between the nation’s number one and two cable companies is still far from a done deal, on Tuesday Comcast and Time Warner Cable were named the most hated companies in America.

According to the American Customer Satisfaction Index (ACSI), which is put out quarterly by the University of Michigan’s Ross School of Business and is often considered the most comprehensive customer satisfaction survey in the United States, the two cable giants have the lowest customer satisfaction ratings of any ISPs in the nation.

Overall, it seems the cable industry is not well liked. While the average company scores in the high 70s on the index (out of 100) – which in school terms is still a “C” – subscription TV companies scored just 65. This includes traditional cable companies as well as fiber optic and satellite operators. Only Internet companies scored worse, at 63 – though it is worth noting that many cable companies are also Internet service providers, so it can be difficult to differentiate between the two.
The ACSI also found that customer satisfaction with subscription-TV services fell 4.4 percent in just the last year while Internet companies declined 4.1 percent.
Cable giants Comcast and Time Warner had the most dissatisfied customers, and both saw declines in user satisfaction. Comcast fell five percent to 60, while Time Warner registered a seven percent decline to 56, its lowest score to date. Cox Communications fell three points but was still the highest-rated cable company, with a score of 63.
Fiber optic and satellite services fared a little better. DirecTV lost four points while AT&T – which is in talks to buy the satellite TV provider – lost three points; both had ACSI scores of 69. Verizon Communications’ FiOS scored 68, while DISH Network scored 67.
This is nothing new, either. Price was deemed a factor: the cost of subscription TV rose six percent on average, four times the rate of inflation.
“It’s consistently been that way,” ACSI managing director David VanAmburg told MarketWatch. “It eats up a big chunk of the household budget…renters pay more for that than energy. Combine these high prices with issues of reliability and there’s a value for the money problem.”
Perhaps the ISPs and cable companies should look to mobile phones. While many of the same companies are in multiple businesses, consumer satisfaction with cell phones improved 2.6 percent to a score of 78, while wireless phone service remained unchanged for the year at 72.
Consumers are finding alternatives to the traditional pay TV services as well.
“The Internet has been a disruptor for many industries, and subscription TV and ISPs are no exception,” said Claes Fornell, ACSI chairman and founder, in a statement. “Over-the-top video services, like Netflix and Hulu, threaten subscription TV providers and also put pressure on ISP network infrastructure. Customers question the value proposition of both, as consumers pay for more than they need in terms of subscription TV and get less than they want in terms of Internet speeds and reliability.”
These findings come shortly after AT&T announced its plans to acquire DirecTV, and Comcast’s plans to buy Time Warner. The two cable giants have the most dissatisfied customers, but perhaps the merger could help their scores.
“Comcast and Time Warner assert their proposed merger will not reduce competition because there is little overlap in their service territories,” VanAmburg added in a statement. “Still, it’s a concern whenever two poor-performing service providers combine operations. ACSI data consistently show that mergers in service industries usually result in lower customer satisfaction, at least in the short term. It’s hard to see how combining two negatives will be a positive for consumers.”
Some have already noted that it takes a lot to come in so low in the ACSI rankings.
“What’s most amazing is that both Comcast and TWC have even lower customer satisfaction ratings than United Airlines, which has a notoriously bad reputation in an industry that, due in part to government security requirements, is known for delivering a miserable experience,” Brad Reed wrote for BGR. “Other notable companies that had higher customer satisfaction scores than Comcast and TWC included Bank of America, perennially unpopular wireless carrier Sprint, health insurance giant Aetna and the Los Angeles Department of Water and Power.
“It’s unfortunate that ACSI didn’t ask how people felt about Skeletor, Gargamel and Cobra Commander, because we get the feeling that Comcast and TWC would have had lower ratings than them as well — after all, if you’re more unpopular than major airlines, health insurance companies and even monopolistic utility companies, then being more disliked than ’80s cartoon super-villains doesn’t seem like much of a stretch,” Reed added.

Studies Focus On How Electronic Devices And Sedentary Behaviors Impact Kids

Brett Smith for redOrbit.com – Your Universe Online
Young children are often encouraged to pursue activities other than watching TV and playing with electronic devices, and a new report has found that sedentary children can exhibit increased risk for type 2 diabetes and cardiovascular disease as early as ages 6 to 8.
Published in the International Journal of Behavioral Nutrition and Physical Activity, the new report is based on the Finland-based Physical Activity and Nutrition in Children Study (PANIC), which included more than 510 children between the ages of 6 and 8 years old from 2007 to 2009.
Researchers from the University of Eastern Finland had parents complete a survey about their young participants and validated activity levels with a monitor that tracked heart rate and movement (accelerometry). The study team also assessed body fat percentage, waist circumference, blood glucose, blood pressure and several other biomarkers. Based on these data, the team calculated a cardiometabolic risk score for each child.
The researchers found low levels of exercise – and particularly unstructured exercise – are associated with heightened risk factors for type 2 diabetes and vascular diseases. Also, heavy usage of electronic media was associated with greater levels of risk factors in children. The greatest increases in risk factors were discovered in children with lowest amounts of physical activity and highest amounts of electronic media time.
Heavy use of electronic media boosted risk factors not only in inactive children, but also in children who were physically active. Furthermore, irregular eating habits and an unhealthy diet were associated with higher risk factors for type 2 diabetes and vascular diseases. These nutrition-related factors somewhat explain the connection between heavy use of electronic media and the risk factors, the researchers said.
“The results of our study emphasize increasing total and unstructured (physical activity) and decreasing watching TV and videos and other sedentary behaviors to reduce cardiometabolic risk among children,” the Finnish team wrote in their conclusion.
Another electronic device study, currently being conducted by Imperial College London, is looking at whether Wi-Fi transmissions from devices are damaging to a child’s brain. The UK team is monitoring the development of 11- to 14-year-olds while their device usage is tracked via a specially designed app.
“We need to investigate because it is a new technology,” Paul Elliott, director of the Medical Research Council Centre for Environment and Health at Imperial College, told The Telegraph.
“Scientific evidence available to date is reassuring and shows no association between exposure to radiofrequency waves from mobile phone use and brain cancer in adults in the short term,” he added. “But the evidence available regarding long term heavy use and children’s use is limited and less clear.”
The Study of Cognition, Adolescents and Mobile Phones (SCAMP), which claims to be the largest such study in the world, will examine cognitive functions such as memory and attention that develop as a child progresses through adolescence.
“As mobile phones are a new and widespread technology central to our lives, carrying out the SCAMP study is important in order to provide the evidence base with which to inform policy and through which parents and their children can make informed life choices,” said study investigator Mireille Toledano, an epidemiologist at Imperial College.

Sense Of Taste May Play A Role In Longevity

April Flowers for redOrbit.com – Your Universe Online

How food tastes can influence healthy habits by encouraging or discouraging us from eating certain foods, but can taste do more? According to two new studies published in the Proceedings of the National Academy of Sciences (PNAS), taste may play a role in our longevity as well.

Our taste buds control our longing for sweet, salty or bitter foods, but the research teams from the University of Michigan, Wayne State University and Friedrich Miescher Institute for Biomedical Research in Switzerland found that they may also play a powerful role in a long and healthy life.

The researchers found that suppressing the ability to taste food in fruit flies, regardless of how much the animal eats, can significantly increase or decrease lifespan, as well as promote healthy aging.

The fruit flies demonstrated that bitter tastes could negatively affect lifespan, while sweet tastes had positive effects on longevity. Most surprising, however, was the effect of the taste of water. Fruit flies unable to taste water lived up to 43 percent longer than flies that could, suggesting that the loss of taste might trigger physiological changes that help the body adapt to the perception that it is not getting adequate nutrients.

The researchers suggest fruit flies that have lost the ability to taste water might compensate for a perceived water shortage by storing greater amounts of fat. These fat stores would produce water internally. The researchers plan to continue their studies to understand why bitter and sweet tastes affect aging.

“This brings us further understanding about how sensory perception affects health. It turns out that taste buds are doing more than we think,” said Scott Pletcher, Ph.D., associate professor in the Department of Molecular and Integrative Physiology at the University of Michigan and research associate professor at the Institute of Gerontology and senior author on the first study.

“We know they’re able to help us avoid or be attracted to certain foods but in fruit flies, it appears that taste may also have a very profound effect on the physiological state and healthy aging,” said Pletcher, who worked with Michael Waterson, a PhD graduate student in U-M’s Cellular and Molecular Biology Program.

“Our world is shaped by our sensory abilities that help us navigate our surroundings and by dissecting how this affects aging, we can lay the groundwork for new ideas to improve our health,” said senior author of the other study, Joy Alcedo, Ph.D., assistant professor in the Department of Biological Sciences at Wayne State University, formerly of the Friedrich Miescher Institute for Biomedical Research in Switzerland. Ivan Ostojic, Ph.D., of the Friedrich Miescher Institute collaborated with Alcedo on this study.

Prior research has shown that sensory perception influences health-related characteristics such as athletic performance, type 2 diabetes and aging. These studies, however, are the first to provide an in-depth look at the role of taste perception.

“These findings help us better understand the influence of sensory signals, which we now know not only tune an organism into its environment but also cause substantial changes in physiology that affect overall health and longevity,” Waterson added. “We need further studies to help us apply this knowledge to health in humans potentially through tailored diets favoring certain tastes or even pharmaceutical compounds that target taste inputs without diet alterations.”

Your Next Smartphone Case Or Electric Vehicle Body May Be A Battery

Brett Smith for redOrbit.com – Your Universe Online

Because space within today’s electronic devices is at a premium and the devices require more and more power, engineers have been looking into making batteries that double as device cases, or possibly as the exteriors of electric vehicles.

According to a new study published in the journal Nano Letters, engineers at Vanderbilt University have developed small, grey wafers that could be the forerunner of these new types of batteries.

“These devices demonstrate – for the first time as far as we can tell – that it is possible to create materials that can store and discharge significant amounts of electricity while they are subject to realistic static loads and dynamic forces, such as vibrations or impacts,” said study author Cary Pint, an assistant professor of mechanical engineering at Vanderbilt University.

The novel device is actually a supercapacitor, which stores electricity by assembling charged ions on the surface of a porous material rather than holding it in chemicals, as conventional batteries do. Consequently, supercapacitors can charge and discharge in minutes rather than hours, and can function for millions of cycles rather than the thousands typical of conventional batteries.
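To see why supercapacitors trade energy density for speed and cycle life, consider the ideal capacitor energy relation E = ½CV². A minimal sketch using rough, assumed ballpark figures (not values reported in the study):

```python
# Illustrative comparison of energy stored per kilogram.
# All numbers below are assumed for illustration, not from the Vanderbilt study.

def capacitor_energy_wh(capacitance_f, voltage_v):
    """Ideal capacitor energy E = 1/2 * C * V^2, converted to watt-hours."""
    joules = 0.5 * capacitance_f * voltage_v ** 2
    return joules / 3600.0  # 1 Wh = 3600 J

# A hypothetical 3000 F, 2.7 V supercapacitor cell weighing about 0.5 kg:
supercap_wh = capacitor_energy_wh(3000, 2.7)   # roughly 3 Wh
supercap_wh_per_kg = supercap_wh / 0.5         # roughly 6 Wh/kg

# Typical lithium-ion cells store on the order of 100-250 Wh/kg, so the
# supercapacitor holds far less energy per unit mass, consistent with the
# order-of-magnitude gap described in the article.
print(round(supercap_wh, 2), round(supercap_wh_per_kg, 2))
```

The low energy density matters less if, as Pint argues below, the storage is built into structural material that must be there anyway.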

In the report, the researchers stated that their design reliably stored and released charge while being subjected to pressures of up to 44 pounds per square inch and vibrational accelerations much higher than those acting on turbine blades in a jet engine.

Pint said the robust nature of the novel device does not compromise its performance.

“In an unpackaged, structurally integrated state our supercapacitor can store more energy and operate at higher voltages than a packaged, off-the-shelf commercial supercapacitor, even under intense dynamic and static forces,” Pint said.

One major drawback with supercapacitors is the fact that they must be larger and heavier than lithium-ion batteries to store the same amount of energy. However, Pint said this supposed drawback isn’t so bad considering what ‘supercaps’ might be used for in the future.

“Battery performance metrics change when you’re putting energy storage into heavy materials that are already needed for structural integrity,” Pint said. “Supercapacitors store ten times less energy than current lithium-ion batteries, but they can last a thousand times longer. That means they are better suited for structural applications. It doesn’t make sense to develop materials to build a home, car chassis, or aerospace vehicle if you have to replace them every few years because they go dead.”

The energy-storing wafers consist of electrodes made from silicon, making the design best suited for consumer devices and solar cells. However, the researchers said their design should carry over to other materials, such as carbon nanotubes and metals like aluminum.

The US Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E) is currently investing $8.7 million in projects that concentrate expressly on integrating energy storage into structural materials. While there have been news reports of other attempts to create multifunctional materials or structural batteries, no published studies describe tests showing how these storage materials perform under realistic mechanical loads, Pint said.

Gum Disease Linked To Increased Risk Of Heart Disease

Brett Smith for redOrbit.com – Your Universe Online
A new study presented on Sunday at the annual meeting of the American Society for Microbiology has found that the same bacteria responsible for gum disease can also advance heart disease.
While doctors recognize that individuals with gum disease are at greater risk for heart disease, gum disease isn’t currently considered a standard risk factor for heart disease.
“We report evidence that introduction of oral bacteria into the bloodstream in mice increased risk factors for atherosclerotic heart disease,” said study investigator Irina M. Velsko, a graduate medical student at the University of Florida. “Our hope is that the American Heart Association will acknowledge causal links between oral disease and increased heart disease. That will change how physicians diagnose and treat heart disease patients.”
The study also implies that brushing, flossing and regular visits to the dentist can directly lower the risk of developing heart disease later in life. Gum disease, which is caused by bacteria, affects 46 percent of the US population, while heart disease is the leading cause of death in North America, the study team said. The American Heart Association published a statement in 2012 linking gum and heart disease, but did not find a cause-and-effect relationship.
In the new study, the scientists infected mice with four gum-disease bacteria and followed their progression. Once the bacteria were detected in the mouse gums, heart and aorta, the scientists saw an uptick in risk factors related to heart disease, such as cholesterol and inflammation.
“The mouth is the gateway to the body and our data provides one more piece of a growing body of research that points to direct connections between oral health and systemic health,” said study investigator Kesavalu Lakshmyya, from the University of Florida’s Department of Periodontology.
“In Western medicine there is a disconnect between oral health and general health in the rest of the body; Dentistry is a separate field of study from Medicine,” Lakshmyya noted.
The new study is actually part of a bigger research project funded by the National Institutes of Health/National Institute of Dental and Craniofacial Research and focused on how gum disease affects overall health.
“Our intent is to increase physician awareness of links between oral bacterial infection and heart disease,” said study investigator Alexandra Lucas, a cardiologist at the University of Florida. “Understanding the importance of treating gum disease in patients with heart disease will lead to future studies and recommendations for careful attention to oral health in order to protect patients against heart disease.”
Another study, published last year in the Journal of Alzheimer’s Disease, found a connection between gum disease and Alzheimer’s disease. In that study, researchers discovered that subjects with Alzheimer’s showed the presence of products from Porphyromonas gingivalis bacteria in the brain. P. gingivalis is the most common cause of chronic gum disease.
“We are working on the theory that when the brain is repeatedly exposed to bacteria and/or their debris from our gums, subsequent immune responses may lead to nerve cell death and possibly memory loss,” said study author Sim K. Singhrao, a medical and dental research fellow at the University of Central Lancashire. “Thus, continued visits to dental hygiene professionals throughout one’s life may be more important than currently envisaged with inferences for health outside of the mouth only.”

Urine Is Not So Sterile After All, Bacteria Found In Urine Of Women With Overactive Bladder

Brett Smith for redOrbit.com – Your Universe Online

Although urine has long been thought to be sterile, a new study has found that not only can bacteria survive in urine, they are relatively prolific in women with overactive bladder (OAB).

“Doctors have been trained to believe that urine is germ-free,” said Linda Brubaker, dean of Loyola University Chicago’s Stritch School of Medicine (SSOM) and an investigator in the new study, which was presented on Sunday as part of the 2014 General Meeting of the American Society for Microbiology being held in Boston.

“These findings challenge this notion, so this research opens the door to exciting new possibilities for patient treatment,” Brubaker added.

In their presentation, the study team said the bacteria had gone undetected because standard bacterial assays are not calibrated to detect low levels of bacteria in urine. Co-investigator Evann Hilt said, quite frankly, that researchers in the past simply weren’t looking for latent bacteria in urine.

“For all these years, (analyses looked at) whether or not a person had an infection and with that there was a certain threshold that they used to determine whether or not a person was suffering from an infection,” said Hilt, a graduate student at Loyola. “All the bacteria that we see present are below that threshold.”

To detect these bacteria, the researchers assessed urine specimens from 90 women with and without OAB using a new technique known as expanded quantitative urine culture (EQUC). EQUC was able to find bacteria that go undetected by the standard urine cultures normally used to diagnose urinary tract infections.

The team found that the bacteria in the bladders of healthy women differ considerably from those in women suffering from OAB, indicating that particular bladder bacteria are likely involved in the condition. Around 15 percent of women have OAB, and about 40 to 50 percent of affected women do not benefit from typical treatments. One possible reason for this poor treatment response could be the bacteria within these women’s bladders.

“The presence of certain bacteria in women with overactive bladder may contribute to OAB symptoms,” Hilt said. “Further research is needed to determine if these bacterial differences are clinically relevant for the millions of women with OAB and the doctors who treat them.”

“If we can determine that certain bacteria cause OAB symptoms, we may be able to better identify those at risk for this condition and more effectively treat them,” said co-investigator Alan Wolfe, professor of Microbiology and Immunology at SSOM.

“While traditional urine cultures have been the gold standard to identify urine disorders in the past, they do not detect most bacteria and have limited utility as a result,” added Paul Schreckenberger, director of the clinical microbiology laboratory in the Loyola University Health System. “They are not as comprehensive as the EQUC protocol used in this study.”

The study team said they now plan to find out which bacteria in the bladder are beneficial and which are hazardous. They said they will also investigate how these bacteria communicate with each other and interact with their host.

Greenland Could Become Greater Contributor To Sea Level Rise Than Previously Expected

University of California – Irvine
Major UCI-NASA work reveals long, deep valleys connecting ice cap to the ocean
Greenland’s icy reaches are far more vulnerable to warm ocean waters from climate change than had been thought, according to new research by UC Irvine and NASA glaciologists. The work, published May 18 in Nature Geoscience, shows previously uncharted deep valleys stretching for dozens of miles under the Greenland Ice Sheet.
The bedrock canyons sit well below sea level, meaning that as subtropical Atlantic waters hit the fronts of hundreds of glaciers, those edges will erode much further than had been assumed and release far greater amounts of water.
Ice melt from the subcontinent has already accelerated as warmer marine currents have migrated north, but older models predicted that once the glacier fronts retreated to higher ground in a few years, the ocean-induced melting would halt. Greenland’s frozen mass would stop shrinking, and its effect on rising sea levels would be curtailed.
“That turns out to be incorrect. The glaciers of Greenland are likely to retreat faster and farther inland than anticipated – and for much longer – according to this very different topography we’ve discovered beneath the ice,” said lead author Mathieu Morlighem, a UCI associate project scientist. “This has major implications, because the glacier melt will contribute much more to rising seas around the globe.”
To obtain the results, Morlighem developed a breakthrough method that for the first time offers a comprehensive view of Greenland’s entire periphery. It’s nearly impossible to accurately survey at ground level the subcontinent’s rugged, rocky subsurface, which descends as much as 3 miles beneath the thick ice cap.

Since the 1970s, limited ice thickness data has been collected via radar pinging of the boundary between the ice and the bedrock. Along the coastline, though, rough surface ice and pockets of water cluttered the radar sounding, so large swaths of the bed remained invisible.
Measurements of Greenland’s topography have tripled since 2009, thanks to NASA Operation IceBridge flights. But Morlighem quickly realized that while that data provided a fuller picture than had the earlier radar readings, there were still major gaps between the flight lines.
To reveal the full subterranean landscape, he designed a novel “mass conservation algorithm” that combined the previous ice thickness measurements with information on the velocity and direction of its movement and estimates of snowfall and surface melt.
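The physical principle behind a mass conservation approach can be sketched with a standard glaciology relation (the notation below is illustrative, not taken from the paper): ice is treated as incompressible, so the change in thickness at any point must balance the ice flowing in and out plus snowfall and melt.

```latex
% Depth-averaged mass conservation for ice:
%   H       - ice thickness (the unknown between flight lines)
%   \bar{v} - depth-averaged horizontal ice velocity (from satellite radar)
%   \dot{a} - apparent mass balance (snowfall minus surface and basal melt)
\frac{\partial H}{\partial t} + \nabla \cdot \left( H \, \bar{v} \right) = \dot{a}
```

Given the velocity field and mass balance estimates, the equation can be solved for thickness H, with the radar soundings along flight lines serving as constraints.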
The difference was spectacular. What appeared to be shallow glaciers at the very edges of Greenland are actually long, deep fingers stretching more than 100 kilometers (about 62 miles) inland.
“We anticipate that these results will have a profound and transforming impact on computer models of ice sheet evolution in Greenland in a warming climate,” the researchers conclude.
“Operation IceBridge vastly improved our knowledge of bed topography beneath the Greenland Ice Sheet,” said co-author Eric Rignot of UC Irvine and NASA’s Jet Propulsion Laboratory. “This new study takes a quantum leap at filling the remaining, critical data gaps on the map.”
Other co-authors are Jeremie Mouginot of UC Irvine and Helene Seroussi and Eric Larour of JPL. Funding was provided by NASA.
The team also reported stark new findings last week on accelerated glacial melt in West Antarctica. Together, the papers “suggest that the globe’s ice sheets will contribute far more to sea level rise than current projections show,” Rignot said.

Post-traumatic Stress Disorder Symptoms Common After A Stay In An Intensive Care Unit

American Thoracic Society
Patients who have survived a stay in the intensive care unit (ICU) have a greatly increased risk of developing symptoms of post-traumatic stress disorder (PTSD), according to a new study presented at the 2014 American Thoracic Society International Conference.
“An ICU stay can be traumatic for both patients and their families,” said Ann M. Parker, MD, a Pulmonary and Critical Care Medicine fellow at Johns Hopkins University in Baltimore, Maryland. “In our analysis of more than 3,400 ICU patients, we found that one quarter of ICU survivors exhibited symptoms of PTSD.” The systematic review of 28 studies involved a total of 3,428 adult ICU survivors. Evaluation included testing with validated PTSD instruments, most commonly the Impact of Events Scale (IES, score range 0-75), administered one month or more after the ICU stay.
In a subset of 429 patients assessed 1-6 months after their stay in the ICU, meta-analysis demonstrated the pooled prevalence of PTSD symptoms was 23% at an IES threshold of ≥35 and 42% at a threshold of ≥20. In 698 patients assessed at 7-12 months, corresponding pooled PTSD prevalence rates were 17% and 34%. Rates in other studies included in the analysis ranged from 5% to 62%.
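A pooled prevalence of this kind is, in its simplest form, a sample-size-weighted average across studies. The sketch below uses invented study counts (not the figures from this meta-analysis, which likely used more sophisticated random-effects weighting):

```python
# Hypothetical example of pooling PTSD-symptom prevalence across studies.
# Each tuple is (patients in study, patients above the IES threshold);
# the numbers are invented for illustration.

studies = [
    (120, 30),   # study A: 120 patients, 30 above threshold
    (200, 44),   # study B
    (109, 25),   # study C
]

def pooled_prevalence(studies):
    """Sample-size-weighted (fixed-effect style) pooled prevalence."""
    total_n = sum(n for n, _ in studies)
    total_cases = sum(cases for _, cases in studies)
    return total_cases / total_n

print(round(pooled_prevalence(studies), 3))  # 99/429, about 0.231
```

With these made-up counts the pooled estimate lands near the 23 percent figure quoted above; the point is only that small studies contribute less weight than large ones.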
Risk factors for the occurrence of PTSD symptoms included younger age, use of benzodiazepines and/or mechanical ventilation during the ICU stay, and post-ICU memories of frightening ICU experiences. In some studies of European ICU patients, keeping an ICU diary significantly reduced the occurrence of PTSD symptoms.
Importantly, all three studies that examined the issue demonstrated that more PTSD symptoms were associated with worse health-related quality of life.
A potential limitation of this systematic review is the variability of patient populations and PTSD survey instruments studied, which makes direct comparison between studies difficult.
“Our meta-analysis confirms that a large proportion of patients who survive an ICU stay will suffer PTSD symptoms, which are associated with worse health-related quality of life,” said Thiti Sricharoenchai, MD, Instructor in the Division of Pulmonary and Critical Care Medicine at Thammasat University, Thailand, who conducted this study as a post-doctoral research fellow at Johns Hopkins University. “Further research should focus on PTSD screening, prevention, and treatment in this vulnerable patient population.”
Dr. Parker and her mentor, Dr. Dale Needham, Associate Professor of Pulmonary and Critical Care Medicine at Johns Hopkins University, are currently planning a study to evaluate an out-patient intervention to address PTSD symptoms in ICU survivors.

Lyme Disease: Ten Things You Always Wanted To Know About Ticks, But Were Afraid To Ask

Cheryl Dybas, NSF
To find out how to steer clear of Lyme disease during “picnic season” – a time when people are more likely to pick up ticks – the National Science Foundation spoke with NSF-funded disease ecologist Rick Ostfeld of the Cary Institute of Ecosystem Studies in Millbrook, N.Y., and program director Sam Scheiner of NSF’s Division of Environmental Biology.
Ostfeld’s research is funded by the joint NSF-NIH Ecology and Evolution of Infectious Diseases Program and NSF’s Long-Term Research in Environmental Biology Program.
1) What have we learned about how Lyme disease is transmitted?
Lyme disease can develop when someone is bitten by a blacklegged tick infected with a virulent strain of the bacterium Borrelia burgdorferi. At least 15 strains of the bacterium are found in ticks, but only a few turn up in Lyme disease patients, says Ostfeld.
Newly hatched larval ticks are born without the Lyme bacterium. They may acquire it, however, if they feast on a blood meal from an infected host. Scientists have learned that white-footed mice, eastern chipmunks and short-tailed shrews can transfer the Lyme bacterium to larval ticks.
Tick nymphs infected with Lyme bacteria pose the biggest threat to humans; their numbers are linked with the size of mouse populations.
2) The list of illnesses spread by blacklegged ticks seems to increase each year. What’s going on?
People in the Northeast, Mid-Atlantic, and Midwest have experienced waves of “new” tick-borne diseases. It started in the 1980s with Lyme disease. Then in the 1990s it was anaplasmosis, followed in the early 2000s by babesiosis. Now we may be seeing the emergence of Borrelia miyamotoi, says Ostfeld.
The pathogens are transmitted by blacklegged ticks. “We suspect that they were present for decades in isolated geographic areas, but we’re working to understand what’s triggering their spread,” says Ostfeld. For example, while Lyme disease bacteria can be carried long distances by birds, Anaplasma and Babesia don’t fare well in birds.
3) How do small mammals play a part?
Mice, chipmunks and shrews play a major role in infecting blacklegged ticks with the pathogens that cause Lyme disease, anaplasmosis, and babesiosis. Ticks feeding on these animals can acquire two or even all three pathogens from a single bloodmeal, says Ostfeld.
Health care providers need to be aware, he says, “that patients with Lyme disease may be co-infected with anaplasmosis and babesiosis, which will affect symptoms, treatments, and possibly outcomes. The good news is that by regulating these small mammals, we can reduce our risk of exposure to all three illnesses.”
4) How are predators like foxes protecting us against diseases such as Lyme?
Some predators appear to be protecting our health by regulating small mammals, Ostfeld says. Research suggests that where red foxes are abundant, there is a lower incidence of Lyme disease in the human population.
“We’re investigating whether foxes and other predators reduce our risk by preying on the small mammals responsible for transmitting Lyme disease to ticks,” says Ostfeld. “We don’t yet know whether predators like owls and hawks behave similarly.”
5) How is climate change influencing the spread of tick-borne illnesses?
The northward and westward spread of blacklegged ticks and Lyme disease in recent decades is caused in part by climate warming, says Ostfeld. However, Lyme disease has also been spreading south, which is unlikely to be caused by climate change, scientists believe.
Models predict that Lyme disease will continue to move to higher latitudes and elevations over coming decades, a result of milder winters and longer growing seasons. “We’re currently exploring how climate warming affects the seasonal timing of host-seeking and biting behavior of ticks,” says Ostfeld.
6) Why are we more likely to contract Lyme disease in fragmented forests?
“When humans fragment forests, often through urbanization, we create conditions that favor the small mammals that amplify Lyme disease,” Ostfeld says.
The species consistently found in forest sites, no matter how small or isolated, is the white-footed mouse. And Lyme-infected ticks are often most abundant in the smallest forest patches, leading to a high risk of human exposure.
“To combat Lyme disease, one of the fastest growing threats to human health in the U.S., we need to know where it is, how it’s transmitted, and how it can be controlled,” says Scheiner.
“Long-term studies, such as work by Ostfeld and colleagues, show that the abundance of the disease-causing bacteria is determined by the number and variety of small mammals in a community. The research also demonstrates the value of conserving biodiversity as a way of limiting the spread of disease.”
7) Aren’t mice affected by ticks?
Long-term monitoring of mice and ticks in upstate New York shows that mice survive just as well when they’re infested with hundreds of ticks as when they have few or no ticks. In fact, male mice survive longer when they have more ticks, Ostfeld says.
“This is bad news, as it means that heavy tick loads won’t bring down mouse numbers, which would have helped reduce the human risk of tick-borne diseases.”
8) Why are ecological studies essential to understanding emerging infectious diseases?
Tick-borne disease takes a huge toll on public health and on the economy, says Ostfeld. “Take the case of Lyme disease, where diagnosis and treatment remain controversial. One thing that everyone can agree on is the importance of preventing exposure. Doing this requires understanding the ecology of ticks, pathogens and hosts.”
The more we know about where and when the risk is high, he says, the better we’ll be able to protect ourselves and respond appropriately when we’re exposed.
9) What precautions might be wise for people wishing to spend time outside?
“I’d recommend the use of tick repellents on skin or clothes, paying special attention to shoes and socks,” Ostfeld says. “Tick nymphs seek hosts on or just above the ground, so shoes and socks are the first line of defense.” Some studies show that daily tick checks during late spring and early summer can be protective.
Knowing the early symptoms of Lyme disease – fever, chills, muscle aches, often a large rash – is important. “People who live in the heaviest Lyme disease zones of the Northeast, Mid-Atlantic, and Upper Midwest,” says Ostfeld, “and who start feeling flu-like symptoms, especially from May through July, should ask their doctors to consider Lyme disease.”
10) Does this mean that we should stay inside so we don’t risk becoming infected?
The likelihood of contracting Lyme disease is very low overall, says Scheiner, “and is even lower if you take reasonable precautions. Don’t let the threat of Lyme disease keep you from enjoying the best part of spring and summer: the great outdoors.”

High Tech Telecom Satellite Destroyed After Russian Rocket Launch Fails

Brett Smith for redOrbit.com – Your Universe Online
A Russian telecommunications satellite crashed to the ground on Friday as a result of a launch failure by the country’s Proton-M rocket.
Launched in the early morning hours from the Baikonur Cosmodrome in Kazakhstan, the rocket experienced an “emergency situation” that kept it from going into orbit, Russian space agency Roscosmos said.
Russian media reported that an issue with the rocket occurred about 94 miles above the Earth, and that the rocket and its cargo ultimately crashed across parts of Siberia and the Pacific Ocean. There were no casualties or significant damage reported. Russian media did report that the Express-AM4R satellite was made using cutting-edge technology and was supposed to provide cheap internet access to Russians in remote parts of the country.
The incident is the first involving a Proton-M rocket since July 2013, when three other satellites were lost due to engine failure. The rocket was carrying over 600 tons of fuel and the Kazakh government calculated the environmental damage from the crash at about $70 million.
The incident damaged relations between Russia and Kazakhstan, a close political and economic ally. Kazakhstan had enacted a short-term moratorium on all Proton launches from its territory following last year’s incident.
The situation comes after Russian and US officials publicly clashed over the future of the International Space Station, a symbol of post-Cold War collaboration, and rocket engines that Russia supplies to NASA. The clash is fueled by the on-going upheaval in Ukraine.
On Tuesday, Deputy Prime Minister Dmitry Rogozin said Russia would not agree to continue operating the International Space Station (ISS) beyond 2020. The Russian announcement, which came from a politician not a space agency official, came one day after NASA announced it would stop all space-related, non-ISS efforts with Russia due to the current political dispute.
“We are very concerned about continuing to develop high-tech projects with such an unreliable partner as the United States, which politicizes everything,” Rogozin said during a press conference.
Russia said it will block US access to Russian-made rocket engines for launching military satellites, and will suspend operations of GPS satellite navigation system sites on its territory starting in June.
In addition to suspending all non-ISS collaboration with Russia, the US has said it intends to refuse export licenses for high-technology items typically used by the Russian military.
“These sanctions are out of place and inappropriate,” Rogozin said. “We have enough of our own problems.”
The Russian official added that the NK-33 and RD-180 engines, which Russia supplies to the United States, are ready for delivery – but said they would not be released, keeping the US from potentially launching military satellites. The RD-180 engines are used in the satellite-launching Atlas 5 rockets – manufactured by the United Launch Alliance, a Lockheed Martin-Boeing collaboration.
The ULA said it hopes any dispute between the two nations can be resolved quickly, but said it has a two-year supply of engines that can be used until the conflict is resolved or an alternative is found.
“ULA and our Department of Defense customers have always prepared contingency plans in the event of a supply disruption,” ULA spokeswoman Jessica Rye said in a statement, according to Reuters.

SapC-DOPS Technology May Help With Imaging Brain Tumors

UC Health News

Just because you can’t see something doesn’t mean it’s not there.

Brain tumors are an extremely serious example of this. Not only are they difficult to treat—both adult and pediatric patients have a five-year survival rate of only 30 percent—but they have also been difficult to image, even though imaging could provide important information for deciding next steps in the treatment process.

However, Cincinnati Cancer Center research studies published in an April online issue of the Journal of Magnetic Resonance Imaging and a May issue of the Journal of Visualized Experiments (JoVE), an online peer-reviewed scientific journal that publishes experimental methods in video format, reveal possible new ways to image glioblastoma multiforme tumors—a form of brain tumor—using the SapC-DOPS technology.

A lysosomal protein saposin C (SapC), and a phospholipid, known as dioleoylphosphatidylserine (DOPS), can be combined and assembled into tiny cavities, or nanovesicles, to target and kill many forms of cancer cells.

Lysosomes are membrane-enclosed organelles that contain enzymes capable of breaking down all types of biological components; phospholipids are major components of all cell membranes, assembling into the lipid bilayers that form them.

Xiaoyang Qi, PhD, member of the CCC, associate professor in the division of hematology oncology at the UC College of Medicine, a member of the UC Cancer and Neuroscience Institutes and the Brain Tumor Center, says his lab and collaborators have previously found that the combination of two natural cellular components, called SapC-DOPS, caused cell death in cancer cell types, including brain, lung, skin, prostate, blood and breast cancer, while sparing normal cells and tissues.

“We used this knowledge to gain assistance from our collaborators Kati LaSance, Vontz Core Imaging Lab (VCIL) director, and Patrick Winter, PhD, in the Imaging Research Center (IRC) at Cincinnati Children’s Hospital Medical Center. We used SapC-DOPS as a transport vesicle to deliver bio-fluorescence agents and gadolinium-labeled contrast agents directly to brain tumors which provided visualization using optical imaging and MRI,” Qi says.

“There are two things lacking when it comes to brain tumors: getting a good picture of them and treating them effectively,” says LaSance. “With this discovery, there are possibilities to improve both. With good visualization of the tumor, physicians might one day be able to better determine which form of treatment—chemotherapy, radiation or surgery—would be best for a patient and can image a tumor at its smallest stages with hopes of intervening much earlier.”

Qi says this is preclinical research, as the studies were done using animal models that were injected with the SapC-DOPS vesicle assembled with illuminating agents, but is translational in nature and could be tested soon in human populations.

“While optical imaging is not applicable to a patient population, both MRI and PET imaging are,” he says. “The bio-fluorescent molecule used in the JoVE study can be substituted for a PET molecule and fortunately, PET imaging is widely used by doctors and hospitals in current cancer patients.

“This research has the potential to make a large impact in treatment of brain tumors, and most importantly, it would not have been possible without support and collaboration from the VCIL and the IRC.”

These studies were funded by the Mayfield Education and Research Foundation, a New Drug State Key Project (009ZX09102-205) and the National Institutes of Health/National Cancer Institute (1R01CA158372-01). Researchers cite no conflicts of interest.

Comcast Could Impose Data Caps On Its Customers

Peter Suciu for redOrbit.com – Your Universe Online

Those who use massive amounts of data will simply have to pay more. That is in essence what Comcast told its customers on Thursday. The nation’s largest Internet service provider (ISP) and cable giant also responded to the Federal Communications Commission’s (FCC) net neutrality proposal – and the two events could be closely tied together.

Comcast is currently seeking the FCC’s approval for its $45-billion buyout of Time Warner Cable, a move that would make the Philadelphia-based company the most dominant provider in the United States with more than 30 million cable TV and high-speed Internet customers across the country – including top markets such as New York, Los Angeles and Chicago.

“Comcast remains committed to a free and open Internet and [is] working with the FCC on appropriate rules for all players across the industry,” Comcast Executive Vice President David L. Cohen wrote Thursday in a blog post.

The other takeaway is that Comcast’s Cohen, speaking at the Moffett Nathanson Media & Communications Summit on Wednesday in New York, suggested that within five years all Comcast customers could once again have monthly bandwidth caps imposed on home broadband usage.

Cohen, however, suggested that most users would not exceed the allotment and have to pay more.

“I would also predict that the vast majority of our customers would never be caught in the buying the additional buckets of usage, that we will always want to say the basic level of usage at a sufficiently high level that the vast majority of our customers are not implicated by the usage-based billing plan,” Cohen said in a statement as reported by Digital Trends. “And that number may be 350 — that may be 350 gig a month today, it might be 500 gig a month in five years, but it will never — I don’t think we will want to be in a model where it is fully variablized and 80% of our customers are implicated by usage-based billing and are all buying different packets of usage.”

According to reports, Comcast is currently running several pilot projects in select markets in the United States to test its bandwidth caps. These include options that allow users to combine download speeds with bandwidth caps, and the higher the speed the higher the bandwidth cap.

In another test case, one that Comcast executives prefer, all users would have 300GB per month of data – and the company would charge $10 for every extra 50GB used after that. These test cases are not actually new, as PC World reports the cap pilot projects have been ongoing since 2012.
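As a rough illustration of how the arithmetic of the preferred model works, here is a minimal sketch in Python. The cap, block size, price, and the rule that partial blocks are billed in full are assumptions for illustration, not confirmed billing details:

```python
import math

def overage_charge(usage_gb, cap_gb=300, block_gb=50, block_price=10):
    """Hypothetical overage bill under the reported pilot model: usage up
    to the cap is included; each additional 50GB block costs $10. The
    assumption that a partly used block is billed in full is illustrative."""
    excess = max(0, usage_gb - cap_gb)
    blocks = math.ceil(excess / block_gb)
    return blocks * block_price

print(overage_charge(280))  # under the cap -> 0
print(overage_charge(340))  # 40GB over -> one block -> 10
print(overage_charge(512))  # 212GB over -> five blocks -> 50
```

Under these assumptions, a user at the FCC-reported typical 40GB per month would never see an overage line item at all.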

Comcast didn’t provide an option for ultra high-usage customers; instead, those who went over their limits received warnings. After too many warnings, users faced having their accounts suspended for a year. The Internet and cable provider also experimented with bandwidth throttling for heavy users in 2009, PC World reported.

This move could be most worrisome to those so-called “cord cutters” – users who opt to ditch cable TV for streaming services such as Netflix and Hulu.

Forbes contributor Amadou Diallo, a self-proclaimed cord cutter, did note that the vast majority of cable customers fall far below a 300GB threshold, and that an FCC study conducted in 2012 found a typical consumption rate of around 40GB per month.

However, those cutting the cord may be faced with paying more to watch video content via streaming services. Diallo noted, “The upcoming onslaught of 4K streaming options will only increase data usage.”

According to a new Sandvine study, cord cutters are in fact using more Internet data than other users. The study found that in North America, subscribers who exhibit cord-cutting behavior dominate network usage – consuming on average 212GB a month, more than seven times the 29GB a typical subscriber uses. That is equal to about 100 hours of video each month.
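The Sandvine figures can be sanity-checked with a little arithmetic, using only the numbers reported above; the implied GB-per-hour figure is a back-of-the-envelope estimate, not a Sandvine statistic:

```python
cord_cutter_gb = 212   # reported average monthly usage for cord cutters
typical_gb = 29        # reported average for a typical subscriber
video_hours = 100      # reported hours of video per month

ratio = cord_cutter_gb / typical_gb          # "more than seven times"
gb_per_hour = cord_cutter_gb / video_hours   # implied streaming data rate

print(f"cord cutters use {ratio:.1f}x the data of typical subscribers")
print(f"~{gb_per_hour:.1f} GB per hour of video")
```

The ratio works out to roughly 7.3, consistent with the “more than seven times” claim, and the implied rate of about 2 GB per hour is in the range of standard-definition-to-HD streaming.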

The Sandvine study also found that streaming video content now accounts for the majority (54 percent) of total monthly network traffic.

Monitoring The State Of Global Rainfall And Drought

April Flowers for redOrbit.com – Your Universe Online

Despite the fact that it has become a widely accepted practice to use modern weather satellites to monitor terrestrial rainfall, establishing a reliable context for relating space-based rainfall observations to current and historical ground-based rainfall data has been difficult.

A team of researchers from UC Santa Barbara and the United States Geological Survey (USGS) has developed a new dataset that can be used for environmental monitoring and drought early warning. The dataset is called CHIRPS, or Climate Hazards Group Infrared Precipitation with Stations, and it is the result of a collaboration between UCSB’s Climate Hazards Group and USGS’s Earth Resources Observation and Science (EROS). CHIRPS combines space-observed rainfall data with more than three decades of ground station data collected worldwide.

“This dataset seeks to blend the best qualities of rainfall station observations, satellite temperature data and rainfall’s unique spatial characteristics to create the best available rainfall information for climate and agricultural monitoring,” said Gregory J. Husak, an assistant researcher with the Climate Hazards Group in UCSB’s Department of Geography, in a recent statement.

Experts who specialize in early warning for drought and famine will be able to use CHIRPS to monitor rainfall in near real-time, at high resolution, over most of the planet. The data can be imported into existing climate models, where combining it with other environmental and meteorological data will allow scientists to project future agricultural and vegetation conditions.

The research team designed CHIRPS specifically for drought monitoring, and it is already in use identifying hot spots of food insecurity. For example, because of a series of poor rainy seasons in 2008, 2009, 2011 and 2012, much of East Africa is still suffering because of high food prices. This is especially true in South Sudan where over one million refugees have been displaced by a civil war.

Near Lake Victoria, Kenya, is the Rift Valley, which is full of highly productive farms that make up the bulk of Kenya’s food supply. Typically, millions of people are fed from the high crop yields brought about by abundant spring rains. CHIRPS has identified a very poor start to the growing season in the Rift Valley this year. Using the long historical record set incorporated as part of CHIRPS, researchers know that this April’s rainfall (approximately 2 inches) was the lowest in 34 years. This information has already been transmitted to the Famine Early Warning Systems Network of the U.S. Agency for International Development (USAID), prompting on-the-ground assessments of potential crop failure.
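The kind of record-low check that flagged the Rift Valley season is straightforward once a long rainfall record like CHIRPS exists. A minimal sketch, using made-up April totals rather than real CHIRPS data:

```python
# Hypothetical April rainfall totals (inches) for one location, one value
# per year over a multi-decade record; the final value is the latest April.
april_rainfall = [5.1, 4.8, 6.2, 3.9, 5.5, 4.4, 7.0, 5.9, 4.1, 2.0]

latest = april_rainfall[-1]
history = april_rainfall[:-1]

is_record_low = latest < min(history)            # lowest in the record?
rank = sorted(april_rainfall).index(latest) + 1  # 1 = driest on record

print(f"record low: {is_record_low}, rank {rank} of {len(april_rainfall)}")
```

A real drought-monitoring pipeline would run this kind of comparison per grid cell over the full 34-year record, but the logic is the same: the long historical baseline is what makes the current season’s anomaly meaningful.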

CHIRPS is also being used to investigate recent trends in rainfall, such as the long-term decline across both the southwestern United States and the easternmost part of East Africa. UCSB and USGS scientists are working with USAID to create development and adaptation strategies in Africa.

“Our most recent research suggests that these declines are likely linked to warming in the western Pacific and eastern Indian oceans,” said Chris Funk, a research scientist with the USGS’ EROS.

The development team would like to see the CHIRPS dataset used to guide climate-smart development, and to help water managers, hydroelectric power companies, natural resource managers and disaster relief agencies prepare for a changing climate.

CHIRPS is being hosted by both the UCSB Climate Hazard Group and USGS’ EROS, and is available to the public.

Magnets And Kids Are A Dangerous Duo

Elsevier Health Sciences
Magnet ingestions by children have received increasing attention over the past 10 years. With the growing availability of new and stronger neodymium-iron-boron magnets being sold as “toys,” there has been an increase in cases of ingestion, resulting in serious injury and, in some cases, death. In a new study scheduled for publication in The Journal of Pediatrics, researchers studied the trends of magnet ingestions at The Hospital for Sick Children (SickKids), Canada’s largest children’s hospital.
Matt Strickland, MD, and colleagues reviewed all cases of foreign body ingestion seen in the emergency department from April 1, 2002 through December 31, 2012. Inclusion criteria included being less than 18 years of age, with suspected or confirmed magnetic ingestion. According to Dr. Strickland, “We chose to limit our scope to the alimentary tract because the majority of serious harm from magnets arises from perforations and fistulae of the stomach, small bowel, and colon.” To reflect the introduction of small, spherical magnet sets in 2009, the study was divided into two time periods, visits during 2002-2009 and those during 2010-2012.
Of 2,722 patient visits for foreign body ingestions, 94 children met the inclusion criteria. Of those, 30 children had confirmed ingestion of multiple magnets. Overall, magnet ingestions tripled from 2002-2009 to 2010-2012; the incidence of injuries involving multiple magnets increased almost 10-fold between the two time periods. Six cases required surgery for sepsis or the potential for imminent bowel perforation, all of which occurred in 2010-2012. The average size of the magnets also decreased approximately 70% between 2002-2009 and 2010-2012.
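One detail worth keeping in mind when reading those comparisons is that the two study periods differ in length (roughly 7.75 years versus 3 years), so raw case counts understate the change in the annual rate. A small sketch with hypothetical case counts, since the paper’s period-by-period totals are not given above:

```python
# Illustrative counts only -- the study's actual per-period totals
# aren't quoted in this article.
periods = [
    ("2002-2009", 7.75, 30),  # label, duration in years, cases
    ("2010-2012", 3.0, 64),
]

rates = {label: cases / years for label, years, cases in periods}
for label, rate in rates.items():
    print(f"{label}: {rate:.1f} cases/year")
```

With these made-up numbers the raw counts only roughly double, but the annual rate rises more than five-fold, which is why per-year incidence is the fairer comparison.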
This study shows a significant increase in the rate of multiple magnet-related injuries between 2002 and 2012. “More concerning, however,” notes Dr. Strickland, “is the increased number of high-risk injuries featuring multiple, smaller magnets.” Despite new magnet-specific toy standards, labeling requirements, product recalls, and safety advisories issued in the past 10 years, continuing efforts should focus on educating parents and children on the dangers inherent in magnetic “toys.”

Pew Study Predicts ‘Internet Of Things’ Will Be Ever-Present By 2025

Peter Suciu for redOrbit.com – Your Universe Online
It has become the new buzz phrase: “The Internet of Things.” Chances are you’ll hear it a lot more – or it could become such a part of everyday life that it will simply be the new normal.
According to research compiled by the Pew Research Center Internet Project, which commissioned multiple studies to mark the 25th anniversary of the creation of the World Wide Web by Sir Tim Berners-Lee, things are going to get a whole lot more connected.
Pew Research, along with Elon University Imagining the Internet Center, canvassed more than 1,600 experts about the Internet of Things to determine where this might be headed by 2025.
A February 2014 report from the Pew Internet Project, which was the first part of a sustained effort to mark the 25th anniversary of the Web, examined the strikingly fast adoption of the Internet and focused on the generally positive attitudes users have had about its role in their social environment. The overall verdict of the study found that the Internet has been a “plus” for society but more importantly it has been very good for individual users.
The March 2014 Digital Life in 2025 report, which was conducted in association with Elon University, pondered the future of the Internet in the next decade. The general opinion of the experts and stakeholders was that the Internet will become so deeply embedded in our environment that it will play a role like the one electricity plays in our lives today – less visible, but increasingly important in people’s daily lives.
“The (1,600 tech) experts (canvassed) say the next digital revolution is the often-invisible spread of the Internet of Things,” Janna Anderson, director of the Internet Center and co-author of the report, told USA Today.
“They expect positive change in health, transportation, shopping, industrial production and the environment,” she added. “But they also warn about the privacy implications of this new data-saturated world and the complexities involved in making networked devices work together.”
According to the survey respondents, the Internet of Things could become all the more evident in a plethora of ways. One notable sector that is already growing is wearable technology. The respondents suggested that the Internet of Things won’t just be something we use, but could effectively become part of our bodies through a variety of health- and fitness-related trackers.
The respondents added that technology will increasingly be used to connect our homes — with remote controlling of climate, security and maintenance — and our communities, aiding in commutes and controlling infrastructure.
The Internet of Things could also be used to monitor supply chains, allowing for more efficient manufacturing and distribution of goods, while providing real-time updates about the environment as a whole.
“Here are the easy facts: In 2008, the number of Internet-connected devices first outnumbered the human population, and they have been growing far faster than have we,” said expert respondent Patrick Tucker, author of The Naked Future: What Happens In a World That Anticipates Your Every Move? “There were 13 billion Internet-connected devices in 2013, according to Cisco, and there will be 50 billion in 2020. These will include phones, chips, sensors, implants, and devices of which we have not yet conceived.”
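Tucker’s device counts imply a steep but easily computed growth rate. A quick sketch of the compound annual growth implied by 13 billion devices in 2013 reaching 50 billion by 2020:

```python
devices_2013 = 13e9   # Cisco figure cited above
devices_2020 = 50e9   # projected figure cited above
years = 2020 - 2013

# Compound annual growth rate implied by the two endpoints
cagr = (devices_2020 / devices_2013) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")
```

That works out to a little over 21 percent per year – the number of connected devices roughly doubling every three and a half years.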
In other words, this could truly become the new normal. To make sense of it all, the Internet of Things could also finally get a formal definition, which The Guardian newspaper reported would be: “A global, immersive, invisible, ambient networked computing environment built through the continued proliferation of smart sensors, cameras, software, databases, and massive data centers in a world-spanning information fabric known as the Internet of Things.”

A Fetal Enzyme Helps Stem Cells Recover From Limb Injuries

April Flowers for redOrbit.com – Your Universe Online
Nearly two million Americans a year suffer ischemia reperfusion injuries. These injuries, which result from restricted blood flow, can arise in a wide variety of scenarios—from traumatic limb injuries, to heart attacks, to donor organs. Restoring the blood flow to an injured leg, for example, seems like it would be a good idea. A new study from Georgia Regents University, however, suggests that restoring the flow could cause additional damage that actually hinders recovery.
Rather than promoting recovery, restoring blood flow actually heightens inflammation and cell death for many of these patients.
“Think about trying to hold onto a nuclear power plant after you unplug the electricity and cannot pump water to cool it down,” said Dr. Jack Yu, Chief of MCG’s Section of Plastic and Reconstructive Surgery. “All kinds of bad things start happening.”
Yu collaborated with Dr. Babak Baban, immunologist at the Medical College of Georgia and College of Dental Medicine at Georgia Regents University. Their study, published in PLOS ONE, reveals that one way stem cell therapy appears to intervene is with the help of an enzyme also used by a fetus to escape rejection by the mother’s immune system.
Baban notes that previous studies have found a correlation between stem cells and recovery. The stem cells both enable new blood vessel growth and turn down the now-severe inflammation. The new findings reveal that indoleamine 2,3-dioxygenase, or IDO, widely known to dampen the immune response and create tolerance, plays an important role in regulating inflammation in that scenario. IDO is expressed by stem cells and numerous other cell types.
When the therapy was tested in animal models comparing normal mice with mice missing IDO, stem cell efficiency was boosted by approximately one-third. Decreases in inflammatory marker expression, swelling and cell death were all observed. These are all associated with shorter, improved recoveries.
“We don’t want to turn off the immune system, we want to turn it back to normal,” Baban said.
Even a brief period of inadequate blood flow, and the resulting lack of nutrients, can start problems that result in the rapid accumulation of destructive acidic metabolites, free radicals, and damage to cell structures. Mitochondria, which are the cells’ power plants, should be producing the energy source ATP. Instead, they quickly become fat, leaky and dysfunctional in this situation.
“The mitochondria are sick; they are very, very sick,” Yu said. Enormous additional stress is added to these sick powerhouses when blood flow is restored.
“They start to leak things that should not be outside the mitochondria,” he said. Without an adequate energy source, and with leaky powerhouses, cells quickly give up. Inflammation is part of the normal healing process and escalates to help clean up the injury. In this case, however, it makes the problem worse.
The researchers focused their studies on limb injuries, but their findings should translate to any number of scenarios resulting in reperfusion injury. Yu notes that he has seen free flaps, where tissue is moved from one area of the body to another to cover a burn or rebuild a breast, start dying from ischemia reperfusion injury in just two hours.
“It cuts across many, many individual disease conditions,” Yu said. To avoid reperfusion trauma, transplant centers are experimenting with pulsing donor organs.
The team plans to continue their studies by seeing if “more is better”—in other words, giving more stem cells, as well as IDO enhancing drugs. This will include incubating stem cells with IDO. During this study, only one dose was given.
Stem cell therapy isn’t currently in use for aiding in the recovery of limb injuries, but it is being clinically studied to aid stroke and heart attack recovery at MCG and other centers. The best treatment currently available for reperfusion injuries is to restore blood flow and give broad-spectrum antibiotics.
“Sometimes it works, sometimes it doesn’t,” Yu said. “That is why this kind of pharmacologic intervention could be very, very important.”

New Species Of Sea Bass Identified Based On Larval, Adult Specimen

Brett Smith for redOrbit.com – Your Universe Online

Thanks to a lot of hard work and a little luck – two scientists from the National Museum of Natural History at the Smithsonian Institution have identified a mysterious larval fish and the same fish in its adult stage as a new species of sea bass.

Most fish that live in the ocean have a pelagic larval stage that floats in surface or near-surface currents, an ecosystem very distinct from the one they occupy as adults. Two distinct environments often call for two distinct physiques and appearances to maximize the odds of survival, leading to larvae that appear very different from the adults of the same species.

The newly identified fish, described in a new report published on Tuesday in the journal PLOS ONE, first came to the attention of researchers via a photograph in a previous study. It was identified as a member of the sea bass family Serranidae, but its seven very elongated dorsal-fin spines made it a highly distinctive-looking specimen.

“This feature isn’t known in any Atlantic sea bass larvae, but it is similar to one species of Indo-Pacific sea bass,” said study author David Johnson, a zoologist at the Smithsonian museum. “We initially thought the larva must have been caught in the Indo-Pacific Ocean, but we were wrong.”

However, the fish larva in the photo had in fact been captured in the Florida Straits — the body of water located between Florida and Cuba.

To properly identify the fish, the study team obtained the mysterious larva for further review. They found that the DNA from the specimen did not match up with any recognized fish in their database. Only then did the researchers begin considering the larva as a new species in spite of not having an adult specimen.

Meanwhile, Smithsonian scientists investigating deep-reef fish off Curacao in the southern Caribbean gathered several “golden bass,” which the team recognized as Liopropoma aberrans. However, genetic analyses revealed more than one species had been collected. Incorporating this new genetic data with available DNA barcoding information for all known western Atlantic sea bass produced an unforeseen discovery: The larva from the Florida Straits is the same novel species of Liopropoma. The sea bass was ultimately named Liopropoma olneyi, after a deceased colleague, John E. Olney.

“This was one of those cases where all the stars were properly aligned,” said study author Carole Baldwin, a zoologist at Smithsonian’s National Museum of Natural History. “We discover a new species of sea bass on Curacao deep reefs that just happens to be the missing adult stage of a larval fish from Florida, which we only knew existed because it was included as ‘decoration’ in a scientific publication. What a great little fish story!”

She added that the reefs where the adult fish live are remote and underexplored ecosystems.

“You can’t access them using traditional SCUBA gear, and if you’re paying a lot of money for a deep-diving submersible that goes to Titanic depths, you’re not stopping at 300 or 800 feet to look for fishes,” Baldwin said. “Science has largely missed the deep-reef zone, and it appears to be home to a lot of life that we didn’t know about.”

Image 2 (below): An adult of the new species of sea bass, Liopropoma olneyi, recently discovered in the deep reefs of Curacao. Once discovered, it simultaneously solved the identification mystery of a fish larva found in the Florida Straits. Credit: Barry Brown, Substation Curacao

Species Biodiversity Changing, Not Declining From Global Warming

Brett Smith for redOrbit.com – Your Universe Online

While some studies have come to the conclusion that global warming is threatening to cause extinctions and lower the biodiversity of ecosystems, a new report published in the journal Science has found that a changing climate isn’t lowering biodiversity – but it is creating new mixes of species within various ecosystems.

The new report reviewed 100 long-term monitoring studies conducted around the world – in both marine environments and on land. The researchers found that the number of species in many of these locations has not changed significantly, or has actually risen.

Instead of finding a decrease in biodiversity as some might have expected, the study team found an increase in species richness in 59 out of 100 studies and a decrease in 41. The rate of change in all of these studies was modest, the researchers said.

The study team did find one major change: shifts in the species living in the locations being studied. Nearly 80 percent of the communities they analyzed exhibited substantial shifts in species composition, averaging around 10 percent change per decade, much greater than the rate of change predicted by models.
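The distinction the study draws, between species richness (a count) and species composition (which species are present), can be illustrated with a toy calculation on hypothetical survey data:

```python
def richness(survey: set) -> int:
    """Species richness: simply the number of species recorded."""
    return len(survey)

def jaccard_turnover(a: set, b: set) -> float:
    """Fraction of pooled species NOT shared between two surveys
    (1 minus the Jaccard similarity)."""
    return 1.0 - len(a & b) / len(a | b)

# Hypothetical surveys of the same site, a decade apart
decade_1 = {"anchovy", "sardine", "mackerel", "herring"}
decade_2 = {"anchovy", "jellyfish", "mackerel", "squid"}

print(richness(decade_1), richness(decade_2))  # 4 4  (richness unchanged)
print(jaccard_turnover(decade_1, decade_2))    # 0.666... (two-thirds of pooled species turned over)
```

As in the study’s findings, the species count can stay flat while the community itself is substantially reshuffled, which is exactly why the authors argue that richness alone is a poor scorecard.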

Study author Nick Gotelli, a biology professor at the University of Vermont, said an enormous turnover of species in environments around the globe is taking place, creating novel biological communities.

“Right under our noses, in the same place that a team might have looked a decade earlier, or even just a year earlier, a new assemblage of plants and animals may be taking hold,” he said.

Gotelli added that prevention strategies are currently focused on preserving biodiversity – working under the theory that biodiversity will decrease as average temperatures rise as they are expected to do.

“A main policy application of this work is that we’re going to need to focus as much on the identity of species as on the number of species,” he said. “The number of species in a place may not be our best scorecard for environmental change.”

In their report, the scientists cited the example of disturbed coral reefs in some areas being replaced by communities dominated by algae. While biodiversity in this situation might remain the same, the overall impact on the ecosystem is significant – particularly for fisheries and tourism that rely on healthy coral reefs.

“In the oceans we no longer have many anchovies, but we seem to have an awful lot of jellyfish,” Gotelli said. “Those kinds of changes are not going to be seen by just counting the number of species that are present.”

The study team noted that their finding is similar to what science writer David Quammen described as our “planet of weeds” – in which invasive species or successful colonists spread into new places. This phenomenon keeps the local species count up even as overall biodiversity at the global level is degraded.

“We move species around,” Gotelli says. “There is a huge ant diversity in Florida, and about 30 percent of the ant species are non-natives. They have been accidentally introduced, mostly from the Old World tropics, and they are now a part of the local assemblage. So you can have increased diversity in local communities because of global homogenization.”

In their report, the researchers concluded there “is need to expand the focus of research and planning from biodiversity loss to biodiversity change.”

Image 2 (below): With survey data from every continent and climate type, a new study found species compositions changing — but not systematic losses of biodiversity — around the globe. Each dot represents a site included in the analysis. Credit: Courtesy of Science Magazine

New Cellular Study Indicates Schizophrenia Begins In The Womb

Brett Smith for redOrbit.com – Your Universe Online

New research from a large team of American scientists has revealed evidence that schizophrenia may originate during fetal development in the womb.

For the study, published in the journal Molecular Psychiatry, researchers analyzed neurons grown in the laboratory from the skin cells of individuals with schizophrenia. The study team noted in its report that these neurons behaved strangely and a similar type of neuronal development in the womb could be a root cause for schizophrenia.

“This study aims to investigate the earliest detectable changes in the brain that lead to schizophrenia,” said study author Fred H. Gage, a professor of genetics at the Salk Institute for Biological Studies in La Jolla, California. “We were surprised at how early in the developmental process that defects in neural function could be detected.”

Scientists currently do not know much about the underlying causes of schizophrenia, including which cells in the brain are affected and how. Previous research could only analyze schizophrenia by evaluating the brains of individuals with the condition after death. By then, the effects of age, stress, treatment or substance abuse had often altered the brains of these individuals – making it challenging to pinpoint the disease’s beginnings.

To skirt around these confounding effects, the study team took skin cells from people with schizophrenia, engineered the cells into a stem cell form, and then pushed them to become extremely early-stage neurons called neural progenitor cells (NPCs). These NPCs are very similar to the cells found in the brain of a developing fetus.

“We realized they weren’t mature neurons but only as old as neurons in the first trimester,” said study author Kristen Brennand, the first author of the paper and assistant professor at Icahn School of Medicine at Mount Sinai. “So we weren’t studying schizophrenia but the things that go wrong a long time before patients actually get sick.”

The scientists produced NPCs from the skin cells of four patients with schizophrenia and six individuals without the condition. They screened the cells in two kinds of assays: one in which they observed how far the cells moved and how they behaved on selected surfaces – and one in which they measured stress in the cells by imaging mitochondria, the small energy-generating organelles.

The results on both tests showed major discrepancies between cells taken from those with schizophrenia and those without the condition. Cells linked to schizophrenia exhibited abnormal activity in two main classes of proteins: those related to adhesion and connectivity, and those related to oxidative stress. The cells from individuals with schizophrenia also seemed to have both unusual migration – which may cause the development of poor connectivity – and heightened levels of oxidative stress.

The study team said their findings are in keeping with a predominant theory that states events happening during pregnancy can give rise to schizophrenia, despite the fact that the disease doesn’t reveal itself until early adulthood. Past analyses indicate that mothers who suffer from an infection, poor nutrition or extreme stress during pregnancy are at a greater risk of having children who will develop schizophrenia. Scientists suspect both genetic and environmental factors likely play a role in the development of the condition.

“The study hints that there may be opportunities to create diagnostic tests for schizophrenia at an early stage,” Gage said.

Follow That Fish – Understanding Alcohol And Social Behavior

April Flowers for redOrbit.com – Your Universe Online

Alcohol consumption affects social behavior, as any college student can tell you. But does it have to have such an effect? A new study from New York University’s (NYU) Polytechnic School of Engineering suggests that the complex interplay between alcohol and social behavior can be understood and the negative effects perhaps mitigated. Their groundbreaking experiments, described in the journal Alcohol: Clinical and Experimental Research, center on an unlikely test subject: the zebrafish.

Maurizio Porfiri, associate professor of mechanical and aerospace engineering and director of the school’s Dynamical Systems Laboratory, led a team of scientists that included NYU School of Engineering Research Scholar Fabrizio Ladu and Postdoctoral Fellow Sachit Butail, along with Visiting Scientist Simone Macrì, Ph.D., of the Istituto Superiore di Sanità in Rome, Italy. Their experiment overturned the traditional paradigm for alcohol-related research, in which all subjects are exposed to alcohol and the behavior and movements of the group are then analyzed. The new method allows for detailed tracking of a single, alcohol-exposed zebrafish swimming amid a school of “sober” fish.

The working theory driving this experiment is that an individual’s response to alcohol would vary depending on the presence or absence of unexposed peers. The remarkable effect that the alcohol-exposed fish would have on its peers, however, was completely unexpected.

For the experiment, a single zebrafish was exposed to four concentrations of ethanol in water—ranging from zero to acute exposure, which causes no harm to the fish—then released into a group of untreated zebrafish. The team believes their efforts are the first to allow ethanol-exposed and untreated zebrafish to swim freely together. The researchers developed a custom algorithm that allowed them to track a single fish throughout the experiment, while at the same time analyzing group behavior.
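The paper’s custom tracking algorithm is not described here, but the core idea of following one individual through a group can be sketched with frame-to-frame nearest-neighbor linking of detected positions (all names and coordinates below are hypothetical, not the team’s actual code):

```python
import math

def track_focal_fish(frames, start_pos):
    """Follow one individual across video frames by linking each known
    position to the nearest detection in the next frame.
    frames: list of frames, each a list of (x, y) detections."""
    path = [start_pos]
    for detections in frames:
        prev = path[-1]
        nearest = min(detections, key=lambda p: math.dist(prev, p))
        path.append(nearest)
    return path

# Toy data: three frames with two fish each; the focal fish starts near (0, 0)
frames = [
    [(1.0, 0.5), (9.0, 9.0)],
    [(2.0, 1.0), (8.5, 9.2)],
    [(3.1, 1.4), (8.0, 9.5)],
]
print(track_focal_fish(frames, (0.0, 0.0)))
```

Real multi-animal trackers must also handle occlusions, missed detections, and identity swaps when fish cross paths; this linking step is only the kernel of such a system.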

Alcohol exposure has been shown to affect the locomotion of a zebrafish in previous studies. At low concentrations, the fish’s movements become faster. As the dose increases, the swimming motion slows down. The cohesion of the group is also negatively affected by alcohol exposure.

As expected, the single exposed zebrafish showed predictable locomotion changes when observed alone after exposure. When added to the unexposed group, however, the zebrafish’s behavior was remarkably different. The swimming speeds of fish exposed to intermediate or high concentrations nearly doubled, indicating that the presence of peers had a substantial impact on social behavior under the influence of alcohol.

Perhaps the most remarkable finding of the study was that the unexposed fish also modulated their behavior and swimming speeds differentially in the presence of a shoalmate exposed to different levels of alcohol.

“These results were very surprising,” explained Porfiri. “It is clear that the untreated fish were matching the swimming speed of the alcohol-exposed fish, and this correlation was especially strong at an intermediate level of alcohol exposure. At very high or low levels, the influence decreases.”

The swimming speed increase of the exposed individual might be explained as hyper-reactivity to an enriched environment—in other words, the tank full of unexposed peers. As with humans, alcohol has been shown to reduce inhibitory behavior in zebrafish—meaning that the faster swimming speed could also reflect a heightened interest in interacting with the school. Regardless of the underlying mechanism, this deviation from expected behavior is very significant, the authors say, because it highlights the ability of social stimuli, such as the untreated fish, to change an individual’s response to alcohol.

The unexpected ability of the alcohol-exposed zebrafish to influence the behavior of the school might be a form of leadership. The researchers caution, however, that aggressive, risk-taking behaviors, which are common under the disinhibitory effects of alcohol, might resemble normal leadership behavior in this species.

“This research integrates rigorous principles from engineering and biology to develop an experiment that sheds light on the complex interactions among individuals in a group, their individuality and their collective behavior,” said Massimo Ruzzene, NSF program director for the Division of Civil, Mechanical and Manufacturing Innovation. “Analysis of complex dynamical systems is an important area of research. An understanding of complex systems, ranging from smart robots that assist in surgery to manufacturing enterprises and emergency response teams, is integral to our ability to respond effectively to emerging challenges and opportunities.”

The new method of experimentation developed for this study holds a great deal of promise for future research on the social determinants of individual responses to alcohol, the researchers believe. By dissociating sociality from exposure, the team was able to study how social environments shape the impact of alcohol on behavior. Whether the degree of social influence exerted by exposed or unexposed individuals depends on group size will be the focus of the group’s future studies.

The findings will advance our understanding of how social interactions magnify or mitigate negative effects of alcohol abuse, and could possibly lead to new, improved therapies to reduce these effects.

What Would Happen If Saturn Came Extremely Close To Earth?

redOrbit Staff & Wire Reports – Your Universe Online

While astronomers enjoyed the best view of Saturn this past weekend, an animator created a new video depicting what would happen if the people of Earth got an extreme close-up of the sixth planet from the Sun.

According to Laurel Kornfeld of The Space Reporter, Saturn reached opposition on May 10, which means the Earth passed directly between the Sun and Saturn, placing the planet opposite the Sun in our sky. The ringed planet was also at its closest distance to our world – approximately 83 million miles away, making it appear much larger and far brighter than usual.

Taking inspiration from its opposition and close approach, Yeti Dynamics took images captured by the Voyager and Cassini missions to create a video demonstrating what Saturn would look like if it came as close to Earth as Mars is, Kornfeld added. While such a scenario is impossible, as Saturn maintains a stable, nearly circular orbit far beyond Mars, the video shows that the planet would appear to be as bright in our night sky as the Moon, despite being roughly 150 times farther away.

In addition, Slate.com’s Phil Plait noted that the planet’s disk would be approximately one-fourth of the Moon’s size, and the rings would stretch to be approximately two-thirds as large as our planet’s natural satellite. However, while the video is interesting, it isn’t exactly 100 percent scientifically accurate for reasons other than Saturn’s orbit.
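Plait’s size comparisons follow from small-angle arithmetic. A quick check, using approximate figures for the Moon, Saturn’s disk and rings, and a close Earth-Mars distance, gives ratios in the same ballpark as his (the exact values depend on the assumed ring span and distance):

```python
import math

MOON_DIAM_KM = 3474.0
MOON_DIST_KM = 384_400.0
SATURN_DIAM_KM = 120_500.0    # equatorial diameter, without rings
SATURN_RINGS_KM = 270_000.0   # approximate span of the bright rings
MARS_CLOSE_KM = 56_000_000.0  # Mars at a close approach to Earth

def angular_size_arcmin(diameter_km, distance_km):
    """Small-angle approximation: apparent size in arcminutes."""
    return math.degrees(diameter_km / distance_km) * 60.0

moon = angular_size_arcmin(MOON_DIAM_KM, MOON_DIST_KM)
disk = angular_size_arcmin(SATURN_DIAM_KM, MARS_CLOSE_KM)
rings = angular_size_arcmin(SATURN_RINGS_KM, MARS_CLOSE_KM)
print(round(disk / moon, 2), round(rings / moon, 2))  # 0.24 0.53
```

With these assumed values, Saturn’s disk comes out at about a quarter of the Moon’s apparent diameter and the rings at roughly half; a wider assumed ring span pushes the latter toward Plait’s two-thirds figure.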

For purposes of the video, Yeti Dynamics had to ignore the immense gravitational pull created by a planet with a mass nearly 100 times that of Earth, Plait said. As Saturn approached, its gravity would throw the Moon out of Earth’s orbit, most likely sending it into a highly elliptical orbit around the Sun that would eventually recross the Earth’s orbit, resulting in “a potentially very bad future scenario,” he added.

But that isn’t the only problem, as Saturn’s rings would wind up being destroyed during the approach, according to Plait. The rings, which are composed of countless small ice particles, would begin feeling the pull of Earth’s gravity as Saturn approached, and particles yanked out of them would form a long plume. Most of those particles would continue to orbit Saturn, he said, but the structures they form would be destroyed.

“Over time, the particles would collide with the other parts of the ring, and eventually the system would settle down once again,” Plait said. “Except I doubt they’d get the chance. Even if the rings could survive their encounter with Earth they wouldn’t have a very long future,” as Saturn’s path of travel would ultimately bring it close enough to the Sun for the ice particles to heat up and turn into gas particles through a process known as sublimation.

Fortunately, there are plenty of opportunities for astronomy buffs to observe Saturn without the added risk of potentially apocalyptic scenarios. While this past weekend was the best time to observe the planet, Tuesday evening also provided a golden opportunity to see the ringed planet, according to Tech Times reporter Alexander Saltarin.

“When it rises with the moon this May 13, astronomers expect that the planet will be almost as bright as the star Arcturus,” Saltarin said. “Experts say that in terms of the brightness scale used on celestial objects, Saturn will be shining at a magnitude of zero, which is very bright indeed. However, Saturn will be slightly dimmer compared to Arcturus. Moreover, Saturn will be shining with more of a yellowish-white light whereas Arcturus will be a bit on the orange side.”
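The “brightness scale” Saltarin mentions is the astronomical magnitude scale, which is logarithmic: each magnitude step corresponds to a brightness factor of about 2.512, the fifth root of 100, with smaller numbers meaning brighter objects. A small worked example, using approximate magnitudes, shows why a magnitude-zero Saturn would be slightly dimmer than Arcturus:

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears than one
    of magnitude m2 (smaller magnitude = brighter)."""
    return 10 ** (0.4 * (m2 - m1))

# Approximate visual magnitudes
arcturus = -0.05
saturn = 0.0
print(round(brightness_ratio(arcturus, saturn), 2))  # 1.05
```

So Arcturus, at a magnitude of roughly -0.05, outshines a magnitude-zero Saturn by only about 5 percent, consistent with the “slightly dimmer” description; a full five-magnitude difference would instead mean a factor of exactly 100.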

Study Questions Younger Dryas Event Comet Theories

April Flowers for redOrbit.com – Your Universe Online

Approximately 12,800 years ago, near the end of the last Ice Age, there was a brief episode of glacial conditions called the Younger Dryas event. The Younger Dryas, named for a flower that flourished during this time, lasted about 1,000 years. There has been quite a bit of controversy in the scientific community regarding what might have initiated the event – with a wide range of theories, including one that attributes the event to a comet impacting the Earth. Proponents of this theory point to sediments containing deposits they believe could only result from such an impact.

A new study from Southern Methodist University (SMU) and published in Proceedings of the National Academy of Sciences disproves the comet impact theory.

David Meltzer, SMU professor of archeology and a leading expert on the Clovis culture, led the research team. Their findings, based on samples from 29 sites in North America and on three other continents, indicate that nearly all sediment layers purported to date to the onset of the Younger Dryas are actually either much younger or much older.

Although the scientific community agrees that the Younger Dryas period occurred, causing widespread cooling for a relatively short time, they do not agree on what caused it. Theories range from changes in ocean circulation patterns caused by glacial meltwater entering the ocean to the cosmic-impact theory.

The cosmic-impact theory rests on geological indicators said to be extraterrestrial in origin. After a careful review of the dating of sediment samples from the 29 sites reported to have such indicators, however, Meltzer said the cosmic-impact theory is false. Of the 29, only three were found to have sediments that date to the window of time for the onset of the Younger Dryas.

“The supposed impact markers are undated or significantly older or younger than 12,800 years ago,” report the authors. “Either there were many more impacts than supposed, including one as recently as five centuries ago, or, far more likely, these are not extraterrestrial impact markers.”

At the 29 sites in North America and elsewhere, there is supposedly a Younger Dryas boundary layer that contains deposits of magnetic grains with iridium, magnetic microspherules, charcoal, soot, carbon spherules, glass-like carbon containing nanodiamonds, and fullerenes with extraterrestrial helium. These deposits are supposedly extraterrestrial in origin, from a comet or other cosmic body impacting the Earth, and are said to date to a 300-year span centered on 12,800 years ago.

To test the cosmic-impact theory and determine if these markers dated to the onset of the Younger Dryas, the team examined the existing stratigraphic and chronological data sets reported in the published scientific literature and accepted as proof by cosmic-impact proponents. The 29 sites were sorted initially by the availability of radiometric or numeric ages, then by the type of age control, if available, and whether the age control is secure.

Three of the sites lacked absolute age control:

  • Chobot, Alberta—the three Clovis points found lack stratigraphic context, and the majority of other diagnostic artifacts are younger than Clovis by thousands of years.
  • Morley, Alberta—without evidence, ridges are assumed to be chronologically correlated with Younger Dryas-age hills 1,600 miles away.
  • Paw Paw Cove, Maryland—according to the principal archaeologist, the horizontal integrity of the Clovis artifacts found is compromised.

According to the Smithsonian, the Clovis culture is the oldest North American culture that we have much knowledge of, with sites ranging from Washington State to Venezuela. More than 10,000 Clovis points — made from jasper, chert, obsidian and other fine, brittle stones — have been found since their initial discovery in Clovis, New Mexico, in 1932. The oldest securely dated examples were unearthed in Texas, tracing back 13,500 years.

Radiometric or other potential numeric ages were found at the remaining 26 sites, but only three of these date to the Younger Dryas boundary layer.

The ages of eight of the sites were unrelated to the Younger Dryas boundary layer. For example, at the Gainey, Michigan site, extensive stratigraphic mixing of artifacts makes it impossible to know their position relative to the supposed Younger Dryas boundary layer. What direct dating was obtained pointed to sometime after the 16th century CE.

Sediment in the skull of an extinct horse found at Wally’s Beach, Alberta, was dated to 10,980 radiocarbon years before present, and these sediments were purported to contain extraterrestrial impact markers. In actuality, the date came from an extinct musk ox rather than a horse, and the fossil that yielded the supposed impact markers was never dated. The scientists could not find evidence that the Wally’s Beach fossils are all of the same age, or that they date to the onset of the Younger Dryas event.

The team demonstrates that the chronological results at nearly a dozen other sites are neither reliable nor valid as a result of significant statistical flaws in the analysis, the omission of ages from the models, and the disregard of statistical uncertainty that accompanies all radiometric dates.
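Much of the chronological screening described above reduces to a simple question: does each site’s dated layer, once its statistical uncertainty is included, overlap the claimed 300-year impact window? A simplified sketch, with entirely hypothetical site ages in calendar years before present:

```python
YD_ONSET = (12_800 - 150, 12_800 + 150)  # the 300-year window cited by impact proponents

def overlaps_window(age, sigma, window=YD_ONSET, k=2):
    """True if the interval age +/- k*sigma intersects the target window."""
    lo, hi = age - k * sigma, age + k * sigma
    return lo <= window[1] and hi >= window[0]

# Hypothetical site ages (cal BP) with 1-sigma uncertainties
sites = {"A": (12_790, 60), "B": (10_900, 80), "C": (13_900, 100)}
consistent = [name for name, (age, s) in sites.items() if overlaps_window(age, s)]
print(consistent)  # ['A']
```

Ignoring the sigma term, as the paper says some proponents did, amounts to treating each date as exact, which can make a layer look synchronous with the window when its real uncertainty interval misses it entirely (or vice versa).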

More often than not, according to the findings, inferences about the ages of supposed Younger Dryas boundary layers are unsupported by replication.