Antibiotics An Effective Treatment For Chronic Back Pain

Brett Smith for redOrbit.com — Your Universe Online

A groundbreaking study from a renowned team of Danish researchers could bring unprecedented relief for sufferers of chronic back pain.

According to the study, which appeared in the European Spine Journal, as much as 40 percent of chronic lower back pain is caused by bacteria. Treating these patients with antibiotics resulted in significant relief and improved quality of life compared with patients taking a placebo.

“In people who received the placebo, nothing happened,” said lead author Hanne B. Albert, an associate professor at the University of Southern Denmark, at a press conference in London. “People on the antibiotics attained highly clinically significant improvement.”

Besides experiencing relief from back pain, patients who were prescribed the antibiotics reported better overall functioning, less leg pain and fewer missed days of work after one year of treatment.

To reach their conclusion, the Danish researchers conducted two studies. First, they discovered bacterial infections in 46 percent of patients suffering from constant lower back pain after a slipped or herniated disc. Many of these patients were successfully treated with antibiotics.

The second study involved the antibiotic combination amoxicillin and clavulanate for patients with the same condition. The researchers found 80 percent were cured or saw a marked reduction in pain levels.

John O’Dowd, president of the British Society for Back Pain Research, told The Telegraph the results are encouraging.

“It is a very striking study, and those behind it are a very high caliber group of scientists,” he said. “This is definitely something we need to take seriously but the results are very surprising and this is an area where there is a great deal of uncertainty.”

O’Dowd recommended experts conduct further trials before official treatments for the condition are altered.

“I wouldn’t want to see a great rush to market this as the best response to chronic lower back pain until that has been done,” he said.

Peter Hamlyn, a neurologist and spinal surgeon at University College London Hospital, had even higher praise for the study and said it could have life-changing impacts for many people who are in pain or disabled.

“Make no mistake this is a turning point, a point where we will have to re-write the textbooks,” he told The Telegraph. “It is the stuff of Nobel prizes.”

However, not everyone in the scientific community embraced the study’s results with such enthusiasm. Infection experts cited the growing number of antibiotic-resistant ‘superbugs’ and warned against the overuse of antibiotics.

Laura Piddock, a microbiology professor at the University of Birmingham, told The Telegraph antibiotics should only be prescribed when there is positive identification of a bacterial cause. She warned against “needlessly” exposing too many patients to drugs that can also cause minor side effects.

Chronic back pain is pervasive throughout society, affecting 31 million Americans at “any given time,” according to the American Chiropractic Association. Sufferers also feel the financial pinch of their condition, as back pain is one of the most common reasons for missing work.

US Naval UAV Breaks Its Own Endurance Record

Lee Rannals for redOrbit.com — Your Universe Online

U.S. Naval Research Laboratory researchers have broken their own endurance record with their fuel cell powered Ion Tiger Unmanned Aerial Vehicle (UAV).

The team was able to fly their UAV for 48 hours and one minute from April 16 through 18, using liquid hydrogen fuel in a new cryogenic fuel storage tank and delivery system. This flight broke their previous record of 26 hours and two minutes, which was set in 2009 using the same vehicle.

The research laboratory said they were able to complete this longer duration flight with the liquid hydrogen because it is three times denser than 5000-psi compressed hydrogen. The cryogenic liquid stored in the lightweight tank allows for more hydrogen to be carried onboard in order to increase flight endurance.

“Liquid hydrogen coupled with fuel-cell technology has the potential to expand the utility of small unmanned systems by greatly increasing endurance while still affording all the benefits of electric propulsion,” said Dr. Karen Swider-Lyons, NRL principal investigator.

According to the laboratory, although long endurance is possible with hydrocarbon-fueled systems, these aircraft are loud, inefficient, and unreliable. Battery-powered systems are limited to endurances of just a few hours, so the researchers turned to liquid hydrogen for fuel.

The team’s previous flight record was set on November 16 through 17, 2009, and itself broke an earlier record of 23 hours and 17 minutes set just a month before.

The world record for a long-endurance UAV flight was set in 2010 by the UK-built Zephyr aircraft. This vehicle stayed aloft for 336 hours, 22 minutes and eight seconds, according to the Federation Aeronautique Internationale (FAI). The aircraft is made out of ultra-light carbon fiber and weighs just 110 pounds. Its 74-foot wingspan and lithium-sulphur batteries are what helped it achieve such great endurance. The Zephyr broke the previous endurance record, a flight of roughly 31 hours set by Northrop Grumman’s Global Hawk in 2001.

“This aircraft can help track pirates off the Horn of Africa, alert the authorities about where and how fast forest fires are spreading, and ensure that soldiers’ communications remain unaffected when fighting in mountainous or hilly terrain,” technology firm Qinetiq’s chief designer Chris Kelleher told BBC News in 2010.

French Health Ministry Confirms Three New Cases Of Novel Coronavirus

Lawrence LeBlond for redOrbit.com – Your Universe Online

Two days after French health officials confirmed the first case of novel coronavirus (NCoV) in their country, a spokeswoman for the regional health authority of Northern France said two health care workers who cared for the 65-year-old infected man are under surveillance on suspicion of being infected as well.

Beatrice Degrugillers said a doctor and nurse were hospitalized Thursday night and are continuing to be monitored closely. A third person who shared the room with the infected man is also under close surveillance, said Degrugillers.

Test results are expected sometime today to see if they, in fact, have an NCoV infection. If the results are positive, it will heighten the concerns about the virus’ ability to spread easily among humans. Currently, health experts believe there is no sustained human-to-human transmission of the disease, despite a British family becoming infected in February after a relative returned home from a trip to the Middle East.

Health authorities have previously said the virus only spreads in limited circumstances between people who are in extremely close contact with the infected, such as family members or health care workers, as in the French case.

The 65-year-old man, who has not been identified, returned home from a nine-day trip to Dubai on April 17. He was hospitalized due to a respiratory infection on April 23 and was transferred to a more advanced facility on April 29. The Health Ministry said Wednesday he was in “worrying condition.”

The Ministry said it had tested others from his tour group who were in possible contact with the Frenchman, but none had tested positive for the virus.

French officials have since moved to calm the public.

“It is not a health catastrophe; we are taking the systematic measures for the case of an illness that risks being transmittable,” Sandrine Segiovia-Kueny, deputy director of the public health agency in the Nord Pas de Calais region, said on French radio station RTL Friday, according to The Wall Street Journal (WSJ).

The World Health Organization (WHO), as well as other health agencies around the world including the CDC, has been monitoring the situation closely.

The NCoV causes serious respiratory illness, with patients having high fever, cough, breathing difficulties and possible kidney failure. The mortality rate is currently above 50 percent, with 18 of 33 confirmed cases resulting in death.

The virus, which is similar to the common cold and SARS, was first discovered in humans in September, but some cases have now been identified dating back to last April, WSJ reported.

Most of the cases have been isolated in the Middle East — Jordan, Qatar, Saudi Arabia and the United Arab Emirates. France has now been added to the very short list of countries outside the Middle East that have seen cases of NCoV; Britain and Germany are the only other two countries with confirmed infections so far.

The WHO has also received a report from Saudi deputy health minister Ziad Memish via email, noting that web-based disease monitoring system ProMED has found two previously unreported cases of NCoV in Saudi Arabia on May 8.

One patient was a 48-year-old man with existing medical conditions who became ill on April 29. The second patient was a 58-year-old man who also had existing medical conditions who became ill on April 6. He fully recovered and was discharged on May 3; the former is still hospitalized but is in stable condition, according to a Reuters report.

Memish told the WHO, via the email, that Saudi authorities have worked since May 1 to bring the virus under control and have been instrumental in keeping new cases from emerging.

Still, French authorities are taking no chances and have advised anyone who has recently traveled to the Middle East to consult a doctor for a checkup, especially if they have a fever or other flu-like symptoms.

The WHO currently has no travel advisories or restrictions to the Middle East.

Many Women Opt To Alter Their Menstrual Cycles, Says Study

April Flowers for redOrbit.com – Your Universe Online

A new study led by researchers at the University of Oregon found that a surprising number of adult women ages 18 and up are opting to delay or completely skip their monthly menstruation. They are achieving this by deviating from the instructions that come with birth-control pills or other hormonal contraceptives.

The study, published in the journal Contraception, found that most women who alter bleeding cycles do so for convenience rather than to avoid menstrual symptoms. According to the team, which included researchers from the UO’s Department of Human Physiology, Portland-based Oregon Health and Sciences University and Eastern Michigan University, many women learn about the option to alter their cycle from nonmedical sources.

“These findings emphasize the need for health care providers to carefully interview combined hormonal contraceptive users on how they are using their method — for example, many women may be skipping pills to extend their cycles,” said researcher Christopher Minson, a professor of human physiology at UO.

“With a greater understanding of the issues, health care providers may be able to more effectively engage in conversations with college-aged women and educate them about available options.”

Women have begun to use hormonal contraceptives to alter bleeding cycles as more research becomes available indicating that reducing the occurrence of menstruation is safe and can even be beneficial. Until now, however, there has been a lack of information concerning why women do so, and where they are receiving information about this option.

The researchers surveyed undergraduate and graduate students, finding that 17 percent reported altering their scheduled bleeding pattern by deviating from the instructions of hormonal contraceptives. These types of contraceptives include birth-control pills, vaginal contraceptive rings and transdermal contraceptive patches.

Convenience and scheduling purposes are the reasons given by half of the women who alter their cycles. Other reasons include personal preference (28.9 percent) and reducing menstrual symptoms (16.7 percent). Amongst those who cited convenience or personal choice, 53 percent indicated that the knowledge was obtained from nonmedical sources such as a family member or friend.

The researchers sent out approximately 11,900 survey-linked emails to female university students and received 1,719 initial responses — a 14.4 percent return. Some 1,374 (79.9 percent) of the respondents indicated that they had used a combined hormonal contraceptive during the last six months.
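
For readers who want to check the arithmetic, the reported percentages follow directly from the raw counts; here is a minimal sketch using only the figures quoted above (the variable names are illustrative):

```python
# Back-of-the-envelope check of the survey figures quoted above.
emails_sent = 11_900        # survey-linked emails sent to female students
initial_responses = 1_719   # initial responses received
hormonal_users = 1_374      # respondents reporting recent combined hormonal contraceptive use

response_rate = 100 * initial_responses / emails_sent
user_share = 100 * hormonal_users / initial_responses

print(f"Response rate: {response_rate:.1f}%")                     # ~14.4%
print(f"Share using hormonal contraceptives: {user_share:.1f}%")  # ~79.9%
```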

The survey also examined the factors that influence a woman’s decision to alter her bleeding schedule. The researchers found that Asian women have a 7 percent lower probability of altering their hormonal cycles, and women who exercise regularly are 5 percent less likely to do so. Another characteristic that decreased the likelihood of cycle alteration was a preference for a monthly cycle.

“We found that it is possible to identify some of the specific characteristics of women in a college population who may be more or less likely to practice scheduled bleeding manipulation,” said Dr. Paul Kaplan, of the University Health Center and Oregon Health and Sciences University. “This study provides information about the motives, beliefs and influences relating to this practice.”

The research team was surprised to find that women who said they would prefer to have no menstrual periods were less likely to alter their cycles than women who preferred having one per year. Those whose preference ran to one cycle per year had a 17 percent higher probability of modifying their hormonal contraceptive practices than those who preferred having a menstrual period every three months.

The researchers suggest that this information should be used by health care providers to improve education about the hormonal contraception regimen best-suited to a patient’s needs and desires.

UN Agency Predicts More Mobile Phones Than People

Peter Suciu for redOrbit.com — Your Universe Online

Many people in the developed world have ditched the landline for a mobile phone, with many families having multiple handsets. In the developing world, many people simply never had a landline and now, instead, have a mobile device. Moreover, the move towards individuals having both work and personal mobile phones means there could soon be more mobile subscriptions than people in the world.

Presently there are some 6.8 billion mobile subscriptions worldwide, while the total population is about 7.1 billion. By next year, the number of mobile subscriptions could be larger than the world population, reports the International Telecommunication Union (ITU), a UN specialized agency that tracks global information and communication technology (ICT).

The core mission of the ITU is to foster international cooperation and solidarity in the delivery of technical assistance and in the creation, development and improvement of telecommunication/ICT equipment and networks in developing countries — exactly the areas seeing the fastest growth in mobile handset adoption.

Currently, the Commonwealth of Independent States, which consists of countries that formerly made up the Soviet Union, leads the world with the highest level of mobile penetration with about 1.7 mobile phone subscriptions per person.

On the other end of the spectrum is Africa, which has 63 subscriptions per 100 people. India, which also has a growing population, is actually seeing its mobile adoption slowing.

“Every day we are moving closer to having almost as many mobile cellular subscriptions as people on earth,” Brahima Sanou, director of the ITU Telecommunication Development Bureau, told the BBC. “The mobile revolution is ‘m-powering’ people in developing countries by delivering ICT applications in education, health, government, banking, environment and business.”

A study conducted last month by researcher Luke Wroblewski, author of Mobile First, also found that in January, more iPhones were sold each day than babies were born. This was based on Apple’s report that it had sold 37.04 million iPhones in the first quarter of 2012, which puts the number at around 402,000 per day, compared to the average 300,000 people born each day.
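
The per-day figure is simple division over the length of the quarter; a rough sketch of the arithmetic, assuming a 92-day quarter (the exact quarter length is an assumption here):

```python
# Rough reconstruction of the iPhones-versus-births comparison cited above.
iphones_sold_per_quarter = 37_040_000   # Apple's reported quarterly iPhone sales
days_in_quarter = 92                    # assumed length of the quarter
births_per_day = 300_000                # average daily births cited in the article

iphones_per_day = iphones_sold_per_quarter / days_in_quarter
print(f"iPhones sold per day: {iphones_per_day:,.0f}")              # ~402,609
print(f"Exceeds daily births: {iphones_per_day > births_per_day}")  # True
```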

Mobile phone adoption is also outpacing the number of people who can get online and access the Internet. Worldwide online penetration is highest in Europe, where about 75 percent of all residents go online, followed by the Americas at 61 percent. Currently 32 percent of the population in Asia has online access, while the figure is just 16 percent for Africa.

This could certainly create a “digital divide” between the developed world and those regions that are still developing.

“Two-thirds of the world’s population, some 4.5 billion people, is still offline,” ITU secretary-general Hamadoun Toure told the BBC. “This means that two-thirds of the world’s people are still locked out of the world’s biggest market.”

However, mobile phone adoption isn’t only outpacing Internet access or even the population as a whole; it has already outpaced access to basic sanitation. Earlier this year, the United Nations noted that while six billion of the world’s seven billion people currently have a mobile phone, only 4.5 billion have access to clean restroom facilities, while the remaining 2.5 billion, most of whom are located in rural areas, do not have access to toilets and other sanitation.

This suggests when it comes to priorities, mobile communication is still outpacing other basic needs.

Breast Cancer Patients May Find Comfort From Laughter With Friends

Michael Harper for redOrbit.com — Your Universe Online

A new study from health insurance provider Kaiser Permanente has found that laughter, when shared with friends, really is the best medicine.

According to the firm’s research, breast cancer patients who have “positive social interactions” amongst friends are better equipped to endure the pain and other physical symptoms associated with the disease. Women with the largest social networks and the highest amount of social interaction reported higher quality of life and a better emotional quality of life. The study is available in the latest edition of Breast Cancer Research and Treatment.

“This study provides research-based evidence that social support helps with physical symptoms,” said lead author Candyce H. Kroenke, ScD, MPH, staff scientist with the Kaiser Permanente Division of Research. “Social support mechanisms matter in terms of physical outcomes.”

This study, the first of its kind, goes beyond measuring the emotional support of a strong social network and even observes the effects of tangible support from these networks, which includes helping out with household chores or running errands.

“While hundreds of studies have examined the role of factors influencing cancer risk and prevention, this study is one of a small but growing number that focus on quality of life after a breast cancer diagnosis,” said Kroenke.

This study was performed as part of the larger Pathways study, which investigated the best diets, exercise and remedies for breast cancer patients. Kroenke surveyed the same 3,139 female participants from that study to conduct her research. Each of these women was diagnosed with breast cancer between 2006 and 2011. Shortly after their diagnoses, Kroenke and team asked the women to answer questions about their social networks, including their friends and relatives, intimate relationships, and social and religious ties. The women also answered questions about the kinds of support they received from the people in their lives, as well as their symptoms and their quality of life, both physical and emotional.

Kroenke found that those women who were most engaged in social interaction and therefore had larger social networks were more likely to report a higher quality of life during their treatments. The study also found that those women who had friends who were willing to do more things together were more likely to have higher quality of life and a better emotional quality of life. The inverse was also true – those women with little social interaction and small networks reported lower quality of life and were more likely to experience pain throughout their treatments.

Those women who were in the later stages of breast cancer were found to receive a large benefit from their friends helping them out around the house or running errands and cooking food for them. Again, those women who received little to no help around the house experienced a lesser quality of life than the others.

“Positive social interaction was significantly related to every quality-of-life measure,” writes Kroenke in her published paper.

“Given that this dimension was determined by the availability of someone with whom to have fun, relax and get one’s mind off things for a while, it is possible that positive social interaction may enable women to forget for a while the distress of being a cancer patient, and the physiologic effects last beyond the actual interaction,” she adds.

Some 230,000 women are diagnosed with breast cancer every year in the US, according to Kaiser Permanente’s data. Fortunately, there are also a reported 2.9 million survivors living in the US as of 2012. This, says Kroenke, makes it all the more important for patients to increase their quality of life.

Picky Eating Drove Saber-Tooth Tiger To Extinction In Last Ice Age

April Flowers for redOrbit.com – Your Universe Online

During the Pleistocene epoch, an astounding diversity of large-bodied mammals inhabited the so-called “mammoth steppe” — a cold and dry, yet productive, environment that extended from western Europe through northern Asia and across the Bering land bridge to the Yukon territory. Three types of large predators roamed the steppe during the Pleistocene: wolves, bears and large cats. After the end of the last ice age, only wolves and bears were able to maintain their ranges.

Dietary flexibility may have been an important factor, giving wolves and bears an edge over saber-toothed cats and cave lions, according to a new study led by researchers at the University of California, Santa Cruz.

“We found that dietary flexibility was strongly species-specific, and that large cats were relatively inflexible predators compared to wolves and bears. This is a key observation, as large cats have suffered severe range contractions since the last glacial maximum, whereas wolves and bears have ranges that remain similar to their Pleistocene ranges,” said Justin Yeakel, now a postdoctoral researcher at Simon Fraser University in British Columbia who worked on the study as a graduate student at UC Santa Cruz.

The findings, published recently in the journal Proceedings of the Royal Society B, are based on an analysis of stable isotope ratios, chemical traces in fossil bones that can be used to reconstruct an animal’s diet. The data were obtained from previously published stable isotope datasets and used to reconstruct predator-prey interactions at six sites located from Alaska to western Europe. The sites cover a range of time from before, during and after the last glacial maximum, which occurred between 20,000 and 25,000 years ago, when the ice sheets reached their greatest extent.

The researchers found that the large cats’ diet was similar across the different locations, especially in the post-glacial period. In contrast, wolves and bears ate different things in different locations. Bison, horses, yaks, musk oxen, caribou and mammoths were all prey species on the mammoth steppes. The researchers noticed that changes in predator diets coincided with an increase in caribou abundance starting around 20,000 years ago.

“During and after the last glacial maximum, many predators focused their attention on caribou, which had been a marginally important prey resource before then,” Yeakel said. “Large cats began concentrating almost solely on caribou in both Alaska and Europe. Wolves and bears also began consuming more caribou in Alaska, but not in Europe.”

Though morphologically similar to modern lions, the cave lions and saber-toothed cats of the mammoth steppes went extinct within the past 10,000 years. The bears of that time, such as the short-faced bear, which was larger than a polar bear and has since gone extinct, were also morphologically similar to modern bears. The researchers found that the short-faced bear was the only bear species that did not focus on caribou as prey in the post-glacial period.

The demise of the mammoths and other large fauna of the mammoth steppes coincided with a growing human population after the last ice age. Many species, including wolves and bears, are still around, however. Previous studies of past ecosystems can inform scientists’ understanding of modern carnivores and their capabilities, said Yeakel.

“If you look at wolves today, they are specialist carnivores preying on large herbivores like deer and elk, but when we look in the fossil record we see that wolves are remarkably flexible. Their environment today is fairly artificial compared to when they evolved,” he said.

The researchers found that large-scale patterns of interactions differed between locations but remained stable over time. In Alaska, there was relatively little overlap in the preferred prey of different predator species. In Europe, however, predator-prey interactions were less “compartmentalized.”

“The large-scale patterns don’t seem to change, which suggests this community was resilient to the climate changes associated with the last glacial maximum. That makes sense, because it survived multiple ice ages further back in time,” Yeakel said.

Herpes Vaccine Could Arise Due To Suppressive Immune Cell Discovery

Lawrence LeBlond for redOrbit.com – Your Universe Online
Herpes is an infectious disease that affects more than 24 million people in the United States alone, according to a recent report by the US Centers for Disease Control and Prevention (CDC). Now, researchers have identified a class of immune cells that exist in genital skin and mucosa that may play a role in developing a vaccine to prevent one of the most common sexually transmitted infections (STIs) in America.
These immune cells, known as CD8αα+ T cells, have been found to suppress symptoms and recurring outbreaks of genital herpes, and are believed to be the key reason most sufferers are asymptomatic when viral reactivations occur. The researchers, from Fred Hutchinson Cancer Research Center (FHCRC) and University of Washington (U-W), say the immune cells could open up a new path in the prevention and treatment of herpes simplex virus type 2 (HSV-2). The team said that identifying these T cells’ specific molecular targets (epitopes) will be the next step in developing a vaccine.
The researchers, led by Larry Corey, MD, virologist and president of FHCRC, also believe that better understanding of these T cells could play a crucial role in the development of vaccines for other types of skin and mucosal infections, including HIV infection.
“The discovery of this special class of cells that sit right at the nerve endings where HSV-2 is released into skin is changing how we think about HSV-2 and possible vaccines,” said Corey in a statement. “For the first time, we know the type of immune cells that the body uses to prevent outbreaks. We also know these cells are quite effective in containing most reactivations of HSV-2. If we can boost the effectiveness of these immune cells we are likely to be able to contain this infection at the point of attack and stop the virus from spreading in the first place.”
Currently there is no treatment that can cure genital herpes, but antiviral medications have been shown to prevent or shorten recurring outbreaks while a person is taking the medication. But even with antiviral treatment, Corey noted that the virus “often breaks through this barrier and patients still can transmit the infection to others.”
“In addition, newborn herpes is one of the leading infections transmitted from mothers to children at the time of delivery. An effective genital herpes vaccine is needed to eliminate this complication of HSV-2 infection,” he added.
Jia Zhu, PhD, a corresponding author on the study from U-W’s Laboratory of Medicine, said the long-term presence of CD8αα+ T cells where initial infection occurs could explain why patients have asymptomatic recurrences because the cells constantly recognize and destroy the virus.
“The cells we found perform immune surveillance and contain the virus in the key battlefield where infection occurs, which is the dermal-epidermal junction,” said Zhu, who is also an affiliate investigator at FHCRC. “These cells are persistent in the skin and represent a newly discovered phenotype distinguished from those of CD8+ T cells circulating in the blood.”
The team explained that the dermal-epidermal junction (DEJ) — the region where the deeper tissue layers connect to the outer skin layer — is an important area because of the roles it plays in cellular communication, nutrient exchange and absorption. The team further explained that T cell activity in the DEJ is important because this is the region where the genital herpes virus multiplies after traveling from the body’s sensory neurons, where the virus hides. The team found in earlier research that nerve endings that reach the DEJ are able to release the virus into the skin, where it can cause lesions.
Prior to this discovery, CD8αα+ T cells had been found only in gut mucosa. Previous research has mainly focused on studying these cells in blood.
“We did not expect to find CD8αα+ T cells in the skin,” Zhu said. “This was a surprise.”
The researchers used a novel technique to examine the T cells in human skin tissue. Zhu noted that this technique could provide a “roadmap” to the treatment of other human diseases. She added that the studies they performed were unique.
“To our knowledge, we are the only research group to use sequential human biopsies to study CD8+ T cell function in situ, in their natural spatial distribution and at their original physiological state,” said Zhu.
With a disease that affects more than 775,000 people in the US every year, finding a vaccine that can prevent and/or cure herpes would be highly significant. According to a CDC fact sheet, about one in six people aged 14 to 49 has genital HSV-2. The infection is most commonly transmitted through sexual contact with someone who has the disease, and transmission can occur even when the partner shows no visible sores.
While the CDC notes that use of condoms can reduce the risk of genital herpes, the surest way to avoid transmission of the virus is to abstain from sexual contact because infection can occur in both male and female genital areas that are not covered or protected by a latex condom. To ensure it is safe to have sex, partners can seek testing to determine if they are infected with HSV.
Most individuals infected with HSV-1 or HSV-2 experience either no symptoms or very mild symptoms that go unnoticed or are mistaken for other skin conditions. Because of this, most people infected with HSV are not aware of their infection.
The research team said the T cells examined could also play a role in helping in the treatment of herpes that affects oral regions, but they only examined the cells´ role in genital herpes for this study.
The team´s study findings are described in the May 8 advance online edition of the journal Nature.

Diets Rich In Soy And Tomato Could Help Prevent Prostate Cancer

redOrbit Staff & Wire Reports – Your Universe Online
Eating tomatoes and soy together could be better at preventing prostate cancer than consuming either food product by itself, according to new research published online in the journal Cancer Prevention Research.
“In our study, we used mice that were genetically engineered to develop an aggressive form of prostate cancer,” John Erdman, a professor of food science and nutrition at the University of Illinois and one of the study authors, said in a statement.
“Even so, half the animals that had consumed tomato and soy had no cancerous lesions in the prostate at study’s end,” he added. “All the mice in the control group — no soy, no tomato — developed the disease.”
According to the researchers, the mice were fed one of four different diets from the time they were four weeks old until they were 18 weeks old — a time frame chosen to model early and lifelong exposure to the bioactive components contained within those foods, Erdman explained.
The first group ate 10 percent whole tomato powder while the second consumed two percent soy germ. The third group was given both tomato powder and soy germ, while the fourth was a control group that was given neither substance. The researchers then measured the success of each diet at helping to prevent prostate cancer.
“Eating tomato, soy, and the combination all significantly reduced prostate cancer incidence,” Erdman said. “But the combination gave us the best results. Only 45 percent of mice fed both foods developed the disease compared to 61 percent in the tomato group, and 66 percent in the soy group.”
While prostate cancer is the most frequently diagnosed form of the disease in men, it has a nearly 100 percent survival rate provided it is detected early enough, the researchers said. It tends to be a slow-growing cancer in older men, and they often choose waiting over surgery and radiation, both of which could have side effects.
The rodents’ serum and prostate levels of soy isoflavones are said to be similar to those found in Asian men who consume one to two servings of the legume each day. In countries where soy is consumed daily, Erdman said that prostate cancer occurs at significantly lower levels — and the findings of the research suggest that soy and tomato could help men who are concerned about their prostate health.
“The results of the mouse study suggest that three to four servings of tomato products per week and one to two servings of soy foods daily could protect against prostate cancer,” said study co-author Krystle Zuniga.
“It’s better to eat a whole tomato than to take a lycopene supplement. It’s better to drink soy milk than to take soy isoflavones,” Erdman added. “When you eat whole foods, you expose yourself to the entire array of cancer-fighting, bioactive components in these foods.”

Nuts Good For Your Cardiovascular Health

Michael Harper for redOrbit.com — Your Universe Online

Doctors have long considered many nuts to be quite heart healthy. The fatty acids found in nuts, along with high doses of fiber and vitamins, are thought to reduce levels of LDL, or “bad” cholesterol, in the blood. Now, a team of Penn State researchers is echoing other studies that have found walnuts to be one of the healthiest nuts there is.

“We already know that eating walnuts in a heart-healthy diet can lower blood cholesterol levels,” said Penny Kris-Etherton, Distinguished Professor of Nutrition at Penn State, in a statement.

“But, until now, we did not know what component of the walnut was providing this benefit. Now we understand additional ways in which whole walnuts and their oil components can improve heart health.”

In fact, the power of the walnut is so strong it can go to work in as little as 30 minutes. Kris-Etherton and teammate Claire Berryman found that walnuts and their essential oil are good at maintaining blood vessel function, as well as preserving important cells in the vessels. Their work will be published in the June 1 edition of the Journal of Nutrition.

In this randomized, controlled test, the Penn State researchers recruited 15 participants with high blood cholesterol levels and gave them one of four treatments — 85 grams of plain, whole walnuts; six grams of walnut skins; 34 grams of walnut meat with the fat removed; or 51 grams of essential walnut oil. The researchers also took measurements of the participants’ biochemical and physiological responses before they took part in the walnut study, then again at 30 minutes, one hour, two hours, four hours and six hours following the treatments.

After analyzing the results, Berryman, Kris-Etherton and team discovered that consuming different parts of the walnut even once affected the health of the participants’ vessels “favorably.” Specifically, eating whole walnuts helped HDL, or good cholesterol, remove excess cholesterol from the body.

“Our study showed that the oil found in walnuts can maintain blood vessel function after a meal, which is very important given that blood vessel integrity is often compromised in individuals with cardiovascular disease,” said Berryman in a statement. Berryman is a graduate student in nutritional sciences at Penn State.

“The walnut oil was particularly good at preserving the function of endothelial cells, which play an important role in cardiovascular health.”

Previous research has shown these nuts contain alpha-linolenic acid, gamma-tocopherol and phytosterols, which all play a role in heart health. The team now says they could use these findings to improve existing diet strategies used to fight heart disease.

“The science around HDL functionality is very new, so to see improvements in this outcome with the consumption of whole walnuts is promising and worth investigating further,” said Berryman. As it stands, the researchers are suggesting those with high LDL levels make a few dietary changes, including incorporating more walnut oil.

This isn’t the first study to hail walnuts as an incredibly heart-healthy food. Previous research has found walnuts to be the best nut for your heart.

“Nuts are good for your heart,” said Joe Vinson, PhD, a researcher at the University of Scranton in Pennsylvania, in a March 2011 study.

“Twenty-eight grams of walnuts (an ounce) have more antioxidants than the sum of what the average person gets from fruits and vegetables,” he said. “That is not to say they are a replacement for fruits and vegetables, but they are very antioxidant dense.”

Calories At Subway Equal To McDonald's For Teens

Michael Harper for redOrbit.com — Your Universe Online

A new study by researchers from the University of California, Los Angeles (UCLA) has shown that even when a restaurant advertises itself as a healthy alternative, poor choices can make its meals just as unhealthy as those at other fast food eateries.

This study, published May 6 in the Journal of Adolescent Health, found that teenagers may end up eating nearly as many calories in a meal at Subway as in a meal at McDonald's. The researchers behind this study say both restaurants contribute to the high amounts of overeating and obesity in America today.

“Every day, millions of people eat at McDonald's and Subway, the two largest fast food chains in the world,” study leader Dr. Lenard Lesser explained in a statement. “With childhood obesity at record levels, we need to know the health impact of kids’ choices at restaurants.”

Dr. Lesser and team recruited 97 teenagers aged 12 to 21 and asked them to buy meals at either McDonald's or Subway at a local shopping mall. The teenagers were asked to visit one of the two restaurants on different weekdays between 3 p.m. and 5 p.m. and to use their own money to pay for the food. The researchers later gathered the receipts from these meals and calculated the calories for each item on each ticket.

According to the research, teenagers who ate at Subway consumed only 83 fewer calories than the teens who purchased food at McDonald's. The average caloric intake at either of these restaurants is shocking: teens who dined at McDonald's ate an average of 1,038 calories, while those who bought a sandwich from Subway consumed an average of 955 calories. The results are alarming, especially since the Institute of Medicine (IOM) suggests school lunches weigh in at no more than 850 calories.
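
The 83-calorie gap and the comparison with the IOM guideline follow directly from the averages above; a minimal sketch of the arithmetic, using only the figures reported in the study:

```python
# Quick check of the calorie figures reported for the two chains.
mcdonalds_avg_calories = 1038    # average teen meal at McDonald's
subway_avg_calories = 955        # average teen meal at Subway
iom_lunch_limit = 850            # IOM's suggested ceiling for a school lunch

gap_between_chains = mcdonalds_avg_calories - subway_avg_calories
subway_over_limit = subway_avg_calories - iom_lunch_limit

print(f"Difference between chains: {gap_between_chains} calories")               # 83
print(f"Subway meal exceeds the IOM guideline by {subway_over_limit} calories")  # 105
```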

After calculating the results, Dr. Lesser said there “was no statistically significant difference between the two restaurants,” noting that while the nutritional value of a Subway sandwich may be slightly higher, the rest of the meal (and the calories of the sandwich itself) can outweigh this small benefit.

For instance, the researchers found that the McDonald's sandwiches chosen by teenagers averaged 784 calories, while the Subway sandwiches the teens chose averaged 572 calories.

Once the teens ordered chips, fries and sodas, the two meals ended up clocking in at nearly the same number of calories.

“The nutrient profile at Subway was slightly healthier, but the food still contained three times the amount of salt that the Institute of Medicine recommends,” said Dr. Lesser.

The amount of salt in these sandwiches is startling; the average Subway sandwich ordered by the teens contained 2,149 mg of sodium, compared to 1,829 mg in the average McDonald's sandwich.

Dr. Lesser and team attribute this higher amount of sodium to the processed meats used in Subway’s sandwiches.

Though the results are not entirely surprising, the researchers explained that there were some weak points in the study. For instance, they only asked the teenagers to share with them the contents of one meal in the day, so it is unknown what else the kids ate earlier or later in the day. Furthermore, the teenaged participants were from one suburb in Los Angeles and mostly of Asian descent or mixed ethnicity. Other kids might order differently, say the researchers.

It’s also important to remember that it’s possible to make both good and bad choices at nearly any restaurant. Healthier options exist at both establishments, but the students in this study simply chose some of the more caloric options. Though this could be seen as indicative of normal teenage behavior, it’s worth noting that both restaurants provide options for the health conscious.

Robotic Sensor Helps Track And Manage Toxic Red Tides

April Flowers for redOrbit.com – Your Universe Online

The way scientists monitor and manage red tides or harmful algal blooms (HABs) in New England may be transformed by a new robotic sensor deployed in the Gulf of Maine coastal waters by Woods Hole Oceanographic Institution (WHOI). WHOI launched the new instrument at the end of last month and expects to deploy a second system later this spring.

The robotic sensor will add critical data to weekly real-time forecasts of the New England red tide this year, which will be distributed to more than 150 coastal resource and fisheries managers in six states. Federal agencies such as NOAA, the FDA and the EPA also depend upon these forecasts. Data from the sensor will be added to regular updates provided on the “Current Status” page of the Northeast PSP website.

“This deployment is a critical step towards our long-term dream of having a network of instruments moored along the coast of the Gulf of Maine, routinely providing data on the distribution and abundance of HAB cells and toxins. The technology will greatly enhance management capabilities and protection of public health in the region,” says Don Anderson, WHOI senior scientist.

The sensors are known as Environmental Sample Processors (ESPs). They are molecular biology labs packed inside canisters the size of kitchen garbage cans. The ESPs are mounted to ocean buoys in the Gulf of Maine and will detect and estimate concentrations of two algal species that cause HABs or “red tides.” The sensors will also detect one of the potentially fatal toxins that the algal species produce. Sensor data will then be transmitted to the shore in real time.

The first alga is a single-celled organism known as Alexandrium fundyense, which causes paralytic shellfish poisoning (PSP). The second, Pseudo-nitzschia, is a diatom responsible for amnesic shellfish poisoning (ASP).

One of the sensors was tested off the coast of Portsmouth, NH in 2011 and 2012. The deployment this year will be the first sustained test of the technology spanning the Alexandrium bloom season in the western Gulf of Maine. This will also be the first time the algal neurotoxin responsible for PSP will be autonomously measured by an ESP in natural waters.

The regional ocean observatory network managed by the Northeastern Regional Association of Coastal and Ocean Observing Systems (NERACOOS) currently consists of 12 instrumented buoys that measure currents, salinity, temperature and meteorological variables at multiple locations in the Gulf of Maine and Long Island Sound. The WHOI researchers would like to see their ESP units become an integral part of this network.

“The ESPs are not a replacement for state-run programs that monitor naturally occurring marine toxins in shellfish. Instead, they will provide valuable data on the phytoplankton cells and associated toxins in coastal waters giving managers a more complete picture of the magnitude and distribution of HAB events,” says Kohl Kanwit, director of the Bureau of Public Health for the Maine Department of Marine Resources.

Bloom toxicity can fluctuate substantially, influencing toxin levels in shellfish. This makes the capability to monitor the toxins a significant step towards assessing the potential of a bloom to cause shellfish toxicity.

“Developing this technology and transitioning it to field testing with academic and industry partners in the Gulf of Maine is the next step in delivering and validating routine forecasts,” says Greg Doucette of NOAA’s National Centers for Coastal Ocean Science (NCCOS), who developed the PSP toxin sensor.

“This pilot will demonstrate the ability of ESPs to deliver accurate and critical data to regional resource managers. This is an excellent example of federal, academic, and industry collaboration working together to protect the public’s health.”

Chris Scholin, a former PhD student of Anderson’s and now president and chief executive officer of the Monterey Bay Aquarium Research Institute (MBARI), developed the ESP. MBARI researchers built, tested and used earlier versions of the ESP predominantly on the West Coast. The two sensors in the Gulf of Maine are the first commercially available ESPs, manufactured at McLane Research Laboratories in Falmouth, MA under a license from Spyglass Biosecurity.

The first ESP, deployed last month, is called “ESPchris” and will conduct sampling for approximately 45 days. “ESPdon” will be deployed in late May to continue sampling for another 45 days.

“This type of data will be extremely valuable for ongoing forecasting activities, which have been carried out routinely since 2008. These data are particularly important for testing the forecast model. In the longer term in which we expect more ESPs to be available, we envision assimilating these data into the model, in much the same way the weather service uses meteorological observations,” says Dennis McGillicuddy, WHOI senior scientist.

Gen-Xers Continue Formal Education Efforts Later In Life

redOrbit Staff & Wire Reports – Your Universe Online
More than one-tenth of Generation X-ers are currently taking classes to continue their formal post-secondary educations, and nearly half of them are participating in continuing education courses or certification training workshops, according to a new University of Michigan study released on Tuesday.
Of the approximately 80 million individuals born during the post-Baby Boom period (typically from the late 1960s through the 1980s), 1.8 million are studying to earn associate degrees and 1.7 million are working on their bachelor's degrees, according to the Generation X Report. In addition, nearly two million of them are pursuing master's, doctoral or other advanced degrees, researchers from the Ann Arbor university explained.
Furthermore, according to lead author Jon D. Miller, slightly more than 40 percent of Generation X members have earned at least a baccalaureate degree, and those living in suburbs or cities are more likely to have reached that level of education than those living in small towns or rural areas.
The report also found that this generation has been earning graduate and professional degrees at a higher rate than any American generation that came before it. As of 2011, some two decades after they finished high school, 22 percent of those polled said they had completed at least one advanced degree.
Furthermore, 10 percent of them had completed a doctorate or other professional degree.
“This is an impressive level of engagement in lifelong learning,” Miller said in a statement. “It reflects the changing realities of a global economy, driven by science and technology.”
The report’s findings are the results of the Longitudinal Study of American Youth (LSAY), a National Science Foundation (NSF)-funded study that included responses from nearly 4,000 participants in their late thirties. The research was conducted by Miller at the University of Michigan Institute for Social Research (ISR).
In addition, the report looked at how members of Generation X utilized “informal sources of learning” to acquire information about important contemporary events. Three specific news stories were featured for this part of the study — influenza, global warming, and the accident at the Fukushima nuclear reactor in Japan.
“We found that Generation X adults use a mix of information sources, including traditional print and electronic media, as well as the internet and social media,” Miller said. “But for all three issues we examined, we found that talking with friends and family was cited as a source of information more frequently than traditional news media.”
“While a high proportion of young adults are continuing their formal education, reflecting the changing demands of a global economy, many are also using the full resources of their personal networks and the electronic era to keep up with information on emerging issues,” he added.

Bone-Headed Dinosaur Is The Oldest Known Pachycephalosaur In North America

Lee Rannals for redOrbit.com – Your Universe Online

Scientists writing in the journal Nature Communications have identified a new species of dog-sized bone-headed (pachycephalosaur) dinosaur.

The team was able to identify the new dinosaur species using both recently discovered and historically collected fossils. The dinosaur, Acrotholus audeti, represents the oldest bone-headed dinosaur in North America, and possibly the world, dating back about 85 million years.

Scientists used two skull “caps” from the Milk River Formation of southern Alberta, Canada, for the study. One of the skull caps was collected over 50 years ago by the Royal Ontario Museum, and the other was discovered in 2008 by University of Toronto graduate student Caleb Brown during a field expedition.

Acrotholus walked on two legs and had a thickened, domed skull above its eyes, which was used for display to other members of its species. Scientists believe the dinosaur may have used its skull in head-butting contests.

“Acrotholus provides a wealth of new information on the evolution of bone-headed dinosaurs. Although it is one of the earliest known members of this group, its thickened skull dome is surprisingly well-developed for its geological age,” said lead author Dr. David Evans, ROM Curator, Vertebrate Palaeontology. “More importantly, the unique fossil record of these animals suggests that we are only beginning to understand the diversity of small-bodied plant-eating dinosaurs.”

Researchers believe the skull domes of pachycephalosaurs may help reveal a few more details about Acrotholus and its relatives. The domes are resistant to destruction and are much more common than the animals’ relatively delicate skeletons, so the scientists say the pachycephalosaur fossil record provides valuable insights into the diversity of these dinosaurs.

“We can predict that many new small dinosaur species like Acrotholus are waiting to be discovered by researchers willing to sort through the many small bones that they pick up in the field,” said co-author Dr. Michael Ryan, curator of vertebrate paleontology at The Cleveland Museum of Natural History. “This discovery also highlights the importance of landowners, like Roy Audet, who grant access to their land and allow scientifically important finds to be made.”

Scientists performed scans and computer modeling with pachycephalosaur skulls in 2011 and found that the domes could have been used as weapons to fend off rivals.

“Pachycephalosaur domes are weird structures not exactly like anything in modern animals. We wanted to test the controversial idea that the domes were good for head butting,” said Dr. Eric Snively, University of Calgary alumnus and post-doctoral researcher in biomedical engineering at Ohio University, a co-author of the study.

They wrote in the journal PLOS ONE that their analyses allowed them to “get inside their heads” by colliding the skulls virtually. They were able to examine the anatomy and engineering of the skulls, as well as the actual tissue types.

Glowing Plants For Sustainable Lighting Wins Strong Crowdfunding Support

redOrbit Staff & Wire Reports – Your Universe Online

A project to use glowing plants to create sustainable light sources has generated a flurry of interest and financial support on the crowdfunding site Kickstarter.

With 30 days yet to go, the Glowing Plant initiative has already raised more than a quarter million dollars — far surpassing its initial goal of $65,000 — from more than 4,500 backers, each of whom is promised seeds for glowing plants in exchange for their investment.

The team behind the project says they will use Synthetic Biology techniques and Genome Compiler‘s software to insert bioluminescence genes into Arabidopsis, a small flowering plant and member of the mustard family, to create a plant that visibly glows in the dark.

Arabidopsis was selected because it is easy to experiment with and carries only a slight risk of spreading into the wild. But the team hopes the same process will work for a rose, which would likely be more commercially appealing.

“Inspired by fireflies…our team of Stanford-trained PhDs are using off-the-shelf methods to create real glowing plants in a do-it-yourself bio lab in California,” the Glowing Plants team told BBC News.

The funds raised through Kickstarter will be used “to print the DNA sequences we have designed using Genome Compiler and to transform the plants by inserting these sequences into the plant and then growing the resultant plant in the lab,” wrote team leader Antony Evans on the project’s Kickstarter page.

Printing DNA costs a minimum of 25 cents per base pair, and the team will use sequences about 10,000 base pairs long.
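
At the quoted minimum price, each designed sequence would cost at least $2,500 to print; a minimal sketch of that arithmetic (the variable names are illustrative):

```python
# Minimum printing cost per construct, using the figures quoted above.
cost_per_base_pair = 0.25       # dollars, quoted minimum price
sequence_length_bp = 10_000     # approximate length of each designed sequence

minimum_cost = cost_per_base_pair * sequence_length_bp
print(f"Minimum cost per printed sequence: ${minimum_cost:,.0f}")   # $2,500
```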

“We plan to print a number of sequences so that we can test the results of trying different promoters — this will allow us to optimize the result,” Evans wrote.

Transforming the plant will initially be done using the Agrobacterium method, in which the printed DNA is placed into a special type of bacteria that can transfer its DNA into the plant.

“Flowers of the plant are then dipped into a solution containing the transformed bacteria,” Evans explained.

“The bacteria injects our DNA into the cell nucleus of the flowers which pass it onto their seeds which we can grow until they glow!”

Agrobacteria are increasingly being used in genetic engineering because they can transfer DNA between themselves and plants. The team posted a video of this process on their Kickstarter page.

The Agrobacterium method will only be used for prototypes, as the bacteria are plant pests and any use of such organisms is heavily regulated.

For the seeds that will be sent to the public, the team will use a gene gun that coats nanoparticles with DNA and inserts them into plants.

This step is more complicated, and there are risks the gene sequence gets scrambled, “but the result will be unregulated by the USDA and thus suitable for release,” Evans said.

The Kickstarter funds will also be used to develop an open policy framework for DIY Bio work involving recombinant DNA.

“This framework will provide guidelines to help others who are inspired by this project navigate the regulatory and social challenges inherent in community based synthetic biology.”

The framework will include recommendations for what kinds of projects are safe for DIY Bio enthusiasts, and recommendations for the processes that should be enacted.

All of the project’s output, including the DNA constructs and the plants, will be released open-source, the team said on its Glowing Plant Web site.

Harvard Medical School professor of genetics George Church, a backer of the project, said that biology could provide great inspiration for more sustainable light sources.

“Biology is very energy-efficient and energy packets are more dense than batteries. Even a weakly glowing flower would be a great icon,” he said according to BBC News.

Austen Heinz, founder of Cambrian Genomics, is another backer of the project. The Glowing Plants team will use Cambrian´s breakthrough laser printing system, which dramatically reduces the cost of DNA synthesis.

“DNA laser printing will change life as we know it, starting with glowing plants,” Heinz said.

Evans, along with fellow team leaders Omri Amirav-Drory, a synthetic biologist, and Kyle Taylor, a plant scientist, said they could envision glowing trees someday being used as streetlights.

With a month of fundraising left to go, the project seems off to a spectacular start.

European Commission Bans Three Pesticides For Two Years

Lee Rannals for redOrbit.com — Your Universe Online

Policy makers in Europe just imposed a two-year precautionary ban on a type of pesticide until more is known about how it may affect bees.

The European Commission said it would be adopting a proposal to restrict the use of three pesticides belonging to the neonicotinoid family: clothianidin, imidacloprid and thiamethoxam. An Appeal Committee voted on April 29, 2013, but failed to agree on restrictions for the pesticides. Because no qualified majority was reached, the responsibility for deciding whether to adopt the ban fell to the Commission.

“The Commission’s action is a response to the European Food Safety Authority’s (EFSA) scientific report which identified ‘high acute risks’ for bees as regards exposure to dust in several crops such as maize, cereals and sunflower, to residue in pollen and nectar in crops like rapeseed and sunflower and to guttation in maize,” the Commission said.

It said the main elements of its proposal to Member States include restrictions on the use of the three pesticides for seed treatment, soil applications and foliar treatment on cereals and other plants that are attractive to bees. Authorized professionals will still be able to use the three pesticides for study purposes.

The restrictions will begin on December 1, 2013, and the Commission said it would review the conditions of approval of the three neonicotinoids in two years.

The EFSA made some fundamental mistakes when reviewing the pesticides, however, including a serious overestimation of the amount of pesticide bees are exposed to in the field. The EFSA also ignored key studies and independent monitoring, including recent data from the UK government.

Bee health decline is a growing problem, and while experts are having a tough time pinpointing its cause, many of them place part of the blame on pesticides. The US Department of Agriculture said the number of beehives decreased for the third consecutive year in 2009, dropping 29 percent.

National environmental groups launched a campaign on Earth Day called BEE Protective to try to protect honeybees and other pollinators from pesticides. The launch came a month after beekeepers, the Center for Food Safety, Beyond Pesticides, and Pesticide Action Network North America filed a lawsuit against the EPA.

“These toxic chemicals are being used without scrutiny in communities across the country, so much so that we´re facing a second Silent Spring. A growing number of concerned citizens are ready to step up to protect bees; this new educational campaign will give them the tools they need to have an impact,” said Andrew Kimbrell, executive director of Center for Food Safety.

Bats Get Tongue Erections To Soak Up Extra Nectar

[ Watch the Video: A Dynamic Nectar Mop ]

April Flowers for redOrbit.com – Your Universe Online

What do busy janitors and nectar feeding bats have in common?  They both want to wipe up as much liquid as they can, as fast as they can. And it turns out, they both have specialized equipment for the job.

A new study, led by Brown University, describes the previously undiscovered mechanism used by the bat, Glossophaga soricina, to slurp up extra nectar from within a flower: a tongue tip that uses blood flow to erect scores of tiny hair-like structures at exactly the right time.

The findings were recently published in Proceedings of the National Academy of Sciences and demonstrate that the bat´s “hemodynamic nectar mop,” or tongue tip, features speed and reliability that would be the envy of industrial designers. As an example of the types of tools that nature can evolve, the tongue tip is surprisingly clever, according to Cally Harper, a graduate student in the Department of Ecology and Evolutionary Biology at Brown University.

“Typically, hydraulic structures in nature tend to be slow like the tube-feet in starfish,” Harper said. “But these bat tongues are extremely rapid because the vascular system that erects the hair-like papillae is embedded within a muscular hydrostat, which is a fancy term for muscular, constant-volume structures like tongues, elephant trunks and squid tentacles.”

A mesh of muscle fibers contracts the bat´s cylindrical tongue so that it becomes thinner and longer, allowing it to extend farther into the flower. The research team found that the same muscle contraction simultaneously squeezes blood into the tiny hair-like papillae.

The papillae flare out perpendicular to the axis of the tongue as blood is displaced to the tip. The erect papillae add not only to the exposed surface area, but also add width. This allows the tongue to function as a highly effective nectar gathering tool.

The entire action — extension, nectar gathering and retraction — happens within an eighth of a second. Nectar feeding bats have to get a lot of calories very quickly to make the energy expenditure of hovering worthwhile.

Before this study, scientists were aware of the papillae, but regarded them as passive strings on a mop. Recent studies into the mechanics of hummingbird tongues inspired Harper to examine the shape of the bat´s tongue tip and how it is involved in nectar gathering.

Clear vascular connections between the main arteries and veins of the tongue and the papillae were observed in detailed anatomical studies. Harper was then able to recreate the erection of the papillae by pumping saline into the vascular system.

Harper said that while challenging to create, the color videos of bats feeding on nectar were especially convincing.

“That was one of my favorite parts of the study – the Aha moment,” she said. “We shot color high-speed video of the bats gathering nectar, which is challenging to obtain because color cameras require a lot of light and the one thing that bats don’t like is a lot of light.”

Harper, with help from professors Beth Brainerd and Sharon Swartz, was able to focus a lot of light right where the tongue tip would be without shining any of the light into the bats´ eyes. This allowed the team to see that as the papillae extend, they turn from light pink to bright red as they fill with blood.

“That was really the icing on the cake as far as nailing this vascular hypothesis,” Harper said.

Harper is unsure if other nectar feeding bats have this same tongue mechanism, but the team speculates that the honey possum might also employ it. Other species, such as hummingbirds and bees, use different means of rapidly morphing their tongues for improved feeding. The researchers speculate that any or all of these highly evolved designs could provide technological inspiration.

“Together these three systems could serve as valuable models for the development of miniature surgical robots that are flexible, can change length and have dynamic surface configurations,” Harper, Brainerd and Swartz wrote.

Or perhaps, we might just improve that janitor´s mop.

Role Reversal: Younger Black Widow Males Cannibalize Older Females

Alan McStravick for redOrbit.com – Your Universe Online

The idea that the female of a species would, after completing coitus, approach, kill and eat her mate instills, if not fear, at least unease in many a male reader. This scenario, in fact, is exactly how the Black Widow spider got its name. However, according to researchers at Masaryk University in the Czech Republic, the popular lore surrounding such spiders may not tell the whole story.

Study authors Lenka Sentenska and Stano Pekar found over the course of their research that male spiders of the species Micaria sociabilis are more likely to eat the females than to be eaten.

The study, conducted over a two-year period, involved the collection of both male and female M. sociabilis spiders. The researchers noted the animals´ behaviors by mixing the males and females at different time points. This intermingling allowed the observers to witness what happened when young adult male spiders were paired with single female spiders either from the same generation or from another generation. By pairing males with females of different size, age and mating status, the research team wanted to determine whether they could identify a form of reverse sexual cannibalism and whether it was an adaptive mechanism of male mate choice.

The results show cannibalism took place shortly after the initial contact between the male and the female. Importantly, this cannibalism occurred before any mating took place. In addition to the females’ age and size relative to the male, the team also learned that reverse cannibalism differed significantly depending on the month in which it occurred. Males from the summer generation, it was noted, tended to be bigger than males born in the spring, and they also exhibited more cannibalistic tendencies. The team inferred that male aggression in M. sociabilis may be related to size.

The highest frequency of reverse cannibalism occurred when the larger summer males were paired with older females from the spring generation. This suggests the age of the female may be the deciding factor in whether or not the male opts to cannibalize her. Also of interest to the research team: there was no difference in how males treated females that had previously mated and those that were virgins. This, they say, demonstrates how, in some species and in some cases, males make a very clear choice about which females they will mate with.

“Our study provides an insight into an unusual mating system which differs significantly from the general model,” the authors stated. “Even males may choose their potential partners and apparently, in some cases, they can present their choice as extremely as females do by cannibalizing unpreferred mates.”

Results of their research were recently published in the journal Behavioral Ecology and Sociobiology.

Engineers Design Robot That Can ‘Discover’ New Objects

Lee Rannals for redOrbit.com – Your Universe Online

A robot developed by scientists at Carnegie Mellon University’s (CMU) Robotics Institute is able to analyze and learn about new objects.

Researchers developed a two-armed mobile robot called HERB that can ‘discover´ more than 100 objects in a home-like laboratory, including items like computer monitors and plants. Robots are typically designed with objects already preprogrammed into their software, so creating one that can understand objects of its own accord represents an important step forward for the future of robotics.

The CMU team developed software, known as HerbDisc, that enables HERB, the “Home-Exploring Robot Butler,” to build digital models and images of the objects it encounters and load them into its memory, allowing the robot to discover objects on its own. Eventually, the team believes this capability will help people accomplish a variety of tasks in daily living.

Siddhartha Srinivasa, associate professor of robotics and head of the Personal Robotics Lab, says the robot’s ability to discover objects on its own sometimes surprises even the researchers. In one case, he said, students left the remains of lunch (a pineapple and a bag) in the lab, and when they returned the next morning, HERB had built digital models of both and figured out how to pick each one up.

“We didn’t even know that these objects existed, but HERB did,” said Srinivasa, who jointly supervised the research with Martial Hebert, professor of robotics. “That was pretty fascinating.”

He said manually loading digital models of every object of possible relevance simply isn’t feasible, but giving robots the ability to do this on their own is crucial to eventually implementing robots more effectively in our daily lives.

HERB’s Kinect sensors give it a three-dimensional perspective and allow it to gather data about the shape of the objects being observed. The robot is also able to determine whether a potential object can move on its own or whether it is movable at all. HERB can note whether something is in a particular place at a particular time, and can use its arms to test whether it can lift the object.

“The first time HERB looks at the video, everything ‘lights up’ as a possible object,” Srinivasa said.

He says that as the robot uses its domain knowledge, it becomes clearer what the object is and what it isn’t. Adding this knowledge nearly tripled the number of objects HERB was able to discover.

Robots are constantly under development, and there is still a long way to go before they are brought into our daily lives, but scientists are making great strides. Angel Perez Garcia, a researcher at the Norwegian University of Science and Technology, is developing a way to control a robot with his mind, saying he is able to make a robot move using an EEG and brain power.

Hanging Gardens Of Babylon Discovered 300 Miles Away In Nineveh

Lawrence LeBlond for redOrbit.com – Your Universe Online

The Hanging Gardens of Babylon have long been regarded as one of the Seven Wonders of the Ancient World, although not without controversy. Some have even regarded the Gardens as purely legendary, with no evidence that the ancient site ever existed in Babylon.

For centuries, historians, archaeologists and others have imagined what the Hanging Gardens may have looked like, and several artists have depicted them; most notably, Dutch artist Maarten van Heemskerck painted his concept of the Gardens in the 16th century, complete with the Tower of Babel in the background.

Now, a historian with Oxford University may have cracked the case wide open, potentially solving centuries-old theories of the Hanging Gardens.

Dr. Stephanie Dalley, of Oxford´s Oriental Institute, said the fabled Hanging Gardens of Babylon were not actually located in Hillah and were not built by King Nebuchadnezzar of Babylon. In fact, she says the site was not even in Babylon at all, but rather 300 miles north in Nineveh, and built by the Assyrian ruler Sennacherib.

According to The Independent, Dalley first proposed the idea in 1992 and has spent the better part of two decades piecing the mystery together. Now, after working diligently for so long on the project, she is set to reveal the findings in a book to be released later this month.

Poring over historical documents and descriptions of the legendary Gardens, Dalley found that a bas-relief from Sennacherib´s palace in Nineveh, recorded in the nineteenth century, showed trees growing atop a colonnade exactly as described in earlier accounts. She also found evidence that the Assyrian capital became known as ‘New Babylon´ after Assyria conquered Babylon in 689 BC. Dalley also uncovered several places in the region that were known by the name Babylon.

She also uncovered evidence that after the successful invasion of Babylon, the gates of Nineveh were renamed after Babylon´s traditional city gates. Furthermore, geographical assessments of the flat land surrounding Babylon suggest it would have been impossible to implement the kind of water delivery system described in that region. And Dalley found descriptions of the Gardens written by historians who had actually visited the Nineveh region, making it more likely that the Hanging Gardens existed in Nineveh.

Dalley found historical documentation that tells of Alexander the Great, whose army camped near the city in 331 BC, close to one of the great aqueducts that Dalley believes had carried water to the real site of the legendary Gardens.

Dalley told David Keys of The Independent that it has “taken many years to find the evidence to demonstrate that the gardens and associated system of aqueducts and canals were built by Sennacherib at Nineveh and not by Nebuchadnezzar in Babylon.”

“For the first time it can be shown that the Hanging Garden really did exist,” she added.

Dalley noted that a German team spent nearly 20 years last century looking for remnants of the Hanging Gardens, but never found a single clue confirming that Nebuchadnezzar built the site in Babylon. “To their dismay, they could not find any possible location with enough space in the vicinity of the palaces, nor did they dig out any written confirmation from the many texts they unearthed,” Dalley remarked in a statement to The Telegraph.

In the end, Dalley concludes that the Hanging Gardens were built in a different century, in a different location, and by a different king leading a different civilization.

Our Ancestors Had A Taste For Gazelle Brains

Michael Harper for redOrbit.com — Your Universe Online

A new study has once again shown that our human ancestors had no qualms about eating every part of their prey, including the brains.

After uncovering fossils in Kenya, anthropologist Joseph Ferraro of Baylor University and his colleagues discovered that the earliest humans living in East Africa had a taste for multiple parts of the antelope. These early humans would even scavenge the leftovers of larger predators and finish off the remains. Their research was published April 25 in the online open-access journal PLOS ONE.

According to Science News, the anthropologists uncovered three sets of fossilized animal bones in Kenya, giving them new insights into how these early humans hunted, scavenged and ate. The fatty tissue of the brain could have given early Homo erectus the added energy boost they needed to hunt another day. These new findings also correspond with earlier digs which uncovered small animal bones with marks in them, suggesting butchery by small stone tools. This has led scientists to believe that the earliest humans were skilled meat eaters.

Ferraro´s study found that humans living in this area 2 million years ago would have hunted small animals like gazelles and hauled their kills back to their home site at Kanjera South. The team arrived at this conclusion after noticing that small bones from these animals found at Kanjera South were separated from the rest of the carcass. These gazelle bones also bore the markings of primitive stone tools, suggesting that even more butchery was practiced here.

The team also discovered that these early humans knew where the meatiest portions of the animal could be found and selected these cuts first. The humans may have stuck to a raw meat diet, as these sites contained no signs of burned wood or other telltale signs of cooking. Additionally, some of the bones even bore the markings of both stone tools and lion´s teeth, suggesting that the humans may have killed the animal first, but the lion may have robbed them of their dinner.

What led Ferraro and his team to believe that Homo erectus had a taste for brains was the disproportionately large number of skulls and lower jaws found at the dig site compared to other types of bone. According to Ferraro, this could mean the humans learned to eat what bigger predators, such as lions, may have left behind.

Some predators have been found to devour their prey rather quickly before moving on. The archaeological team suspects that the earliest humans may have kept an eye on such predators, visiting the kill site after the predators had walked away and scavenging whatever portions they could find. This, say the researchers, may have often included the skull and brains.

Ferraro and his team also believe that early humans may have relied heavily on hunting to provide their meals between scavenging sessions, though more research is required in order to understand just how much they relied on hunting as opposed to scavenging.

The difficulty for future studies, explains Ferraro, will be determining this ratio between hunting and scavenging because it can be difficult to discern from the fossil record alone. After all, the fossil of a gazelle that was killed by a lion and quickly chased off by a human may look the same as the fossil of a gazelle which was only killed by a human. Understanding these early behaviors, however, could help archaeologists and ancient historians better understand how our earliest ancestors lived and track the ways in which we´ve progressed throughout the millennia.

Prevent Alzheimer’s Disease With A Glass Of Champagne

Lawrence LeBlond for redOrbit.com – Your Universe Online

With the growing threat of Alzheimer´s disease for millions of Baby Boomers (those born from 1946 to 1964), it only makes sense to find ways that may prevent the mind-robbing disease. Research last year uncovered possible evidence that resveratrol, found in red wine, may help prevent cognitive decline. Now, a new study is looking at another type of alcohol that may also ward off dementia and Alzheimer´s disease.

Researchers from Reading University have found that three glasses of champagne per week could help prevent the onset of brain disorders such as dementia and Alzheimer´s disease. The team discovered that a compound found in black wine grapes (Pinot noir and Pinot meunier) helps fight forgetfulness.

Champagne, which is made using these types of grapes, could now be just the right beverage for tackling dementia before it has a chance to set in. This is not the first time the bubbly has been touted for its health benefits. The same Reading team found in 2009 that champagne is good for the heart and blood circulation.

The memory-helping compound in champagne, however, is much different: phenolic acid. About 80 percent of all champagne is made from the two black grape varieties blended together with a white Chardonnay grape.

The researchers, led by Jimmy Spencer, a biochemistry professor at Reading, found that phenolic acid provokes a noticeable boost in spatial memory, the ability to recognize one´s surroundings and find the way home.

“Dementia probably starts in the 40s and goes on to the 80s. It is a gradual decline and so the earlier people take these beneficial compounds in champagne, the better,” Spencer told Mail Online´s Valerie Elliott.

Spencer said that while his team´s study was conducted on rats, there is a strong confidence among the team that the results would be remarkably similar in the human brain.

For the study, rats either had champagne mixed into their feed daily for six weeks or were given no champagne at all. Each rat was then allowed to run through a maze to find an edible treat. The task was repeated five minutes later to see if the rat remembered where it had retrieved the first treat and where it could find another.

The rats that did not have champagne mixed in their feed had a 50 percent success rate in the maze experiment. However, after champagne was added to the feed, success rates shot up to 70 percent on average.

“The results were dramatic. After rats consumed champagne regularly, there was a 200 per cent increase of proteins important for determining effective memory. This occurred in rats after just six weeks. We think it would take about three years in humans,” Spencer told Mail Online.

“This research is exciting because it illustrates for the first time that moderate consumption of champagne has the potential to influence cognitive functioning such as memory,” he added.

Spencer and his team now hope to conduct the experiments on older human subjects, asking them to drink champagne for three years to further test the mind-aiding benefits of the fizz.

“This is an interesting study, especially for those who enjoy a glass of bubbly,” said a spokesperson for the Alzheimer´s Society. “However, people should not start celebrating just yet. This is the first time a link between champagne and dementia risk reduction has been found. A lot more research is needed.”

Brain Cell Transplant Cures Severe Form Of Epilepsy In Mouse Model

redOrbit Staff & Wire Reports – Your Universe Online

Scientists from the University of California – San Francisco (UCSF) have effectively cured epilepsy in mice by transplanting brain cells into the rodents´ hippocampus — research that they hope could one day be applied to help treat severe forms of the condition in humans.

Dr. Scott C. Baraban, who holds the William K. Bowes Jr. Endowed Chair in Neuroscience Research at UCSF, and colleagues took medial ganglionic eminence (MGE) cells, which inhibit signaling in overactive nerve circuits, and transplanted them in the area of the brain associated with seizures.

While previous research focusing on different cell types proved unsuccessful, Baraban´s team was able to control the seizures in epileptic mice following the transplantation. The results of their work, which is reportedly the first ever successful attempt to stop seizures in mouse models of adult human epilepsy, were published Sunday in the online edition of the journal Nature Neuroscience.

“Our results are an encouraging step toward using inhibitory neurons for cell transplantation in adults with severe forms of epilepsy,” Baraban, whose work was funded by the National Institutes of Health (NIH) and by the California Institute of Regenerative Medicine, said in a statement. “This procedure offers the possibility of controlling seizures and rescuing cognitive deficits in these patients.”

According to the researchers, epileptic seizures often lead to severe muscle contractions and a possible loss of consciousness. As a result, the patient can lose control of his or her body, falling and potentially becoming seriously injured. These seizures occur when too many excitatory nerves in the brain fire at the same time.

However, the UCSF researchers report that the inhibitory cells they implanted into the mice “quenched this synchronous, nerve-signaling firestorm, eliminating seizures in half of the treated mice and dramatically reducing the number of spontaneous seizures in the rest.”

“These cells migrate widely and integrate into the adult brain as new inhibitory neurons,” Baraban explained. “This is the first report in a mouse model of adult epilepsy in which mice that already were having seizures stopped having seizures after treatment.”

The model that the UCSF team worked on was designed to resemble an especially severe form of human epilepsy known as mesial temporal lobe epilepsy. Mesial temporal lobe epilepsy is typically resistant to medication, and in this condition, the seizures are believed to originate in the hippocampus.

While transplantation into this region of the brain proved successful, transplants into the amygdala — a region of the brain involved in memory and emotion — were unable to end seizure activity in the same mouse model, Baraban´s team said. Furthermore, in addition to having fewer seizures, the mice treated in this way became less agitated and less hyperactive, and also performed better in water-maze tests, they added.

Tobacco Users Smoke More Cigarettes If They Also Use Pot

redOrbit Staff & Wire Reports – Your Universe Online

Smokers who use both tobacco and marijuana tend to smoke more cigarettes per month than those who only use tobacco, according to new research presented Sunday as part of the Pediatric Academic Societies (PAS) annual meeting in Washington, DC.

“Contrary to what we would expect, we also found that students who smoked both tobacco and marijuana were more likely to smoke more tobacco than those who smoked only tobacco,” study author Dr. Megan Moreno, an associate professor of pediatrics at the University of Washington who is also affiliated with Seattle Children’s Research Institute, said in a statement.

Moreno and her colleagues randomly selected incoming college students from two universities (one in the Northwest and one in the Midwest) to participate in a longitudinal study of tobacco and cannabis smoking habits. They were asked about their attitudes, intentions and experiences with both substances twice — once before entering college and a second time after completing their freshman year.

Each participant was asked if they had used tobacco or marijuana at any point in their lives, as well as if they had used either substance within the past four weeks. The researchers also gathered information about the quantity and frequency of such use over that 28-day period.

They found that, prior to entering college, one-third of the 315 subjects had reported using tobacco, and 43 percent of those were currently smoking cigarettes. Tobacco users were also found to have been more likely than non-smokers to have smoked pot. Following their freshman year, two-thirds of pre-college tobacco users continued doing so, and 53 percent of them had reported concurrent marijuana use.

The smokers reported an average of 34 tobacco episodes per month. However, those who used both substances reported smoking cigarettes an average of 42 times per month, versus just 24 tobacco related episodes for those who did not also use cannabis.

“These findings are significant because in the past year we have seen legislation passed that legalizes marijuana in two states. While the impact of these laws on marijuana use is a critical issue, our findings suggest that we should also consider whether increased marijuana use will impact tobacco use among older adolescents,” said Dr. Moreno.

Future research, she added, should focus on educating people about the risk of using both substances together. The research was funded by a grant from the National Institutes of Health (NIH).

Despite Bans And Warnings, Teens Continue To Text And Drive

redOrbit Staff & Wire Reports – Your Universe Online

Despite numerous state laws prohibiting texting while operating motor vehicles and countless advertising campaigns warning of the dangers of distracted driving, nearly half of all teenagers admit that they still send and receive messages while behind the wheel.

According to research presented at the Pediatric Academic Societies (PAS) annual meeting on Saturday, approximately 43 percent of the driving-age high-school students that responded to a 2011 survey said that they had driven while texting at least once during the previous 30 days.

“Texting while driving has become, in the words of Transportation Secretary Ray LaHood, a ‘national epidemic,´” principal investigator Alexandra Bailin, a research assistant at Cohen Children's Medical Center in New York, said in a statement.

“Although teens may be developmentally predisposed to engage in risk-taking behavior, reducing the prevalence of texting while driving is an obvious and important way to ensure the health and safety of teen drivers, their passengers and the surrounding public,” she added.

According to the researchers, the primary cause of death among teenagers is motor vehicle accidents, and using a phone while behind the wheel of a car or truck significantly increases the risk that drivers in this age group will become involved in an accident. In fact, they report that the risk of a texting driver being involved in a wreck is 23 times the normal accident rate.

In order to discover how common texting while driving was amongst teenage drivers, Bailin´s team looked at data collected by the US Centers for Disease Control and Prevention´s (CDC) 2011 Youth Risk Behavior Survey. The CDC conducts the survey every two years to track six types of risky behaviors that contribute to the primary causes of death, disability, and social problems among young Americans.

In 2011, 7,833 high school students completed the survey, which for the first time asked participants whether or not they had texted or sent e-mails while driving a motor vehicle during the previous 30 days. The researchers were also attempting to determine whether other high-risk behaviors were linked with distracted driving, and whether state laws prohibiting texting while driving were effective among teenagers.

“Survey results showed that males were more likely to text while driving than females (46 percent vs. 40 percent), and the prevalence of texting increased with age (52 percent of those over 18 years; 46 percent of 17-year-olds; 33 percent of 16-year-olds; and 26 percent of 15-year-olds),” the American Academy of Pediatrics, one of the organizations that co-sponsor the PAS Annual Meeting, explained.

“Furthermore, teens who reported texting while driving were more likely to engage in other risky behaviors such as driving under the influence of alcohol, having unprotected sex and using an indoor tanning device,” they added. “The researchers also found that state laws banning texting while driving had little effect: 39 percent of teens reported texting in states where it is illegal vs. 44 percent of teens in states that have no restrictions.”

Bailin said that she and her colleagues hope that they can identify these types of risky behaviors in order to find new ways to keep high-school students from texting while driving.

“Although texting while driving was slightly less common in states that prohibit it, the reality is that millions of teens text while driving,” said senior investigator Andrew Adesman, the head of Developmental and Behavioral Pediatrics at Cohen Children's Medical Center.

“Regrettably, our analysis suggests that state laws do not significantly reduce teen texting while driving,” he added. “Technological solutions will likely need to be developed to significantly reduce the frequency of texting while driving … phones will have to get smarter if they are to protect teens (and others) from doing dumb things.”

Pregnant Women Who Can’t Quit Smoking Should Take Vitamin C

redOrbit Staff & Wire Reports – Your Universe Online

Pregnant women who have difficulty quitting smoking can take vitamin C to help improve the lung function of their children while also helping to prevent wheezing during the babies´ first year of life, according to a study presented Saturday at the annual meeting of the Pediatric Academic Societies (PAS) in Washington, DC.

Lead researcher Cynthia T. McEvoy, an associate professor of pediatrics at Oregon Health & Science University (OHSU) Doernbecher Children’s Hospital, and colleagues recruited 159 women who were less than 22 weeks pregnant and unable to kick their cigarette habit.

Each study participant was randomly assigned to take either one 500 milligram capsule of vitamin C or a placebo along with a prenatal vitamin each morning. Neither the women nor the study investigators knew what was in the capsule, and a second group of nonsmoking expectant mothers was also analyzed as part of the study.

Forty-eight hours after each baby was born, the researchers tested the newborns´ pulmonary function. They looked at how each infant inhaled and exhaled, how easily each child´s lungs moved, and how large the babies´ lungs were. The results showed that the children of smoking mothers who had taken vitamin C had “significantly improved lung function at birth” when compared to the children of mothers who took a placebo.

Furthermore, the study authors kept in contact with the parents throughout the first year of their children´s lives in order to record any occurrences of wheezing or other respiratory symptoms. They discovered that the infants in the vitamin C group had “significantly less wheezing” in their first year of life than those in the placebo group.

“Specifically, 21 percent of infants in the vitamin C group had at least one episode of wheezing compared to 40 percent of those in the placebo group and 27 percent of infants born to nonsmokers,” the American Academy of Pediatrics, one of the organizations that co-sponsor the PAS Annual Meeting, said in a statement. “In addition, 13 percent of infants whose mothers were randomized to vitamin C needed medication for their wheezing compared to 22 percent of infants in the placebo group and 10 percent in the nonsmoking group.”

McEvoy called Vitamin C “a simple, safe and inexpensive treatment that may decrease the impact of smoking during pregnancy on childhood respiratory health.” Likewise, study co-author and OHSU Oregon National Primate Research Center senior scientist Eliot Spindel said that while the first priority was to get expecting mothers to stop smoking, their research has discovered “a way to potentially help the infants born of the roughly 50 percent of pregnant smokers who won’t or just can’t quit smoking no matter what is tried.”

Kids With Strep Throat May Not Need To Replace Toothbrushes

redOrbit Staff & Wire Reports – Your Universe Online

You might not need to throw out your toothbrush after recovering from a sore throat, according to a new study presented Saturday at the annual meeting of the Pediatric Academic Societies (PAS) in Washington, DC.

While some health care professionals tell patients — especially children — to replace their toothbrushes after suffering from a cold, the flu, or a case of strep throat, researchers from the University of Texas Medical Branch (UTMB) at Galveston advise that doing so might not be necessary after all.

The UTMB researchers attempted to grow group A Streptococcus (GAS), the bacteria responsible for strep throat, on toothbrushes that had previously been exposed to the bacteria in laboratory conditions. They reported that the pathogen grew on the brushes, and that the microbes remained there for at least 48 hours.

However, two brand-new toothbrushes that had not been exposed to GAS and served as controls also grew the bacteria, despite having been taken out of their packaging under sterile conditions. An adult-sized toothbrush grew “gram-negative bacilli” while a child-sized one grew “gram-positive cocci, which was identified as Staphylococcus.”

From there, they attempted to grow GAS on toothbrushes used by children who had strep throat. They recruited 14 patients that had been diagnosed with strep throat, 13 patients that had sore throats but not strep, and 27 healthy patients, all of whom were between the ages of two and 20.

Each subject was told to brush their teeth for one minute using a new toothbrush, and those brushes were then placed in a sterile cover and transported to a laboratory for testing. The researchers were only able to recover GAS from one toothbrush, and it belonged to a patient who did not have strep. They were unable to grow GAS on the other brushes, but were successful in growing other common mouth bacteria.

“This study supports that it is probably unnecessary to throw away your toothbrush after a diagnosis of strep throat,” explained study co-author Judith L. Rowen, an associate professor of pediatrics in the UTMB Department of Pediatrics.

However, Lauren K. Shepard, another co-author of the study as well as a resident physician in the UTMB Department of Pediatrics, said that the study was small and that additional research with more participants would be necessary in order to confirm that GAS does not grow on toothbrushes used by children suffering from strep throat.

Baboons As Good As Human Children At Selecting Between Two Amounts

[ Watch the Video: Baboons Understanding Math ]

redOrbit Staff & Wire Reports – Your Universe Online

Baboons can be as accurate as human children when it comes to discriminating between different quantities of various objects, experts from the University of Rochester and the Seneca Park Zoo claim in a new study.

Jessica Cantlon, an assistant professor of brain and cognitive sciences at the university, and colleagues observed eight olive baboons between the ages of four and 14 in more than 50 separate trials in which the animals attempted to guess which of two cups held more peanuts.

The investigators placed between one and eight peanuts into two different cups, always making sure that the numbers were different. The baboons were allowed to have all of the peanuts in whichever cup they selected, whether it contained the greatest number of treats or not.

In situations where the relative difference between the amounts of peanuts in the two cups was large, the primates successfully selected the larger amount approximately 75 percent of the time. When the amounts were closer (such as six in one cup and seven in the other), their accuracy rate dropped to 55 percent.

“In this study we’ve shown that non-human primates also possess basic quantitative abilities. In fact, non-human primates can be as accurate at discriminating between different quantities as a human child,” Cantlon said Friday in a statement. “This tells us that non-human primates have in common with humans a fundamental ability to make approximate quantity judgments. Humans build on this talent by learning number words and developing a linguistic system of numbers, but in the absence of language and counting, complex math abilities do still exist.”

Reporting in the journal Frontiers in Psychology, Cantlon and her colleagues explained that the pattern helps solve a longstanding mystery surrounding animals´ understanding of the concept of quantity. While experts have speculated that non-humans use two different systems to evaluate numbers (one based on keeping track of discrete objects and the other based on comparing approximate differences between counts), the new study shows that the baboons relied on the latter approach when making their choices.

The researchers explain that the baboons were able to consistently discriminate between pairs with larger numbers provided that the difference in the amounts of peanuts in the cups was relatively large. That would not have been possible under the first method, meaning that they used the second one, which is known as the analog method.
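
To see how this ratio-dependent pattern can emerge from an analog system, the brief Python simulation below compares two quantities using noisy internal estimates whose spread grows with the quantity itself. The noise level and trial count are assumed values chosen purely for illustration; they are not parameters from the study.

import random

# Illustrative simulation of analog-magnitude comparison (scalar variability):
# each quantity is represented by a noisy estimate whose standard deviation
# scales with the quantity. The Weber fraction below is an assumed value for
# illustration only, not one reported in the baboon study.

WEBER_FRACTION = 0.25
TRIALS = 100_000

def noisy_estimate(n: int) -> float:
    """Noisy internal estimate of quantity n."""
    return random.gauss(n, WEBER_FRACTION * n)

def accuracy(larger: int, smaller: int) -> float:
    """Fraction of trials in which the larger quantity is correctly chosen."""
    correct = sum(noisy_estimate(larger) > noisy_estimate(smaller) for _ in range(TRIALS))
    return correct / TRIALS

if __name__ == "__main__":
    print("2 vs 8 peanuts:", round(accuracy(8, 2), 2))  # easy ratio, near-perfect choice
    print("6 vs 7 peanuts:", round(accuracy(7, 6), 2))  # close ratio, closer to chance

With the assumed noise level, the easy comparison stays near ceiling while the close comparison drops toward chance, which is the qualitative signature of ratio-dependent, analog-style judgments.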

Previous research has shown that human children who have not yet learned how to count use these kinds of comparisons to select between number groups, as do adults when estimates are needed quickly, the researchers said. Furthermore, studies with other animals have revealed that birds, lemurs, chimpanzees, and even some fish possess a similar ability to estimate relative quantity, they added.

“A lot of people don’t realize how smart these animals are,” said co-author Jenna Bovee, who is the elephant handler and the primary keeper for the baboons at the Seneca Park Zoo. “Baboons can show you that five is more than two. That’s as accurate as a typical three year old, so you have to give them that credit.”

“In the same way that we underestimate the cognitive abilities of non-human animals, we sometimes underestimate the cognitive abilities of preverbal children,” Cantlon added. “There are quantitative abilities that exist in children prior to formal schooling or even being able to use language.”

18 Alaskan Teens Use Phishing Scam To Hack School System

Michael Harper for redOrbit.com — Your Universe Online

At least 18 Alaskan students are accused of using a phishing scam to gain control over the computers at their middle school. According to the Anchorage Daily News, the students hacked into school-owned laptops after tricking a teacher into entering an administrative password. Once inside, the unnamed students accessed their classmates´ laptops remotely using a feature built to let teachers monitor their students.

The Ketchikan school has now seized each of the 300 laptops loaned to students in order to identify any other students involved. Once the investigation is complete, the school district will determine the appropriate punishment for the adolescent cybercriminals.

In an interview with Ketchikan FM station KRBD, Casey Robinson, principal at Schoenbar Middle School, explained how the students were able to get administrator access and spy on their classmates. According to Robinson, the students used a trick most often employed by the youngest of children: asking for a password to upgrade a piece of software.

“Students were manipulating their machines, so the teachers thought they were installing an upgrade of Java for example, and in the background something else was running that the teacher was actually logging into as well. And it only took one time,” explained Robinson.

Jurgen Johansen, the district´s technical supervisor, said the method used by the students to hijack the system has been around for many years. He also said he is surprised this has not happened sooner, but noted that a few rookie mistakes made by the students blew their cover pretty quickly.

Once the students had access to the teacher´s account, they began making new administrator accounts for one another. With these accounts the students spied on each other and their peers. Some classmates noticed their laptops were acting unusually, as if being controlled by another person, and notified their teachers. Robinson said the students first hacked into the system last Friday and other students reported the prank the following Monday.

“We’ve got some really good kids here,” said Robinson in an interview with the Associated Press. “When they know something’s not right, they let an adult know.”

The school has since retrieved each of the 300 laptops that were available to students and has begun an investigation to determine what else the students did while they had access to administrative accounts. Robinson says that so far, the district hasn´t found any additional hardware or software issues.

“I don´t think there was any personal information compromised. Now that we have all the machines back in our control, nothing new can happen.”

Johansen said his IT team will be putting in a lot of overtime to complete the investigation and find each of the students involved in the hack. Though he said he was surprised this kind of attack hadn´t happened sooner, Johansen and the district are using the incident to review their loaned-computer policies. This means the agreement signed by students and parents and the student code of conduct will likely have to be amended to prevent this sort of attack from happening again.

“How we do business is definitely going to have to change when it comes to updating programs and resources on the machines,” said Robinson.

“Yes, something new is going to have to happen.”

Researchers Look At Muscle Adaptation In Response To Minimalist Running

Alan McStravick for redOrbit.com – Your Universe Online

Just two months ago, redOrbit learned we had a fairly large and vocal audience within the running community when we published an article on the rise of minimalist running shoes and how, without a proper, gradual introduction, they pose a distinct risk of injury to the runner.

Individuals who commented on the March 7 article represented views on both the pro and con side of the argument for the adoption of the relatively new minimalist footwear. But one thing was clear: barefoot and near-barefoot running, perceived to be a more natural form of the sport and therefore less injurious to feet and legs, is here to stay.

The primary difference between traditional and minimalist shoes lies in the stride used by the runner. A traditional shoe, because of its cushioning, causes the runner to land on the heel, while barefoot and minimalist runners land on the forefoot. This landing difference has a direct effect on how the muscles of the legs and feet respond and develop.

But, exactly, how do the muscles change when adapting to a new running style? This question led researchers at the University of Virginia (UVa) to conduct a new one-of-a-kind study of runners currently transitioning from traditional to minimalist running.

“We want to know what happens to the muscles of the leg and foot when recreational runners make the switch to minimalist footwear,” said Geoffrey Handsfield, a UVa PhD student in biomedical engineering who is leading the study. “Many minimalist shoe manufacturers make claims that their shoes will lead to strengthening the muscles of the calf and feet while avoiding common running injuries. However, there is little scientific evidence supporting these claims.”

The purpose of the study was to learn exactly which muscles are strengthened and which are weakened; which elongate and which shorten; and if some muscles involved in the act of running don´t change at all.

To conduct their study, Handsfield and his co-investigators, biomedical engineering professor Silvia Blemker and third-year undergraduate biomedical engineering student Natalie Powers, utilized both static and dynamic MRI in conjunction with motion capture cameras and an instrumented treadmill to track the running technique and muscle tissue adaptations of recreational runners transitioning to the minimalist running technique.

“Most studies and discussions have been about running form and the effects on bones and joints, but we´re taking a different approach,” Handsfield said in a statement. “We think it´s relevant to look at the muscles´ adaptations, which also affect the bones and joints in their interactions.”

Handsfield claims this is among the first longitudinal studies of runners during a transition to a new running technique. Additionally, this is the first study to use advanced imaging to study the effects on muscles as a result of different running techniques.

“Dynamic MRI allows us to image the tissue very rapidly so that we can observe displacements of the muscle tissue as our subject performs a controlled cyclic exercise,” Handsfield said. “We’re also using static MRI to determine the subjects´ muscle volumes and lengths before and after their transition to minimalist footwear, allowing us to quantify how their muscles changed with minimalist training.”

It is important to note that the purpose of the study is not for the researchers to claim one style of running is better than another; it is solely to learn the effects of the change on the muscles runners use. They say their eventual results could help individuals make their own decisions about which footwear and running style to adopt.

According to Blemker, “Shoe companies are generally not equipped to undergo fundamental studies aimed at understanding how shoe designs affect muscles.” She continued, “At a university, we are able to focus on this type of research that ultimately both advances our fundamental understanding of muscle adaptation and potentially provides a scientific basis for future shoe designs.”

As of the publishing of this article, the researchers have completed phase one of their study — mapping the muscles of study participants who run in standard footwear. Phase two, soon to begin, will map changes that occur within the muscles as their subject runners begin the transition to the more minimalist footwear. Subject runners for the study range in age from 23 to 30 years and are classified as recreational runners who run between 12 and 30 miles per week.

In addition to receiving UVa “Double ‘Hoo” grant funding, intended to pair graduate and undergraduate students on research projects, the study benefited from a gift from the Merrell shoe company, a manufacturer of minimalist footwear.

This most recent study was able to take advantage of technology originally developed under a project funded by the UVa-Coulter Translational Research Partnership. Previous studies using this new technology have focused on studying muscles in children with cerebral palsy, adults with knee pain, and elite and collegiate athletes.

Tesla Museum To Be Built At Recently Purchased Wardenclyffe Laboratory

April Flowers for redOrbit.com – Your Universe Online

The last remaining laboratory of scientist, visionary and inventor Nikola Tesla was sold this week by the Agfa Corporation to Friends of Science East, Inc., dba Tesla Science Center at Wardenclyffe. The Tesla Science Center at Wardenclyffe is a 501(c)(3) not-for-profit corporation dedicated to saving and restoring Wardenclyffe, with the aim of turning it into a science learning center and museum.

Wardenclyffe is a 15.69 acre site in Shoreham, New York, where Tesla planned to build his wireless communications and energy transmission tower in the early 1900s. The tower was completed, but only one test was made in July 1903. Shortly after, Tesla suffered some financial reversals, and in 1917, the tower was taken down and sold for scrap metal.

Tesla was one of the most influential scientists of the late 19th and early 20th century. His contributions to commercial electricity, radio, magnetism and the invention of the AC (alternating current) motor helped to usher in the Second Industrial Revolution. He also made contributions to the fields of robotics, remote control, radar, computer science, ballistics, nuclear physics and theoretical physics. Nikola Tesla was one of the most famous scientists of his time in the United States, “but because of his eccentric personality and somewhat unbelievable and bizarre claims about scientific and technological developments, Tesla became disliked and was regarded as a mad scientist.”

Tesla is perhaps best known today for the controversy over the invention of the radio. A debate still rages between Tesla supporters and those who favor Guglielmo Marconi over who truly invented the first radio. According to the US Supreme Court in 1947, it was Tesla.

Newsday reports Friends of Science East, Inc. partnered with online cartoonist Matthew Inman of TheOatmeal.com in August 2012 to host an online crowdfunding project on Indiegogo.com. They raised $1.37 million towards the purchase price of the Wardenclyffe site. The campaign reached the $1 million mark in just over a week, with the help of 33,000 contributors from 108 countries.

“This is a major milestone in our almost two-decade effort to save this historically and scientifically significant site. We have been pursuing this dream with confidence that we would eventually succeed,” said Gene Genova, Vice President of the organization, in a recent statement. “We are very excited to be able to finally set foot on the grounds where Tesla walked and worked.”

Friends of Science East, Inc. isn´t done yet, though.

“Now begin the next important steps in raising the money needed to restore the historic laboratory,” said Mary Daum, treasurer. “We estimate that we will need to raise about $10 million to create a science learning center and museum worthy of Tesla and his legacy. We invite everyone who believes in science education and in recognizing Tesla for his many contributions to society to join in helping to make this dream a reality.”

The organization is planning many fundraising events in the future to raise the capital to restore and run the site as a museum. More information on these events can be found on the organization´s website, Facebook page and Twitter feed.

Promise Of Better Workplace Safety Offered By On-site Asbestos Detector

The Optical Society

First portable, real-time detector uses laser-based light scattering technique to identify harmful fibers

Asbestos was once called a miracle material because of its toughness and fire-resistant properties. It was used as insulation, incorporated into cement and even woven into firemen's protective clothing. Over time, however, scientists pinned the cause of cancers such as mesothelioma on asbestos fiber inhalation. Asbestos was banned in many industrialized countries in the 1980s, but the threat lingers on in the ceilings, walls and floors of old buildings and homes. Now a team of researchers from the University of Hertfordshire in the U.K. has developed and tested the first portable, real-time airborne asbestos detector. They hope that the prototype, described in a paper published today in the Optical Society's (OSA) open-access journal Optics Express, will be commercialized in the U.K. in the next few years, providing roofers, plumbers, electricians and other workers in commercial and residential buildings with an affordable way to quickly identify whether they have inadvertently released asbestos fibers into the air.

“Many thousands of people around the world have died from asbestos fiber inhalation,” says Paul Kaye, a member of the team that developed the new detection method at the University of Hertfordshire’s School of Physics, Astronomy and Mathematics. “Even today, long after asbestos use was banned in most Western countries, there are many people who become exposed to asbestos that was used in buildings decades earlier, and these people too are dying from that exposure.”

Currently, the most common way to identify hazardous airborne asbestos at worksites is to filter the air, count the number of fibers that are caught, and later analyze the fibers with X-ray technology to determine if they are asbestos. The approach requires expensive lab work and hours of wait time. An alternative method to evaluate work site safety is to use a real-time fiber detector, but the current, commercially available detectors are unable to distinguish between asbestos and other less dangerous fibers such as mineral wool, gypsum and glass. The University of Hertfordshire team’s new detection method, in contrast, can identify asbestos on-site. It does so by employing a laser-based technique that takes advantage of a unique magnetic property of the mineral.

When exposed to a magnetic field, asbestos fibers orient themselves to align with the field. The property is virtually unique among fibrous materials. “Asbestos has a complex crystalline structure containing several metals including silicon, magnesium and iron. It is thought that it is the iron atoms that give rise to the magnetic properties, but the exact mechanism is still somewhat unclear,” says Kaye. Kaye notes that his team wasn’t the first to try to exploit the magnetic effect to develop an asbestos detector. “Pioneering U.S.-based scientist Pedro Lilienfeld filed a patent on a related approach in 1988, but it seems it was not taken forward, possibly because of technical difficulties,” he says.

The Hertfordshire team’s new detection method, developed under the European Commission FP7 project ‘ALERT’ (FP7-SME-2008-2), works by first shining a laser beam at a stream of airborne particles. When light bounces off the particles, it scatters to form unique, complex patterns. The pattern “is a bit like a thumbprint for the particle,” says Kaye, sometimes making it possible to identify a particle’s shape, size, structure, and orientation by looking at the scattered light. “We can use this technique of light scattering to detect single airborne fibers that are far too small to be seen with the naked eye,” he says. After identifying the fibers, the detector carries them in an airflow through a magnetic field, and uses light scattering again on the other side to tell if the fibers have aligned with the field. “If they have, they are highly likely to be asbestos,” Kaye says.
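
To make the two-stage decision logic concrete, here is a minimal, hypothetical sketch of how such a detector might classify each particle: first decide from the initial scattering pattern whether the particle is fiber-shaped, then check whether it aligned after passing through the magnetic field. The data structure, threshold and field names are illustrative assumptions, not details of the Hertfordshire prototype.

```python
# Hypothetical sketch of the detector's two-stage decision logic described above.
# The threshold and field names are illustrative assumptions, not values from
# the University of Hertfordshire prototype.

from dataclasses import dataclass

@dataclass
class Particle:
    aspect_ratio: float        # estimated from the initial scattering pattern
    aligned_with_field: bool   # inferred from the second scattering measurement

def classify(particle: Particle, fiber_aspect_ratio: float = 3.0) -> str:
    """Classify an airborne particle from the two light-scattering measurements."""
    if particle.aspect_ratio < fiber_aspect_ratio:
        return "not a fiber"                 # compact particle, e.g. dust
    if particle.aligned_with_field:
        return "likely asbestos"             # fiber that aligned in the magnetic field
    return "non-asbestos fiber"              # e.g. mineral wool, gypsum, glass

print(classify(Particle(aspect_ratio=8.0, aligned_with_field=True)))   # likely asbestos
print(classify(Particle(aspect_ratio=8.0, aligned_with_field=False)))  # non-asbestos fiber
```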

The team has tested their detector in the lab and has worked with colleagues in the U.K. and Spain to develop prototypes that are now undergoing field trials at various locations where asbestos removal operations are underway. “Our colleagues estimate that it will take 12 to 18 months to get the first production units for sale, with a target price of perhaps 700-800 U.S. dollars,” Kaye says. As production increases after the initial product launch, Kaye hopes that costs may be cut even further, making the detectors even more affordable for an individual plumber, electrician or building renovator. “These tradespeople are the most frequently affected by asbestos-related diseases and most who get the diseases will die from them,” Kaye says. The team hopes that, over time, the new detector will help to reduce the 100,000 annual death toll that the World Health Organization attributes to occupational exposure to airborne asbestos.

Lethal Lips: Study Highlights Toxic Content Of Lipstick

Brett Smith for redOrbit.com – Your Universe Online

Known for her lethal lips, Batman villainess Poison Ivy might appreciate a new study from researchers at the University of California, Berkeley who found dangerous levels of lead, chromium and other metals in a number of commonly sold lipsticks.

Previous research, including a 2011 FDA study, has found toxic metals in commercial lipsticks, but the UC Berkeley team has specifically studied how long-term exposure to various concentrations of these metals relates to current health guidelines.

“Just finding these metals isn´t the issue; it´s the levels that matter,” said lead author S. Katharine Hammond, a professor of environmental health sciences at UC Berkeley. “Some of the toxic metals are occurring at levels that could possibly have an effect in the long term.”

The researchers say that the detrimental effects of these cosmetics depend on how often and how much of the product is applied. According to the study, which appeared in the journal Environmental Health Perspectives, the average user applies lipstick 2.3 times a day and ingests about 24 milligrams of the product. A heavy user goes through as many as 14 applications per day and ingests an average of 83 milligrams, the study said.
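
To see how the usage figures translate into metal exposure, the short sketch below multiplies a metal concentration in the product by the mass of lipstick ingested per day. The chromium concentration used is a made-up placeholder for illustration; only the 24-milligram and 83-milligram ingestion figures come from the study.

```python
# Illustrative back-of-the-envelope exposure calculation based on the usage
# figures above. The chromium concentration is a made-up placeholder, not a
# value reported by the UC Berkeley study.

def daily_metal_intake_ug(concentration_ppm: float, ingested_mg_per_day: float) -> float:
    """Metal intake in micrograms per day.

    concentration_ppm   -- metal concentration in the lipstick (ppm = micrograms of metal per gram of product)
    ingested_mg_per_day -- lipstick ingested per day, in milligrams
    """
    ingested_g = ingested_mg_per_day / 1000.0
    return concentration_ppm * ingested_g

# Average user (24 mg/day) vs. heavy user (83 mg/day) at a hypothetical 5 ppm chromium:
print(daily_metal_intake_ug(5.0, 24))  # 0.12 micrograms/day
print(daily_metal_intake_ug(5.0, 83))  # 0.415 micrograms/day
```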

Average lipstick users, as determined by this study, already expose themselves to excessive amounts of chromium, which has been linked to stomach cancer. Heavy users of these products may also be overexposed to aluminum, cadmium and manganese, the study warned. Of these metals, manganese has been connected to toxicity in the nervous system.

“Lead is not the metal of most concern,” Hammond told USA Today.

She noted that the heavy metal is found in 24 of the products, but at levels considered to be safe for adults. However, exposing children to any amount of lead is considered unsafe.

“I believe that the FDA should pay attention to this,” said lead author Sa Liu, a UC Berkeley environmental health sciences researcher. “Our study was small, using lip products that had been identified by young Asian women in Oakland, Calif. But, the lipsticks and lip glosses in our study are common brands available in stores everywhere.”

In their conclusion, the authors said that tossing out these products may be premature, but the findings do demonstrate a need for more supervision by health regulators. There are currently no federal standards for metal content in cosmetics. The European Union considers cadmium, chromium and lead to be unacceptable ingredients in cosmetic products.

“Based upon our findings, a larger, more thorough survey of lip products — and cosmetics in general — is warranted,” Liu added.

In response to the study´s findings, Personal Care Products Council spokesperson Linda Loretz said finding trace amounts of metals in cosmetics needs to be put into a larger context.

“Food is a primary source for many of these naturally present metals, and exposure from lip products is minimal in comparison,” Loretz said in a statement.

She added that the trace amounts of chromium or cadmium found in the Berkeley study are less than 1 percent of the exposure people get in a typical diet.

Study Finds Less-Used Regimen For Treating Children In Africa With HIV Is More Effective

Researchers from CHOP, Penn find better outcomes for efavirenz over nevirapine in children over age 3 in low-resource settings
Researchers from The Children’s Hospital of Philadelphia and the Perelman School of Medicine at The University of Pennsylvania, along with colleagues at the Botswana-Baylor Children’s Clinical Centre of Excellence, conducted the first large-scale comparison of first-line treatments for HIV-positive children, finding that initial treatment with efavirenz was more effective than nevirapine in suppressing the virus in children ages 3 to 16. However, the less effective nevirapine is currently used much more often in countries with a high prevalence of HIV. The results of the study of more than 800 children are published today in the Journal of the American Medical Association (JAMA).
There are more than 3 million HIV-positive children in the world, and more than 90 percent of them live in sub-Saharan Africa. Currently, the World Health Organization (WHO) recommends both efavirenz and nevirapine for first-line pediatric use in resource-limited settings such as sub-Saharan Africa. Lead author Elizabeth Lowenthal, MD, MSCE of The Children’s Hospital of Philadelphia, says this study has the potential to change the standard of care in the parts of the world where most HIV-infected children live.
“Because nevirapine costs less than efavirenz and is more widely available in pediatric formulations, it is currently the more frequent choice for initial treatment in these children. However, our study suggests that efavirenz produces better outcomes,” said Dr. Lowenthal.
Senior author Robert Gross, MD, MSCE, an associate professor of Infectious Diseases and Epidemiology at Penn Medicine, adds, “Given this evidence, it is very reasonable to adjust pediatric HIV treatment guidelines. However, as we move towards such changes, more work should be done to make efavirenz a more financially viable option for children on anti-retroviral therapy in these resource-limited settings.”
Previous studies favoring efavirenz over nevirapine in adults have led treatment guidelines in many countries, including some in resource-limited settings, to recommend efavirenz over nevirapine. “In these low-resource settings, Non-Government Organizations typically work with countries’ medical programs to forecast their HIV-related drug needs and lobby companies to lower prices for bulk purchases,” explained Lowenthal. “Through such programs, drugs that were once more expensive can become cost-effective.”
Doctors Lowenthal and Gross applaud the work that the government of Botswana has done to both bring high-quality HIV treatment to its citizens and to facilitate the generation of knowledge to help improve treatment options. “Botswana has been extremely supportive of clinical trials and epidemiological studies, and is very forward thinking in its willingness to inform the world. For such a small country, the amount of research that comes out of Botswana on HIV and tuberculosis is tremendous, which has not only benefited their public health, but public health for all.”

Mozilla Claims Spyware Company Hijacked Firefox

Peter Suciu for redOrbit.com — Your Universe Online

On Wednesday the Mozilla Foundation sent a cease-and-desist letter to British-based Gamma International Ltd., claiming that the latter was passing off its FinFisher spy software as a Firefox product. Mozilla, maker of the open source browser, called Gamma International´s tactic abusive.

“We are sending Gamma, the FinFisher parent company, a cease and desist letter demanding that these practices be stopped immediately,” Mozilla executive Alex Fowler said in an emailed statement, picked up by the Washington Post.

This comes as researchers have reportedly found samples of Gamma´s FinFisher spy software disguised as a Firefox file, apparently as a way to fool computer users into believing the spyware was in fact harmless. Gamma provides its surveillance software to governments and law enforcement. It markets its software as “remote monitoring” programs that government agencies can use to take control of computers to “snoop” on data and communication.

According to Citizen Lab, which released a summary of its latest findings, FinSpy makes use of Mozilla´s trademark and code. Citizen Lab identified FinFisher Command & Control servers in 11 new countries: Hungary, Turkey, Romania, Panama, Lithuania, Macedonia, South Africa, Pakistan, Nigeria, Bulgaria and Austria.

Citizen Lab consists of researchers from the University of Toronto´s Munk School of Global Affairs.

“We identify instances where FinSpy makes use of Mozilla´s Trademark and Code. The latest Malay-language sample masquerades as Mozilla Firefox in both file properties and in manifest. This behavior is similar to samples discussed in some of our previous reports, including a demo copy of the product, and samples targeting Bahraini activists,” wrote researchers Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri and John Scott-Railton.

The FinFisher commercial network intrusion malware was used in the targeting of activists in Bahrain. More ominously the software “appears to be specifically targeting Malay language speakers, masquerading as a document discussing Malaysia´s upcoming 2013 General Elections,” the researchers added.

This still falls short of providing evidence that FinFisher was in fact being used by one government or another, but does show the global reach of such software.

“It really shows the ubiquity of this type of software,” Citizen Lab´s Morgan Marquis-Boire told the Washington Post on Wednesday.

“It´s important to note that the spyware is not connected with any Mozilla product, including Firefox, in how it is installed or operates on a person´s computer or mobile device. Only our brand and trademarks are used by the spyware as a method to avoid detection and deletion,” said Mozilla in a statement, as cited by Wired.

Mozilla launched Firefox as an open source browser in 2002 as an alternative to Microsoft´s then dominant Internet Explorer. The name reportedly came from a nickname for a red panda, but that didn´t inspire the right image for the browser´s logo. The irony in this is that Firefox was also the name of a 1982 spy thriller starring Clint Eastwood, where he steals an advanced Soviet aircraft named “Firefox.”

This is also not the first time Gamma International has found itself in the spotlight. This past March the company was reportedly identified as one of five 'corporate enemies of the Internet' by the journalists' lobbying group Reporters Without Borders. Last month the rights group Privacy International also sued the British government, alleging that Gamma had illegally exported its surveillance technology.

Gamma International denied the accusations.

Economic Factors Have Greatest Impact On Fertility Rates

redOrbit Staff & Wire Reports – Your Universe Online
Economic factors, rather than medical or cultural influences, will have the greatest impact on global population levels over the next decade, according to a recent University of Missouri College of Arts and Science (COAS) study.
According to United Nations figures, the Earth´s population could exceed 8 billion people by the year 2023 if current trends continue.
But improvements in economic development, such as higher educational attainment, increasing employment levels and the shift away from agriculture, have a powerful effect in raising standards of living and correlate with declining fertility rates, said study leader Mary Shenk, assistant professor of anthropology at the University of Missouri (MU).
Understanding these causes of declining birth rates may lead to improved policies designed to reduce competition for food, water, land and wealth, she said.
The study also found that targeted intervention programs had a profound effect on fertility rates, with direct, face-to-face programs achieving the best results.
“For example, although advertising campaigns encouraging lower fertility may reach a wider audience for less money, face-to-face intervention campaigns providing health services or access to contraception provide better results and are thus a better use of resources,” said Shenk in a statement.
The researchers used data collected since 1966 from approximately 250,000 people in rural Bangladesh, along with detailed interviews of nearly 800 women from the region.
Sixty-four factors related to family size were considered and organized according to three possible explanations for declines in fertility rates:

  • Risk and mortality — According to this theory, parents have fewer children when they have more hope that their children will survive into adulthood.
  • Economic and investment — This explanation suggests that rising costs of children and higher payoffs to investing in self and children reduce fertility with the shift to a market economy.
  • Cultural transmission — This concept holds that social perceptions of the value of children, ideal family size and acceptance of contraception influence fertility rates.

Using custom-designed data collection and statistical methods, Shenk and colleagues found that “economic and investment” factors were most clearly correlated with lower fertility.
However, the three explanations were interlaced, she noted.
And while economic factors were significantly more influential, other issues such as mortality rates and health interventions also affected fertility decline in Bangladesh.
“Few studies have compared those three possible explanations for fertility declines to determine which had the strongest effect,” Shenk said. “Population growth rates have fallen globally, starting in 18th century Western Europe, but the exact cause was intensely debated because there are so many different explanations in the literature. Our study created a framework by which different explanations could be explicitly compared.”
Shenk said population data from any region could be analyzed using the methods employed in her study, which could help researchers, policymakers, health workers and others understand the key drivers of demographic change in many areas of the world.

Treating Asthma With Text Messages And Autism With Social Media

redOrbit Staff & Wire Reports – Your Universe Online

Pediatric asthma patients who received a daily text message asking about their symptoms or providing information about the ailment demonstrated improved pulmonary function and a better understanding of their condition within four months, say researchers from the Georgia Institute of Technology in a new study.

“It appears that text messages acted as an implicit reminder for patients to take their medicine and by the end of the study, the kids were more in tune with their illness,” study leader Rosa Arriaga, a senior research scientist at the Atlanta-based university´s School of Interactive Computing, explained in a statement.

She and colleague TJ Yun, a former Georgia Tech Ph.D. student, presented their findings at the Association for Computing Machinery (ACM) Special Interest Group on Computer-Human Interaction (SIGCHI) Conference on Human Factors in Computing Systems 2013 in Paris on Tuesday.

According to the university, their work “won a best paper award in the Replichi category, which highlights best practices in study methodology,” and was also “a replication study of an SMS health intervention for pediatric asthma patients originally published in early 2012 in the Proceedings of the 2nd ACM SIGHIT International Health Informatics Symposium.

“Asthma is the most prevalent chronic respiratory disorder in the U.S., affecting about 17.3 million individuals, including more than 5 million children. Medication is the main way patients manage symptoms, but research shows less than 30 percent of teens use their inhalers regularly,” they added. “Texting, on the other hand, is something teens do regularly and enjoy. Nearly 75 percent of American teens have mobile devices.”

In order to determine whether this technology could help asthmatic youngsters better manage their condition, Arriaga and Yun recruited 30 asthmatic children between the ages of 10 and 17 from a private Atlanta-based pediatric pulmonology clinic.

They each owned a mobile phone and could read at a minimum fifth-grade level. Each participant was randomly assigned to one of three groups: one that did not receive any text messages, one that received an SMS message every other day, and one that received an SMS message daily.
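
For illustration, the sketch below shows one simple way the three-arm assignment described above could be carried out in code; the round-robin scheme, seed and group labels are assumptions for the example, not the researchers' actual randomization procedure.

```python
# Minimal sketch of the three-arm random assignment described above
# (no texts, texts every other day, daily texts). The assignment scheme,
# seed and group labels are assumptions for illustration only.

import random

GROUPS = ["no_sms", "sms_every_other_day", "sms_daily"]

def assign_groups(participant_ids, seed=42):
    """Shuffle participants and deal them round-robin into the three study arms."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    assignment = {group: [] for group in GROUPS}
    for i, pid in enumerate(ids):
        assignment[GROUPS[i % len(GROUPS)]].append(pid)
    return assignment

groups = assign_groups(range(1, 31))   # 30 participants, as in the study
for name, members in groups.items():
    print(name, len(members))          # 10 participants per arm
```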

Over a four-month period, the intervention groups received and responded to text messages 87 percent of the time, with an average response time of 22 minutes. Following the conclusion of the study, the Georgia Tech researchers analyzed those patients who had follow-up visits with their doctors.

They found that sending at least one SMS message per day that contained a question about asthma or specific symptoms improved the patients´ overall health. According to Arriaga, the study findings “indicate that both awareness and knowledge are crucial to individuals engaging in proactive behavior to improve their condition.”

In a separate but related study, Arriaga, Georgia Tech Regents Professor of Interactive Computing Gregory Abowd, and graduate student Hwajung Hong looked into whether or not social media could help individuals with autism improve their connectedness and expand the network of people from whom they can obtain advice.

“The study involved three individuals with Asperger´s Syndrome, a diagnosis that reflects average or above average language skills, but impaired social skills and patterns of behaviors and interests,” the university said. “Individuals with Asperger´s Syndrome can have difficulty using traditional social networking sites such as Facebook because it requires a degree of social nuance. They also may be vulnerable to users trying to take advantage of them.”

In order to overcome those concerns, the researchers established a special social network for individuals using GroupMe, a mobile group messaging app. Each individual was linked with a small, pre-determined number of friends or family members that he or she could contact to discuss day-to-day issues or ask questions. Over four weeks, each subject communicated with their controlled clique, reducing his or her dependence on a primary caregiver.

“The circle actively engaged and shared the responsibility for responding to the participant´s queries,” said the researchers, who also reported their findings at the SIGCHI Conference. “Primary caregivers gave positive reviews of the specialized social network, saying that they were happy with the diversity of feedback that the system provided and that the load felt lighter thanks to the help of the circle members. Results indicate that positive online interactions lead to real-life interactions between the individuals and their circle members.”

Mothers’ Junk Food Habits Could Be Passed Down To Their Children

redOrbit Staff & Wire Reports – Your Universe Online

Mothers who regularly consume high-fat, high-sugar foods while pregnant effectively program their children to be addicted to junk food by the time they are weaned, researchers from the University of Adelaide have discovered.

Laboratory studies led by Dr. Bev Mühlhäusler, a postdoctoral fellow in the university’s FOODplus Research Centre, discovered that a diet high in fatty or sugary foods during pregnancy and lactation desensitized the body´s normal reward system.

The study, which was published in a recent edition of The FASEB Journal, is said to be the first to examine the impact of a mother´s junk food consumption during the early stages of her offspring´s life.

According to Mühlhäusler and her colleagues, opioids, which are produced by the body as a reward in response to such things as fat and sugar, stimulate the production of the hormone dopamine. Dopamine, in turn, helps create a good feeling within a person.

They discovered that mothers who ate junk-food-heavy diets while pregnant produced offspring with a less sensitive opioid signaling pathway. As a result, the child would need to consume more fat and sugar in order to trigger his or her reward system and obtain the same positive feeling — increasing their preference for junk food while also encouraging them to overeat in order to achieve those sensations.

“In the same way that someone addicted to opioid drugs has to consume more of the drug over time to achieve the same ‘high’, continually producing excess opioids by eating too much junk food results in the need to consume more foods full of fat and sugar to get the same pleasurable sensation,” Dr. Mühlhäusler said in a statement. “Mothers eating a lot of junk food while pregnant are setting up their children to be addicted.”

“Although our research shows that many of the long-term health problems associated with maternal junk food diets can be avoided if offspring carefully follow a healthy diet after weaning, they are always going to have a predisposition for overconsumption of junk food and obesity,” she added. “It’s going to make it much more difficult for them to maintain a healthy body weight.”

The researchers say it is essential to understand the impact of a pregnant mother´s diet during the earliest stages of her offspring´s life. They hope to determine whether the problem can be reversed biologically, but thus far, Dr. Mühlhäusler said, their findings suggest that the changes to the opioid receptors are permanent.

“The take-home message for women is that eating large amounts of junk food during pregnancy and while breastfeeding will have long-term consequences for their child’s preference for these foods, which will ultimately have negative effects on their health,” the lead researcher said.

Late Night Snack Cravings Linked To Internal Circadian Rhythm

[ Watch the Video: OHSU´s Dr. Steven Shea Discusses His Research ]

Peter Suciu for redOrbit.com — Your Universe Online

While late night commercials for eateries and snack foods no doubt make some people head to the refrigerator, a new study has found that an internal circadian rhythm could be the cause of increased appetite in the evening.

The study, which was published in the journal Obesity, found that the body´s internal clock — known as the circadian system — could increase hunger and cravings for sweet, starchy and salty foods in the evening. In other words, those ads on TV are only partially to blame.

These cravings may have helped our ancestors store energy in times when food was scarce. In an era when high-calorie food is readily available, however, late night snacking can lead to significant weight gain, since the human body handles nutrients differently throughout the day and eating a lot in the evening is counterproductive.

“Of course, there are many factors that affect weight gain, principally diet and exercise, but the time of eating also has an effect. We found with this study that the internal circadian system also likely plays a role in today´s obesity epidemic because it intensifies hunger at night,” said Steven Shea, Ph.D., senior author on the study. “People who eat a lot in the evening, especially high-calorie foods and beverages, are more likely to be overweight or obese.”

Shea is director for the Center for Research on Occupational and Environmental Toxicology (CROET) at Oregon Health & Science University. His research at CROET focuses on the basic and applied research that helps workers stay healthy. CROET´s mission is to promote health, and prevent disease and disability among working Oregonians and their families.

The research has shown that because the human body handles nutrients differently at various times of day, consuming more calories in the evening is counterproductive; people typically expend less energy after an evening meal than after morning and mid-day meals. Additionally, factors such as artificial light enable people to stay up later than they should, and as a result they don´t get enough sleep.

“If you stay up later, during a time when you´re hungrier for high-calorie foods, you´re more likely to eat during that time,” Shea said. “You then store energy and get less sleep, both of which contribute to weight gain.”

The research, which was conducted by Shea along with two Boston-area researchers, Frank Scheer, Ph.D., and Christopher Morris, Ph.D., of Brigham and Women´s Hospital and Harvard Medical School, examined the appetite and food preferences of 13 healthy non-obese adults throughout a 13-day laboratory stay in very dim light. During the study all behavior was scheduled, including the timing of meals and sleep. The researchers found that the internal circadian system regulated hunger, with participants feeling the least hungry in the morning and the most hungry in the evening. The study further concludes that the internal body clock causes an evening peak in appetite that may prompt people to eat larger, higher-calorie meals before the fasting period necessitated by sleep.

“Our study suggests that because of the internal circadian regulation of appetite, we have a natural tendency to skip breakfast in favor of larger meals in the evening. This pattern of food intake across the day is exactly what Sumo wrestlers do to gain weight,” said Shea. “So, it seems likely that the internal circadian system helps with efficient food storage. While this may have been valuable throughout evolution, nowadays it is likely to contribute to the national epidemic of obesity.”

As a result of the findings, Shea suggests that larger meals should be eaten earlier in the day.

“If weight loss is a goal, it´s probably better to eat your larger, higher-calorie meals earlier in the day,” said Shea. “Knowing how your body operates will help you make better choices. Going to bed earlier, getting enough sleep and choosing lower-calorie foods rather than higher-calorie foods in the evening can all help with weight loss.”

Tablets to Become Irrelevant in Five Years, Says BlackBerry CEO

Enid Burns for redOrbit.com — Your Universe Online
As BlackBerry works to rebuild its market share, the company is placing all of its emphasis on smartphones. The company's CEO recently spoke about a limited lifespan for the tablet platform.
“In five years I don’t think there’ll be a reason to have a tablet anymore,” Chief Executive Officer Thorsten Heins said in an interview with Bloomberg News at the Milken Institute Conference in Los Angeles. “Maybe a big screen in your workspace, but not a tablet as such. Tablets themselves are not a good business model.”
The comment speaks to tablets used in the business market – a clear target for BlackBerry and the company’s new BlackBerry 10 platform. While tablets have some adoption in the workplace, sales have taken off like wildfire in the consumer market, and tablets are on track to displace desktop and laptop sales in the next few years. A recent forecast from Gartner expects the tablet market to more than double by 2017: from 116.1 million units shipped last year, to 197.2 million this year, and a projected 468 million units in 2017.
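
As a quick sanity check on those figures, the snippet below computes the growth multiple from this year's 197.2 million units to the projected 468 million in 2017, and the annual growth rate it implies; this is simple compound-growth arithmetic applied to the numbers quoted above, not Gartner's own model.

```python
# Quick check of the Gartner figures quoted above: is 468 million in 2017 really
# "more than double" the 197.2 million expected this year, and what annual growth
# rate does that imply? (Simple compound-growth arithmetic, not Gartner's model.)

units_2013 = 197.2  # million tablets expected this year
units_2017 = 468.0  # million tablets projected for 2017

growth_multiple = units_2017 / units_2013
implied_cagr = growth_multiple ** (1 / 4) - 1   # four years of compounding

print(round(growth_multiple, 2))   # about 2.37x
print(f"{implied_cagr:.1%}")       # roughly 24% per year
```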
“BlackBerry thinking tablets are going away is not paying any attention to what is actually taking place,” said industry analyst Jeff Kagan. “I agree tablets won’t capture the kind of attention they do today, but they do fill an important niche between laptops and smartphones.”
BlackBerry, which was known as RIM until it adopted the name of its main product brand when it introduced the BlackBerry 10 platform, has not introduced a tablet for BlackBerry 10. The company’s earlier BlackBerry PlayBook tablet was poorly received by critics.
Heins’ remarks about tablets could be taken out of context and misinterpreted. Tablets are increasingly used in the workplace, but the platform is not overtaking PCs and large display monitors for a large segment of the workforce. The statement also fails to address the consumer marketplace, where sales continue to rise. The tablet format could also see adaptation, which could spur growth in both the consumer and business categories.
“I imagine tablets will still be around,” said Roger Kay, principal analyst with Endpoint Technologies Associates. “In fact, I expect to see a proliferation of form factors to serve progressively more finely targeted markets. So, specialization will increase, and tablets may become a larger category that includes a number of sub-categories of hybrids, true slates, and others at a variety of price points.”
Given better context, it is possible to see where Heins draws his views. “But a broader look at Heins’ past remarks shows a vision that actually isn’t so radical,” a PC World article reports. “Essentially, he believes that the smartphone will be the center of your computing universe, and provide the processing muscle and data for a vast array of smart displays.”
In a previous interview with the New York Times, Heins said, “Whenever you enter an office, you don’t have your laptop with you, you have your mobile computer power exactly here,” holding up a BlackBerry 10 phone. “You will not carry a laptop within three to five years.”

Intravenous Therapy

Intravenous therapy, commonly known as IV therapy, is the administration of a liquid substance directly into a vein. It is also known as drip therapy, because most often the liquid is suspended above the IV site by an infusion pump and runs through a drip chamber, which prevents air from entering the line. IVs are the preferred method of drug administration in hospital settings because they are the fastest route for getting medication into the body. Intravenous therapy can also be used for replacing fluids, blood transfusions, and chemotherapy treatments.

Substances that are safe to infuse through an IV line include blood-based products, medications, volume expanders, and nutritional products. A blood-based product is any component of blood collected from a donor for intended use in a transfusion. While some medications can only be administered intravenously, many medications are best taken up by the body when administered within a vein. Volume expanders, often called crystalloids, are substances made up of water-soluble molecules. The most commonly used crystalloid is normal saline, a solution of sodium chloride at 0.9 percent concentration. Lactated Ringer’s solution is another crystalloid used for intravenous fluid administration. Buffer solutions are introduced to the body in order to correct states of acidosis or alkalosis and bring the blood pH back to its normal physiological range. The most common buffer solution used in an IV is intravenous sodium bicarbonate. Parenteral nutrition is a means of feeding a person intravenously, bypassing eating and digestion. Recreational drugs, though very unsafe, are also commonly injected intravenously.

Intravenous therapy is most often initiated by a process known as “starting a line.” In a conventional peripheral IV, a hypodermic needle is inserted into any peripheral vein that is not inside the thoracic or abdominal cavities. With the beveled side up, the needle breaks through the layers of the skin and pierces the vein. When the needle is removed, it leaves behind a flexible plastic catheter. The most common vein used for intravenous therapy is the median cubital vein, primarily because it is the most easily accessed. The size of the catheter is given by its gauge, which identifies the diameter of the cannula. 12-14 gauge catheters are the largest, and they are known as trauma lines because their increased size allows medical personnel to deliver large volumes of fluid in short amounts of time. 16 gauge catheters are used primarily for blood transfusions and donations, while 22 gauge catheters are used for pediatric lines. 18-20 gauge catheters are the most commonly used, as they are the standard size for blood draws and infusions. The part of the IV that is left outside the skin is called the connecting hub. It joins with the infusion lines and can also be connected to a syringe.

A more invasive method of administration for intravenous therapy is through a central line. Most central lines are inserted into a large vein, such as the superior or inferior vena cava. Central lines allow IV medications to be delivered safely by eliminating the risk of the substance damaging a weaker peripheral vein. It also ensures that medications reach the heart immediately, as opposed to having to be pumped through the peripheral vascular system first. Their size allows for more than one medication to be administered at a time. However, central lines have a much higher risk of complication than a peripheral IV. They can be difficult to insert because large veins are not generally palpable, so the surrounding structures have the potential to be damaged. Bleeding, infections, and embolisms are also risks of central lines.

A peripherally inserted central catheter, also known as a PICC line, is the preferred method for prolonged intravenous therapy, or for cases in which the substance being infused would quickly damage a peripheral vein and attempting to start another central line would be too dangerous. The risk of infection from a PICC line is much lower than that of other central lines because of the location of insertion. While it is safer to insert than other central lines, a PICC line is usually inserted with the help of ultrasound guidance.

Another type of central line that is common in intravenous therapy is known as a central venous line. This particular line is started in the subclavian or internal jugular veins, with some started in femoral veins. Central venous lines are favored in emergency medicine because they allow for larger amounts of medication to be administered and diluted quickly, while greatly reducing the risk of damage to the vessel walls, especially if the medication being administered has a concentration that would pose a risk to the vein.

A port is a special type of insertion device that does not have an exterior connecting piece. Instead, it is surgically inserted beneath the skin and is readily accessed with a small needle. Ports are typically used in patients needing long-term treatment therapies, such as chemotherapy, because they have a lower risk of infection and cause much less inconvenience.

Continuous intravenous therapy requires the use of an infusion pump, which regulates the amount and rate of fluid or medication being administered to the patient. The standard infusion set comes with a bag of fluids and a drip chamber, which allows one drop at a time to pass through. A long sterile tube is attached, and some IV infusion sets have Y-tubes, which allow medications to be “piggy-backed.” Clamps are attached to adjust or stop the flow at any time; if the clamps of a line are completely open, the infusion is referred to as a gravity drip. Rapid infusers allow intravenous lines to work at high flow rates.
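
As a rough illustration of how a gravity drip is dosed when no pump regulates the flow, the sketch below applies the standard drip-rate formula (drops per minute = volume in mL × drop factor in drops/mL ÷ time in minutes). The volumes, times and drop factors shown are illustrative examples only, not clinical guidance.

```python
# Standard gravity-drip calculation often used when no infusion pump is available:
# drip rate (drops/min) = volume (mL) x drop factor (drops/mL) / time (min).
# The example values below are illustrative, not a clinical recommendation.

def drip_rate(volume_ml: float, time_min: float, drop_factor: float = 20.0) -> float:
    """Return the drip rate in drops per minute for a gravity infusion.

    drop_factor is set by the tubing: macrodrip sets are commonly 10, 15 or 20
    drops/mL, microdrip sets 60 drops/mL.
    """
    return volume_ml * drop_factor / time_min

# 1000 mL of normal saline over 8 hours with a 20 drops/mL macrodrip set:
print(round(drip_rate(1000, 8 * 60, 20)))  # about 42 drops per minute
```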

Infection is always a risk with intravenous therapy because it creates a new opening for bacteria and other foreign pathogens to enter the body, especially the bloodstream. Peripheral edema is another complication that occurs when a vein is blown and fluid is still administered, because the fluid will then travel outside the vein and into the surrounding tissues. The most serious complication associated with IVs is an embolism, which is when either a blood clot or a small mass of air travels through the line and reaches a vein. If overlooked, embolisms can be fatal.

Image Caption: Neutron(TM) Catheter Patency Device. Credit: Calleamanecer/Wikipedia (CC BY-SA 3.0)

Rising From The Ashes: How Plants Benefit From Forest Fires

Brett Smith for redOrbit.com – Your Universe Online
Forest fires are a major cause of plant death and destruction, but they can also be a source of life as some dormant seeds begin to germinate in the aftermath of a raging inferno.
Previous research has shown chemicals in the smoke of burning trees called karrikins are responsible for this phoenix-like rebirth. Now, a new study in the Proceedings of the National Academy of Sciences has described new details on the mechanism behind karrikin-mediated seed activation.
“This is a very important and fundamental process of ecosystem renewal around the planet that we really didn’t understand,” said co-author Joseph P. Noel, professor and director of biology at the Salk Institute in San Diego. “Now we know the molecular triggers for how it occurs.”
In the study, scientists first investigated the structure of a plant protein called KAI2 present in dormant seeds that binds to karrikins. By comparing the protein in both its bound and unbound forms, the team was able to determine how KAI2 enables a seed to recognize the presence of karrikin in its environment. According to the report, “crystallographic analyses and ligand-binding experiments” allowed the team to identify the molecular bonds between karrikins and KAI2.
“But, more than that, we also now know that when karrikin binds to the KAI2 protein it causes a change in its shape,” said co-author Yongxia Guo, a structural enzymologist and researcher at the Salk Institute.
Salk research associate and plant geneticist Zuyu Zheng, added that KAI2 shape change may cause a chain reaction of signals to other proteins in the seeds.
“These other protein players, together with karrikin and KAI2, generate the signal causing seed germination at the right place and time after a wildfire,” Zheng said.
The researchers conducted their experiments on Arabidopsis, a popular laboratory plant among researchers. However, the same karrikin-KAI2 chemical signaling is most likely found in many plant species, the scientists said.
“In plants, one member of this family of enzymes has been recruited somehow through natural selection to bind to this molecule in smoke and ash and generate this signal,” Noel explained. “KAI2 likely evolved when plant ecosystems started to flourish on the terrestrial earth and fire became a very important part of ecosystems to free up nutrients locked up in dying and dead plants.”
The researchers concluded their report by saying more work needs to be done in order to understand the complete mechanism from start to finish, but the existing results could still be taken into consideration by other scientists and policy makers.
Forest conservation strategies have changed considerably over the past 50 years. The US Park Service used to actively suppress forest fires until researchers realized mature forests are somewhat dependent on periodic fires that release important minerals and chemicals.
“When Yellowstone National Park was allowed to burn in 1988, many people felt that it would never be restored to its former beauty,” said co-author James J. La Clair, a researcher at the University of California. “But by the following spring, when the rains arrived, there was a burst of flowering plants amid the nutrient-rich ash and charred ground.”

VEGF May Not Be Relevant Biomarker For Advanced Prostate Cancer

‘This study confirms that VEGF is not a path forward to tackling this disease’

The well-studied protein VEGF does not appear to have any prognostic or predictive value for men with locally advanced prostate cancer, researchers from the Department of Radiation Oncology at Thomas Jefferson University Hospital and other institutions found in a retrospective study published online April 25 in the journal BMC Radiation Oncology.

VEGF, or vascular endothelial growth factor, induces blood vessel growth, a process known as angiogenesis, which is a key element in solid tumor growth and metastasis. It is overexpressed, along with its receptors, in various cancers, including breast, renal cell carcinoma and gliomas, and has been shown to help predict response to certain drugs.

However, conflicting data in the literature have left the usefulness of VEGF as a biomarker in prostate cancer unclear and controversial.

Here, in one of the largest studies of VEGF expression in prostate cancer, senior author Adam P. Dicker, MD, PhD, Chair of the Department of Radiation Oncology at Jefferson, and colleagues retrospectively analyzed data from two groups of men with locally advanced prostate cancer: those who had only radiation therapy and those who had short-term neoadjuvant and concurrent androgen deprivation therapy and radiation therapy.

Data was collected using pathologic material from over 100 men in the Radiation Therapy Oncology Group 8610 phase III randomized controlled trial to explore VEGF’s potential as a biomarker, one that could be used to improve the treatment of prostate cancer patients through better targeted therapies.

Based on the results, however, researchers posit that the VEGF protein may not be a relevant biomarker for this patient group. They found no statistically significant difference in pre-treatment characteristics among men with varying VEGF levels and no correlation between VEGF expression and overall survival, distant metastasis, local progression, disease-free survival, or biochemical failure.

What’s more, there was no difference between the two treatment arms: those who had androgen deprivation therapy plus radiation therapy and those who had radiation alone. The median follow-up time for all surviving patients was 12.2 years.

“VEGF in this disease does not have a driver role,” said Dr. Dicker. “The clinical trials using VEGF inhibitors did not have clinical benefit, so this study confirms that this is not a path forward to tackling this disease.”

The results are not definitive statements about VEGF, the authors explain, but reporting on this well-characterized population with long-term follow-up is a significant contribution to the literature.

“This study is among the larger studies of VEGF expression in prostate cancer, and we urge the research community to avoid the misrepresentation of the literature with a lack of publication of even well-designed large negative studies, a publication bias against negative trials, as the current literature in this area appears to be predominated by only small exploratory positive trials, with a lack of subsequent confirmation with larger, longer prospectively designed trials,” the authors write.

Other institutions included Prince Edward Island Cancer Treatment Centre, University of Pennsylvania, Abington Hospital, University of California, San Francisco, Melre M. Mahr Cancer Center, University of Miami, and the Intermountain Medical Center.

Researchers Work To Find Out How Gravity Affects Antimatter

John P. Millis, Ph.D. for redOrbit.com — Your Universe Online

All fundamental matter in the Universe — protons, electrons, etc. — has counterparts known as antimatter. In most ways, antimatter simply mirrors regular matter. For instance, the antimatter counterpart to the electron is the positron, a particle of the same mass of the electron, but possessing the opposite charge and opposite spin.

When normal matter encounters its antimatter counterpart, the pair will annihilate and convert its mass into energy. This is not as common an occurrence as one might first expect, however. The reason is that the Universe is dominated by what we perceive to be normal matter. Antimatter, conversely, is relatively scarce, arising only during very specific processes, such as interactions of very high-energy photons in our atmosphere.

The small number of samples, as well as the difficulty of containing antimatter (it cannot be allowed to touch normal matter, for risk of annihilation), means there are still some things we don´t know about these particles. But a new study out of Lawrence Berkeley National Laboratory, in concert with colleagues at CERN´s ALPHA experiment, has revealed new insights into one of the long-standing questions about antimatter.

It is well known that normal matter falls in a gravitational field, such as that created by Earth. But some researchers had speculated antimatter could interact differently with gravitational fields and, as a consequence, fall up instead of down. After all, since antimatter particles are characterized by quantum properties opposite to those of their normal matter counterparts, why not opposite gravitational interactions as well?

This is an important issue as Joel Fajans of Lawrence Berkeley National Laboratory explains, “in the unlikely event that antimatter falls upwards, we´d have to fundamentally revise our view of physics and rethink how the universe works.”

Using data from CERN´s ALPHA experiment, which studied 434 antihydrogen atoms, the team realized they would be able to detect anomalies in the gravitational interactions if they were strong enough.

Antihydrogen atoms are created in a chamber at CERN and held in a strong magnetic trap. The trap is then turned off, and the antiatom moves toward a wall of normal matter and annihilates. The team could then, knowing the original position and speed, calculate the influence of gravity on its motion.
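
To illustrate the principle, here is a toy free-fall calculation in the spirit of the paragraph above: given a release speed and a distance to the trap wall, it estimates how far gravity (or a hypothetical sign-flipped "antigravity") would deflect the antiatom before annihilation. The speed, wall distance and simplifications are assumptions for illustration, not the ALPHA experiment's actual parameters or analysis.

```python
# Toy illustration of the principle described above: given a release speed, compute
# how far gravity deflects the anti-atom before it reaches the trap wall. The wall
# distance and speed are made-up numbers, not ALPHA parameters.

G = 9.81  # m/s^2

def vertical_deflection(horizontal_speed_m_s: float, wall_distance_m: float,
                        gravity_sign: int = +1) -> float:
    """Deflection (in meters) accumulated during the flight to the wall.

    gravity_sign = +1 for ordinary gravity (falls down), -1 for the hypothetical
    'antigravity' case (falls up).
    """
    time_to_wall = wall_distance_m / horizontal_speed_m_s
    return -gravity_sign * 0.5 * G * time_to_wall ** 2

# An atom drifting at 50 m/s toward a wall 0.02 m away:
print(vertical_deflection(50.0, 0.02, +1))  # about -7.8e-7 m (downward)
print(vertical_deflection(50.0, 0.02, -1))  # about +7.8e-7 m (upward)
```

The tiny displacements in this toy example hint at why the team could not yet resolve the sign of the effect with the precision available.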

The experiment proved such measurements are possible, and future work could reveal the nature of gravitational interactions with antimatter. However, the results from this first round of tests were inconclusive, as the team could not pinpoint the initial parameters of the atoms with enough accuracy to see an effect. So if any antigravity effect is present, it is relatively small.

“Is there such a thing as antigravity? Based on free-fall tests so far, we can´t say yes or no,” reports Fajans. “This is the first word, however, not the last.”

The ALPHA experiment is currently being prepared for an upgrade, dubbed ALPHA-2, that would allow a significant improvement in sensitivity over the current system. Team members therefore hope that within the next five years they can come closer to answering this question, which has long been on the minds of theoretical physicists.

The researchers have published their findings in the April 30, 2013 edition of Nature Communications.

Examples Of Genetic Engineering: Bizarre Yet Beneficial Uses Of Modern Biotech

Rayshell Clapper for redOrbit.com – Your Universe Online

After learning about human genetic engineering, many readers might want to find out about some examples of genetic engineering. Both bizarre and beneficial, the following article highlights some truly fascinating and pragmatic examples of modern genetic engineering.

The Biotechnology Forums, a website for professionals and students in biotechnology (the field that studies genetic engineering), recently explained some of these examples. The first animal example of genetic engineering is the spider goat. Yes, you read that correctly. A spider goat is able to produce the strong, stretchable silk used by spiders to create their webs. Spider silk is one of the strongest natural materials known to man, stronger even than steel.

Nexia Biotechnologies inserted the gene from a golden orb-weaver spider into the genome of a goat in such a way that the goat secretes the protein of the spider web in its milk. The milk was then used to create what Nexia called (and trademarked) BioSteel, a material with characteristics similar to spider webs.

Beyond goats capable of secreting spider webs in their milk, there are a number of other really cool examples of genetic engineering in animals. In one redOrbit blog, this author reported on a cat that glows in the dark. The glow-in-the-dark feline has a fluorescence gene that makes it glow under an ultraviolet light. As the Biotechnology Forum outlines, here is how South Korean scientists first created the glowing cat in 2007:

“They took skin cells from Turkish Angora female cat (species that were originally tamed by Tatars, but was later transferred to Turkey and is now considered the country’s national treasure), and using the virus they inserted the genetic code for the production of red fluorescent protein. Then they put genetically modified nuclei into eggs for cloning and such cloned embryos are returned to the donor cat. It thus became the surrogate mother’s own clones.”

And why make a cat that glows in the dark? The researchers explained that this was no frivolous experiment and that potential benefits exist in medicine for treating and testing for human diseases caused by genetic disorders. And just today, researchers in Uruguay announced that they had successfully created a genetically modified glowing sheep. Though not directly applicable to medical technology, the researchers had this to say about the purpose of their research: “Our focus is generating knowledge, make it public so the scientific community can be informed and help in the long run march to generate tools so humans can live better, but we´re not out in the market to sell technology.”

Moving on, two other good examples are the less-flatulent cow and the so-called Ecopig. As Mother Nature Network explains, cows produce a lot of methane gas, which is second only to carbon dioxide in contributing to the greenhouse effect. So scientists at the University of Alberta identified the bacteria responsible for producing methane and designed a breed of cows that creates 25 percent less methane than the average cow. This is one genetic engineering example that directly and practically addresses one of the major problems facing modern man.

The Ecopig (aka “enviropig” or “Frankenswine”) is yet another of the many examples of genetic engineering that positively contribute to the environment. The Ecopig has been genetically altered to better digest and process phosphorus. Pig dung is high in phytate, a form of phosphorus that farmers use as fertilizer but which overstimulates the growth of algae, depleting oxygen in watersheds and killing aquatic life. The Ecopig has been genetically modified by adding E. coli and mouse DNA to the pig embryo, which reduces the pig´s phosphorus output by about 70 percent.

Each of these bizarre examples points to some of the pros of genetic engineering, highlighting how researchers are striving to bring modern science and technology to the aid of humanity and some of its most pressing problems. Whether it is the goat that produces spider silk or the cow that doesn´t produce as much flatulence, these animal examples of genetic engineering show biotechnology in action.

Fear Of Missing Out More Prevalent Due To Internet, Social Media

Alan McStravick for redOrbit.com – Your Universe Online

If you´ve ever been faced with having to decide between two options, chances are you have experienced what is simply and psychologically referred to as the “Fear of Missing Out” (FoMO). This phenomenon, according to researchers from the University of Essex, has become more prevalent with the ever-present nature of the Internet in our day-to-day lives. And for the first time, the ability to accurately measure FoMO has been devised.

FoMO is generally understood as a concern people have that others may be having more fun and rewarding experiences than they are. With the rise of websites like Facebook and Twitter and their easy accessibility on our tablets and phones, individuals are able to keep current on their friends´ movements like never before. This ability has led to the hidden curse of FoMO.

The Essex research team, with colleagues from the University of California and the University of Rochester, will be publishing their findings in the July issue of the journal Computers in Human Behavior.

The researchers claim theirs is the first study to examine this phenomenon in depth. Because FoMO has only come to light in the last three years, alongside the increasingly pervasive role of the Internet in our daily lives, the team focused their research on a subject group of individuals under age 30.

Lead researcher and psychologist Dr. Andy Przybylski claims one of the main issues regarding individuals with a high level of FoMO is that they may choose to become so involved in monitoring their social network friends´ activities that they will often ignore what they are actually enjoying themselves.

“I find Facebook rewarding to use, but how we are using social media is changing,” claimed Przybylski in a statement. “It is no longer something we have to sit at a computer and log into as we have access all the time on our phones. It is easier to get into the rhythm of other people´s lives than ever before as we get alerts and texts.”

“We have to learn new skills to control our usage and enjoy social media in moderation. Until we do, it creates a double-edged sword aspect to social media,” he said.

The international collaborative research team was able to devise a way of measuring an individual´s level of FoMO. They have even put a version of their test online that compares your own level of FoMO to those involved in the study. To test your own level, go to ratemyfomo.com.

Study subjects who were more affected than others typically saw social media as an important tool for their social development, which made them more dependent on it.

Przybylski pointed out that social factors are important indicators. If an individual´s “psychological needs were deprived,” they were more likely to seek out social media. Additionally, in those individuals, FoMO bridged this deprivation gap. This offered a viable explanation for why certain people use social media more than others.

The researchers point out their findings show people who exhibit a high level of FoMO are typically more likely to give in to the temptation of both checking and composing text messages and e-mails while driving. Additionally, they will allow social media to distract them from important daily tasks like university lectures. Overall, these individuals present more mixed feelings with regard to their social media use.

As this is just the first such study, the research team hopes further research will examine the phenomenon of the fear of missing out and how it affects an individual´s general well-being.

Upper Arm Lift Procedures Increased 4,000 Percent In Past Decade

Lawrence LeBlond for redOrbit.com – Your Universe Online

Plastic surgery, which has grown in popularity over the past two to three decades, is performed for various reasons. Tumor removal, scar repair and breast reduction are among the most common reasons for men and women to get reconstructive surgery, according to previous data from the American Society of Plastic Surgeons (ASPS).

But new data from the group has shown another form of the surgery has exploded over the last decade. Upper arm lifts have surged more than 4,000 percent in the past 12 years, rising from about 300 procedures in 2000 to more than 15,000 last year. The ASPS data shows that the rise in popularity has been fueled partly by the sleeveless fashion trend as well as a bigger focus on strong-armed celebrities.

The majority of arm lift surgeries over the period were for women (98 percent), mostly those over 40 years old: 43 percent of patients were aged 40 to 54 and 33 percent were 55 and older. Overall, 15,136 women got arm lifts in 2011, a jump of 4,378 percent from 2000. A total of $61 million was spent on arm lift procedures alone in 2011.
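For readers checking the arithmetic, the sketch below applies the standard percent-change calculation. The 2000 baseline is quoted above only as a rounded 300 procedures, so the exact 4,378 percent figure implies a slightly higher unrounded baseline; the numbers in the comments are derived from the figures reported here, not from the underlying ASPS tables.

# A quick percent-change check on the arm lift figures quoted above.
# The 2000 baseline is given only as a rounded "300 procedures", so the
# result here is approximate rather than the ASPS's exact figure.
procedures_2000 = 300       # rounded baseline from the article
procedures_2011 = 15_136    # reported 2011 total for women

pct_increase = (procedures_2011 - procedures_2000) / procedures_2000 * 100
print(f"Approximate increase since 2000: {pct_increase:,.0f}%")  # about 4,945%
# The article's more precise 4,378 percent figure implies an unrounded 2000
# baseline of roughly 15,136 / (1 + 43.78), or about 338 procedures.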

Most upper arm lift procedures involve liposuction or brachioplasty, in which loose skin is removed from the back of the arms.

“Women are paying more attention to their arms in general and are becoming more aware of options to treat this area,” said ASPS President Gregory Evans, MD. “For some women, the arms have always been a troublesome area and, along with proper diet and exercise, liposuction can help refine them. Others may opt for a brachioplasty when there is a fair amount of loose skin present with minimal elasticity.”

While doctors say there is no single reason behind the increase in upper arm lifts, celebrity influence could be a major factor. A recent poll conducted on behalf of the ASPS found that women are paying more attention now to the arms of female celebrities.

According to the poll, the most admired celebrity arms are those of first lady Michelle Obama, actresses Jennifer Aniston, Jessica Biel, Demi Moore and daytime talk show host Kelly Ripa.

“I think we are always affected by the people that we see consistently, either on the big screen or on TV,” said ASPS Public Education Committee Chair David Reath, MD, based in Knoxville, Tennessee. “We see them and think, ‘yeah, I’d like to look like that’.”

Allen Rosen, MD, an ASPS member from Montclair, New Jersey, said in a blog post that “new technologies and techniques have made it more reasonable to treat heavy upper arms, which have driven many patients crazy for years. It is no surprise that procedures to reshape the upper arms have increased over the past 10 years since the science and technology have finally come together to meet this demand.”

Celebrity influence undoubtedly plays a role in women aspiring to look and feel better about themselves, but Dr. Rosen maintains that many women get arm lifts because diet and exercise often do not translate into tighter arms. Some people retain areas of excess loose skin and fat even after years of diet and exercise.

In the past, the prospect of unsightly arm scars kept many women from having upper arm lifts. But patients “who undergo massive weight loss have hanging 'bat wings' and gladly trade this off for an inner arm scar,” said Dr. Rosen in the blog.

And while the procedure is mostly performed on older patients, Dr. Rosen noted that he has seen many patients, from “teens to the elderly,” who have lost massive amounts of weight through diet and exercise and want to see their flabby arms go.

With today's scar management technology, which includes barbed sutures, fractional laser treatment and silicone-gel sheeting, more and more people are making the move and getting better results with brachioplasty than ever before. However, some scarring may still remain.

Brachioplasty requires an incision from the elbow to the armpit and is generally performed on the back of the arm, which can leave a visible, permanent scar. For many, a scar is much easier to deal with than excess flabby skin, but Dr. Reath cautions patients to weigh the pros and cons before getting any arm lift.

“It’s a tradeoff. We get rid of the skin, but we leave a scar,” he said in a statement. “So, as long as there’s enough improvement to be made in the shape of the arm to justify the scar, then it’s a great procedure.”

While Dr. Reath maintains the importance of a good diet and lots of exercise as part of a healthy lifestyle, he acknowledges that some women simply cannot achieve that look on their own. For those who want to tighten their upper arms, but do not have excess skin, liposuction is often a better choice than brachioplasty.

Dr. Rosen said that for some patients who have excess arm skin, brachioplasty may not be an option, or they may have been turned down for the surgery. Newer techniques in liposuction are also available which can achieve similar results to brachioplasty. Improved liposuction includes “fat freezing” and “thermal melting,” which reduce fat and give better upper arm tone.

He maintains, however, that the best option for skin tightening is brachioplasty. And with techniques getting “better by the minute,” he noted that more patients can be offered the procedure with less down time and minimal to no scarring.

“The ASPS annual statistics showing the exponential growth of upper arm lifts definitely reflect the result of this 'perfect storm' of new technologies/techniques and patient demand,” concluded Dr. Rosen.

Glow In The Dark Sheep Created By Uruguay Scientists

redOrbit Staff & Wire Reports – Your Universe Online

A team of scientists from Uruguay has genetically modified a flock of nine young sheep, causing the lambs to glow in the dark whenever they are exposed to ultraviolet light.

According to Slashgear's Brian Sin, the scientists altered the creatures using the fluorescent protein from an Aequorea jellyfish. The sheep, which were born last October at the Animal Reproduction Institute of Uruguay, give off a glowing green color when exposed to some types of UV rays but are said to be developing normally.

“We did not use a protein of medical interest or to help with a particular medicine because we wanted to fine-tune the technique,” lead researcher Alejo Menchaca said, according to James A. Foley of Nature World News. “We used the green protein because the color is easily identifiable in the sheep’s tissues.”

Menchaca added that the lambs have been spending as much time out in the field as their non-genetically modified counterparts, but in “better conditions, not the traditional breeding system.” He also was quoted by Foley as saying that the creatures were being “well looked after, well fed and very much loved.”

Menchaca, who worked on the project alongside Martina Crisp of the Pasteur Institute, told Merco Press, “The technique is complex and demands much work and is one of the limiting factors, so despite the global interest and demand it is still a slow process. Our focus is generating knowledge, make it public so the scientific community can be informed and help in the long run march to generate tools so humans can live better, but we're not out in the market to sell technology.”

Menchaca explained that scientists can select a specific gene with biological or pharmaceutical value to humans, and then add it to the embryo of a cow, lamb, goat or similar creature so that the gene is incorporated into that animal's DNA. Once it grows and matures, the creature can produce milk that contains that substance of interest. That milk then undergoes a complex procedure which makes it available for consumption so that people can benefit from it.

“While these sheep may be the first glow-in-the-dark sheep to exist, they're not the first living creatures that scientists have genetically modified,” Sin said. “Scientists have also genetically modified zebrafish using the same green fluorescent protein from the Aequorea jellyfish to make them glow-in-the-dark. These zebrafish were then renamed 'GloFish' and have since been genetically modified using various other [fluorescent color] proteins.”

Daspletosaurus

Daspletosaurus, meaning “frightful lizard,” is a genus of tyrannosaurid theropod dinosaur that lived in western North America between 77 and 74 million years ago, during the Late Cretaceous Period. Fossils of the only named species were found in Alberta, although other possible species from Alberta and Montana await description.

Daspletosaurus is closely related to the much larger and geologically younger Tyrannosaurus. Like most known tyrannosaurids, it was a multi-ton bipedal predator with dozens of sizable, sharp teeth. Daspletosaurus had the small forelimbs typical of tyrannosaurids, although they were proportionately longer than in other genera.

As an apex predator, it was at the top of the food chain, most likely preying upon large dinosaurs such as the ceratopsid Centrosaurus and the hadrosaur Hypacrosaurus. In some areas, it coexisted with another tyrannosaurid, Gorgosaurus, although there is some evidence of niche differentiation between the two. While Daspletosaurus fossils are rarer than those of other tyrannosaurids, the available specimens allow some analysis of the biology of these animals, including their social behavior, diet and life history.

Although very large by the standards of modern predators, Daspletosaurus was not the largest tyrannosaurid. Adults could reach a length of 26 to 30 feet from snout to tail. Mass estimates average around 2.75 short tons but range between 2.0 and 4.1 short tons.

Daspletosaurus had a gigantic skull that could reach more than 3.3 feet in length. The bones were heavily constructed, and some, including the nasal bones, were fused for strength. Large openings in the skull reduced its weight. An adult was armed with about six dozen teeth that were very long but oval in cross section rather than blade-like. Unlike its other teeth, those in the premaxilla at the end of the upper jaw had a D-shaped cross section, an example of heterodonty that is always seen in tyrannosaurids. Distinctive skull features included the rough outer surface of the maxilla, the upper jaw bone, and the pronounced crests around the eyes on the lacrimal, postorbital and jugal bones. The orbit, or eye socket, was a tall oval, somewhere between the circular shape seen in Gorgosaurus and the 'keyhole' shape seen in Tyrannosaurus.

It shared the same body form as other tyrannosaurids, with a short, S-shaped neck supporting the enormous skull. It walked on two thick hindlimbs that ended in four-toed feet, although the hallux, or first digit, did not contact the ground. In contrast, the forelimbs were very small and bore only two digits, although Daspletosaurus had the longest forelimbs in proportion to body size of any tyrannosaurid. A long, heavy tail served as a counterweight to the head and torso, placing the center of gravity over the hips.

It belongs to the subfamily Tyrannosaurinae within the family Tyrannosauridae, along with Tarbosaurus, Alioramus and Tyrannosaurus. The animals in this subfamily are more closely related to Tyrannosaurus than to Albertosaurus and are known, with the exception of Alioramus, for their strong build with proportionally larger skulls and longer femora than in the other subfamily, Albertosaurinae.

Daspletosaurus is commonly considered to be closely related to Tyrannosaurus rex, or even a direct ancestor through anagenesis. Gregory Paul reassigned D. torosus to the genus Tyrannosaurus, making the new combination Tyrannosaurus torosus, but this has not been formally accepted.

The type specimen of Daspletosaurus torosus (CMN 8506) is an incomplete skeleton including the skull, shoulder, a forelimb, pelvis, a femur and all of the vertebrae from the neck, torso and hip, as well as the first eleven tail vertebrae. It was discovered in 1921 near Steveville, Alberta by Charles Mortram Sternberg, who thought it was a new species of Gorgosaurus. It was not until 1970 that the specimen was fully described by Dale Russell, who made it the type of a new genus, Daspletosaurus. The type species is Daspletosaurus torosus, the specific name torosus being Latin for “muscular” or “strong”. Aside from the type, there is only one other well-documented specimen, RTMP 2001.36.1, a relatively complete skeleton discovered in 2001. Both specimens were recovered from the Oldman Formation of the Judith River Group in Alberta. The Oldman Formation was deposited during the middle of the Campanian stage of the Late Cretaceous, from about 77 to 76 million years ago. A specimen from the younger Horseshoe Canyon Formation in Alberta has since been reassigned to Albertosaurus sarcophagus.

Along with the holotype, Russell designated a specimen collected by Barnum Brown in 1913 as the paratype of D. torosus. This specimen (AMNH 5438) consists of parts of the hindleg, pelvis and some associated vertebrae. It was discovered in the upper section of the Oldman Formation of Alberta, a section that has since been renamed the Dinosaur Park Formation and dates to the middle Campanian. In 1914, Brown collected a nearly complete skeleton and skull; forty years later, the American Museum of Natural History sold this specimen to the Field Museum of Natural History in Chicago. It was mounted for display and labeled as Albertosaurus libratus for many years, but after several skull features, including most of the teeth, were found to be modeled in plaster, the specimen (FMNH PR308) was reassigned to Daspletosaurus. In total, eight specimens have since been collected from the Dinosaur Park Formation, most of them from within the boundaries of Dinosaur Provincial Park. Phil Currie believes that the Dinosaur Park specimens represent a new species of Daspletosaurus, distinguished by certain features of the skull. Pictures of this new species have been published, but it still awaits a name and a full description in print.

A new tyrannosaurid specimen (OMNH 10131), consisting of skull fragments, parts of the hindlimb and ribs, was documented from New Mexico in 1990 and assigned to the now-defunct genus Aublysodon. Many authors have since reassigned this specimen, along with a few others from New Mexico, to yet another unnamed species of Daspletosaurus.

A young specimen of the Dinosaur Park Daspletosaurus species (TMP 94.143.1) shows bite marks on its face that were inflicted by another tyrannosaur. The bite marks are healed over, showing that the animal survived the bite. A fully grown Dinosaur Park Daspletosaurus also shows tyrannosaur bite marks, showing that attacks to the face were not limited to younger animals. While it is possible that the bites were inflicted by other species, aggression within a species, including facial biting, is common among predators.

Some evidence that Daspletosaurus lived in social groups comes from a bonebed found in the Two Medicine Formation of Montana. It includes the remains of three Daspletosaurus: a large adult, a small juvenile, and an individual of intermediate size. At least five hadrosaurs are preserved at the same location. Geologic evidence shows that the remains were not brought together by river currents but that all of the animals were buried simultaneously in the same location. The hadrosaur remains are scattered and bear numerous marks from tyrannosaur teeth, showing that the Daspletosaurus were feeding on the hadrosaurs at the time of death, although the cause of death is not known. Currie speculates that the daspletosaurs formed a pack, although this cannot be stated with certainty. Other scientists are skeptical of the evidence for social groups; Brian Roach and Daniel Brinkman have proposed that Daspletosaurus social interaction would have more closely resembled that of the modern Komodo dragon, in which non-cooperative individuals mob carcasses, often attacking and even cannibalizing each other in the process.

Paleontologist Gregory Erickson and colleagues have studied the growth and life history of tyrannosaurids. Analysis of bone histology can determine the age of a specimen at death, and growth rates can be examined by plotting the ages of various individuals against their size. Erickson has shown that, after a long period as juveniles, tyrannosaurs underwent a rapid growth spurt lasting about four years midway through their lives. This rapid growth phase ended with sexual maturity, after which growth slowed considerably in adult animals.
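As a rough sketch of the age-versus-size analysis described above, the snippet below fits a logistic (S-shaped) growth curve to hypothetical age and mass estimates; the data points, the specific logistic form and the fitted values are illustrative assumptions, not Erickson's published data or equations.

# Minimal sketch of the growth-curve approach described above: fit body size
# against age at death estimated from bone histology. The data and the
# logistic model below are illustrative assumptions, not published values.
import numpy as np
from scipy.optimize import curve_fit

def logistic(age, max_mass, growth_rate, inflection_age):
    # Sigmoidal growth: slow juvenile growth, a rapid mid-life spurt, then a plateau.
    return max_mass / (1.0 + np.exp(-growth_rate * (age - inflection_age)))

# Hypothetical specimens: estimated age at death (years) and body mass (kg).
ages = np.array([2, 5, 9, 12, 14, 16, 18, 21, 24], dtype=float)
masses = np.array([50, 200, 600, 1400, 2000, 2300, 2450, 2500, 2550], dtype=float)

params, _ = curve_fit(logistic, ages, masses, p0=[2500.0, 0.5, 13.0])
max_mass, growth_rate, inflection_age = params

print(f"Fitted asymptotic adult mass: ~{max_mass:.0f} kg")
print(f"Fastest growth around age {inflection_age:.1f}")
# The steep middle section of the fitted curve corresponds to the roughly
# four-year growth spurt described in the text.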

By charting the number of specimens in each age group, Erickson and his colleagues could draw conclusions about the life history of an Albertosaurus population. Their analysis showed that while juveniles were rare in the fossil record, subadults in the rapid growth phase and adults were far more common. While this could be due to preservation or collection biases, Erickson hypothesized that the difference reflected low mortality among juveniles over a certain size, something also seen in some large modern mammals such as elephants. This low mortality might have resulted from a lack of predation, since tyrannosaurs surpassed all contemporaneous predators in size by the age of two. Paleontologists have not found enough Daspletosaurus remains for a similar analysis, but Erickson notes that the same general trend seems to apply.

All known Daspletosaurus fossils have been recovered from formations dating to the middle to late Campanian stage of the Late Cretaceous Period, between 77 and 74 million years ago. Since the middle of the Cretaceous, North America had been divided by the Western Interior Seaway, with much of Montana and Alberta below its surface. However, the uplift of the Rocky Mountains in the Laramide Orogeny to the west, which started during the time of Daspletosaurus, forced the seaway to retreat eastward and southward. Rivers flowed down from the mountains and drained into the seaway, carrying sediment that formed the Two Medicine Formation, the Judith River Group and other sedimentary formations in the region. About 73 million years ago, the seaway began to advance westward and northward again, and the entire region was covered by the Bearpaw Sea, represented throughout the western United States and Canada by the massive Bearpaw Shale.

Daspletosaurus lived on an enormous floodplain along the western shore of the interior seaway. Large rivers watered the land, occasionally flooding and covering the region with new sediment. When water was plentiful, the region could support abundant plant and animal life, but periodic droughts also struck, resulting in mass mortality that is preserved in the many bonebed deposits found in Two Medicine and Judith River sediments, including the Daspletosaurus bonebed. Similar conditions exist today in East Africa. Volcanic eruptions to the west periodically blanketed the region with ash, also causing large-scale mortality while enriching the soil for future plant growth. These ash beds also make precise radiometric dating possible. Fluctuating sea levels produced a variety of other environments at different times and locations within the Judith River Group, including nearshore and offshore marine habitats, deltas and lagoons, and coastal wetlands, in addition to the inland floodplains.

Image Caption: Daspletosaurus mount at the Field Museum of Natural History, Chicago. Credit: ScottRobertAnselmo/Wikipedia (CC BY-SA 3.0)

Scientists Develop Breathalyzer That Can ‘Smell’ Drugs

Brett Smith for redOrbit.com — Your Universe Online

For years, roadside breathalyzer tests have been an accurate and effective way to determine a motorist's blood alcohol level, and new research from the Karolinska Institutet in Stockholm, Sweden shows a similar device could be used to detect blood levels of cocaine, amphetamines and cannabis.

Driving under the influence of illegal drugs is against the law in most states, and the new device could pave the way for a more aggressive pursuit of these types of offenders.

According to the Swedish team's study in the Journal of Breath Research, the device returned an 87 percent accuracy rate on individuals at a drug clinic who said they had used drugs recently, the same rate as blood and urine tests performed on the same patients.

“Considering the samples were taken 24 hours after the intake of drugs, we were surprised to find that there was still high detectability for most drugs,” said lead author Olof Beck, a toxicologist at the Institute.

In the study, the research team tested the breath of 46 individuals who were checked into a drug addiction emergency clinic and agreed to participate in the study. Each participant was instructed to exhale about 20 times over the course of two to three minutes into the experimental breathalyzer.

The device was able to trap the tiny solid and liquid microparticles suspended in the breath for analysis. These small particles mirror the composition of a person's bloodstream, because molecules from the blood diffuse into the fluid that lines the lungs and are eventually exhaled.

The trapped particles were then tested using liquid chromatography and mass spectrometry analyses, which return either a positive or a negative result. While most of the detected drugs matched self-reports and blood tests, 23 percent of the breath tests also indicated the presence of a drug that had not been reported.

The researchers said this degree of accuracy was higher than in previous studies, but acknowledged they are slowly refining their system to reduce false positives and improve the detection rate.

In its current state, the new breathalyzer system, called SensAbues, requires a collected sample be sent elsewhere for analysis.

“In cases of suspected driving under the influence of drugs, blood samples could be taken in parallel with breath when back at a police station,” Beck said. “Future studies should therefore test the correlation between blood concentration of drugs of abuse and the concentrations in exhaled breath.”

The research team also said advances to reduce cost and increase portability of chemical analysis systems could eventually result in the same type of roadside breath testing for drugs that is currently used for alcohol.

“There is a possibility that exhaled breath will develop into a new matrix for routine drug testing and present an alternative to already used matrices like urine, blood, oral fluid, sweat and hair,” the study said. “Since exhaled breath may be as easy to collect as in alcohol breath testing it may present a new more accessible matrix than blood at the roadside.”