Empathy And Analytical Thinking At Odds With Each Other In Our Brains
Written By: editor
April Flowers for redOrbit.com – Your Universe Online
Even the most intelligent, complex brains can be taken in by a swindler’s story, a new study from Case Western Reserve University shows, even when that story falls apart on a second look.
The new study, published online in the journal NeuroImage, reveals that when the brain fires up the network of neurons that allows a person to empathize, the network used for analysis is suppressed. Conversely, when the analytic network is engaged, our ability to appreciate the human cost of our actions is repressed.
At rest, our brains cycle between the social and analytical networks; when presented with a task, however, healthy adults engage the appropriate neural pathway. The study shows that we have built-in neural constraints on our ability to be both empathetic and analytic at the same time.
The findings of this new study suggest that established theories about two competing networks within the brain need to be revised. They also provide insight into how a healthy brain works compared with one affected by mental illness or developmental disability.
“This is the cognitive structure we’ve evolved,” said Anthony Jack, an assistant professor of cognitive science at Case Western Reserve. “Empathetic and analytic thinking are, at least to some extent, mutually exclusive in the brain.”
Previous studies revealed that two large-scale networks are in tension in the brain: the default mode network and the task-positive network. Other researchers have suggested that different mechanisms drive this tension. For example, one theory holds that we have one network for engaging in goal-directed tasks, while the second allows the mind to wander.
Jack and his colleagues show that adults presented with external social or analytical problems consistently engaged the appropriate neural pathway to solve the problem, while the other pathway was repressed. The researchers used functional magnetic resonance imaging to record the see-sawing brain activity.
The question which inspired the study was a philosophical query. Jack said, “The most persistent question in the philosophy of mind is the problem of consciousness. Why can we describe the workings of a brain, but that doesn’t tell us what it’s like to be that person?”
“The disconnect between experiential understanding and scientific understanding is known as the explanatory gap,” Jack said. “In 2006, the philosopher Philip Robbins [professor of philosophy at the University of Missouri] and I got together and we came up with a pretty crazy, bold hypothesis: that the explanatory gap is driven by our neural structure. I was genuinely surprised to see how powerfully these findings fit that theory.”
The same neural phenomenon drives the explanatory gap as when we look at a visual illusion such as the duck-rabbit, the study suggests. When you look at the drawing, you can see the duck facing one way or the rabbit facing the other, but not both at once.
“That is called perceptual rivalry, and it occurs because of neural inhibition between the two representations,” Jack said. “What we see in this study is similar, but much more wide-scale. We see neural inhibition between the entire brain network we use to socially, emotionally, and morally engage with others, and the entire network we use for scientific, mathematical and logical reasoning.”
“This shows scientific accounts really do leave something out – the human touch. A major challenge for the science of the mind is how we can better translate between the cold and distant mechanical descriptions that neuroscience produces, and the emotionally engaged intuitive understanding which allows us to relate to one another as people.”
Forty-five healthy college students were recruited, and each was asked to take five 10-minute turns inside a magnetic resonance imager. During these sessions, the researchers presented them with 20 written and 20 video problems that required them to think about how others might feel. They were also presented with an equal number of written and video problems that required physics to solve.
Each video required the students to answer a yes-no question within seven seconds. The sessions in the imager included twenty 27-second rest periods and variable delays of 1, 3, or 5 seconds. During each rest period, the students would look at a red cross on the screen in front of them and relax.
The MRI images revealed that social problems deactivated the brain regions associated with analysis and activated the social network, which held true regardless of the medium in which the question was presented. The physics questions, on the other hand, deactivated the brain regions associated with empathizing and activated the analytical network.
“When subjects are lying in a scanner with nothing to do, which we call the resting state, they naturally cycle between the two networks,” Jack said. “This tells us that it’s the structure of the adult brain that is driving this, that it’s a physiological constraint on cognition.”
The results of this study will have bearing on neuropsychiatric disorders from anxiety and depression to ADHD and schizophrenia, which are all characterized by social dysfunction of some sort.
“Treatment needs to target a balance between these two networks. At present most rehabilitation, and more broadly most educational efforts of any sort, focus on tuning up the analytic network. Yet, we found more cortex dedicated to the social network.”
The findings most clearly impact the study of developmental disabilities such as autism and Williams syndrome. People with autism usually have poor social skills, but a strong ability to solve visuospatial problems, such as mentally manipulating two and three-dimensional figures. Williams syndrome sufferers are warm and friendly, but perform very poorly on visuospatial tests.
Jack warns that even healthy people can come to rely too heavily on one network or the other.
“You want the CEO of a company to be highly analytical in order to run a company efficiently, otherwise it will go out of business,” he said. “But, you can lose your moral compass if you get stuck in an analytic way of thinking.”
“You’ll never get by without both networks,” Jack continued. “You don’t want to favor one, but cycle efficiently between them, and employ the right network at the right time.”
Further research is needed to test the theory, and the group is currently investigating whether brains will shift from the social to the analytical when shown people depicted in dehumanizing ways — such as an animal or object. They are also studying how disgust and social stereotyping confound our moral compass by activating the analytical network.
Study Proposes Eliminating Excessive Health Care Spending
Connie K. Ho for redOrbit.com — Your Universe Online
A new study by researchers from the University of California, Los Angeles (UCLA) found that, by redirecting excessive spending of $750 billion each year, the U.S. could increase the health and well-being of its citizens.
In particular, the group of scientists believes that the sum of money is due to a number of factors including inflated prices, fraud, unnecessary services and extra administrative costs.
“If cut from current health care expenditures, these funds could provide businesses and households with a huge windfall, with enough money left over to fund deficit reduction on the order of the most ambitious plans in Washington,” explained Frederick J. Zimmerman, a professor and chair of the department of health policy and management at the UCLA Fielding School of Public Health, in a prepared statement. “The money could also cover needed investments in transportation infrastructure, early childhood education, human capital programs, rural development, job retraining programs and much more. And it could transform America with little to no reduction in the quality of, or access to, health care actually provided.”
The researchers stated that alternative options proposed in the study could prove to be beneficial.
“When the fastest-growing part of the economy is also the least efficient, the economy as a whole loses its ability over time to support our current living standards,” noted the study’s co-author Jonathan Fielding, a UCLA professor of health policy and management and director of the Los Angeles County Department of Public Health, in the statement. “The U.S. has become irrationally attached to its inefficient health care system. Recognizing the opportunity costs of this attachment is the first step in repairing the system.”
The authors proposed one scenario for how the money could be used. They believe that over $410 billion per year, or 55 percent of the savings, could go to the private sector to spend however it wanted. An additional $202 billion, or 27 percent, could go toward deficit reduction. Another $104 billion, or 14 percent, could fund further investments in human capital and physical infrastructure.
Furthermore, $18 billion, or two percent of the savings, could be used to improve urban and rural quality of life; the authors proposed that these funds could improve the environment around local campuses, expand and modernize public libraries, upgrade wastewater treatment and provide rural development grants to small towns. The extra money would also provide job training for almost 50,000 unemployed individuals, while another two percent of the savings would pay for transportation projects to reduce road congestion and expand public transit options.
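As a rough sketch of the allocation arithmetic, the named buckets can be tallied against the $750 billion total. The dollar figures are the study’s as reported; the transportation amount is not given in dollars and is inferred here from its stated two percent share, so treat it as approximate:

```python
# Rough check of the allocation scenario described above.
# Dollar figures come from the study as reported; the transportation
# amount is inferred from its stated "two percent" share and is
# therefore approximate, not a figure from the paper.
TOTAL = 750.0  # billions of dollars per year in excess health spending

allocations = {
    "private-sector windfall": 410.0,           # ~55%
    "deficit reduction": 202.0,                 # ~27%
    "human capital and infrastructure": 104.0,  # ~14%
    "urban and rural quality of life": 18.0,    # ~2%
    "transportation projects": 0.02 * TOTAL,    # ~2%, i.e. ~$15B (inferred)
}

for name, amount in allocations.items():
    share = amount / TOTAL
    print(f"{name}: ${amount:.0f}B ({share:.0%})")

# The named buckets account for roughly $749B of the $750B total.
print(f"allocated: ${sum(allocations.values()):.0f}B of ${TOTAL:.0f}B")
```

The buckets come within about a billion dollars of the headline figure, consistent with the rounded percentages reported in the study.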
Even with the number of options the funding could be used for, the researchers believe that it would be difficult to decrease the excess expenditures as the costs are spread over many different areas. To overcome this issue, economic sectors, governmental agencies and other organizations will have to work together cohesively. If successful in redirecting approximately $750 billion per year, it could be extremely helpful for people in the U.S. and pave the way for greater change.
“This will not be an easy fight,” concluded Zimmerman in the statement. “But we believe reconceptualizing our excess health care spending by looking at its opportunity cost to society is an important first step.”
Disney Buys Lucasfilm, Plans New Star Wars Films
A little bit ago, in a galaxy not so far away, news broke that Disney had purchased Lucasfilm Ltd and is planning to finish what George Lucas never started: the remaining Star Wars episodes.
The Walt Disney Company announced it would be purchasing Lucasfilm for $4.05 billion, paid about half in cash and the remainder in approximately 40 million shares of Disney stock.
“It’s now time for me to pass Star Wars on to a new generation of filmmakers,” Lucas said in a statement. “I’ve always believed that Star Wars could live beyond me, and I thought it was important to set up the transition during my lifetime.”
Disney will be taking control of all Lucasfilm’s assets, including the Star Wars franchise and the Indiana Jones franchise.
The Mickey Mouse company also confirmed that Industrial Light & Magic and Skywalker Sound are included in the transaction.
Disney said it is planning to release the Star Wars Episode 7 movie sometime during 2015 and also confirmed that it plans to release a new feature Star Wars film every two to three years.
Star Wars has long been rumored to have been built to go beyond the original episodes 4, 5 and 6. Gary Kurtz, producer of A New Hope and The Empire Strikes Back, proposed story elements for episodes 7, 8 and 9 back in the 1970s.
For those of you worried about the direction Disney might take the upcoming Star Wars movies, Lucas will be serving as a creative consultant for the remaining Star Wars films, according to a Walt Disney statement.
“This transaction combines a world-class portfolio of content including ‘Star Wars,’ one of the greatest family entertainment franchises of all time, with Disney’s unique and unparalleled creativity across multiple platforms, businesses, and markets to generate sustained growth and drive significant long-term value,” Disney chairman and chief executive officer Robert Iger said in a statement.
Disney pointed out in its statement that this transaction follows the successful acquisitions of Pixar and Marvel, neither of which has yet featured Mickey or Goofy cameos.
Kathleen Kennedy, current Co-Chairman of Lucasfilm, will become President of Lucasfilm and will serve as executive producer of new Star Wars feature films.
Disney said more feature films would be expected past Episode 7 to continue the Star Wars saga and grow the franchise well into the future.
Beneficial Bacteria In Honeybees Show Resistance To Antibiotics
Brett Smith for redOrbit.com – Your Universe Online
Overuse of antibiotics in American agriculture and food production has long been a public concern because of its potential to engender more resilient microbes. A new study from Yale University shows that beneficial bacteria living in the gut of honeybees are demonstrating signs of resistance to such antibiotics.
A genetic analysis of the digestive bacteria showed eight different resistance genes for the antibiotic tetracycline in samples taken from U.S. honeybees, according to the research team’s report in mBio, the open-access journal of the American Society for Microbiology.
“It [resistance] seems to be everywhere in the U.S.,” said senior author Nancy Moran, from the Department of Ecology and Evolutionary Biology at Yale. “There’s a pattern here, where the U.S. has these genes and the others don’t.”
Honeybees have been given doses of the antibiotic oxytetracycline over the past 60 years to guard against “foulbrood,” a bacterial epidemic that can wipe out a hive before beekeepers can take action. Many of the genes that convey the resistance to oxytetracycline also convey resistance for tetracycline, a common broad-spectrum antibiotic that was historically used to treat cholera.
Using a metagenomic analysis, the Yale biologists screened honeybees from several locations across the United States and also from Switzerland, the Czech Republic and New Zealand. The team also took samples from wild bumblebees from the Czech Republic.
They found that American honeybees’ microbiota has a more abundant and diverse set of tetracycline resistance genes than those from other countries, which have banned the use of the drug by beekeepers on their hives. However, the researchers noted that the genes were less prevalent in U.S. colonies that had not been exposed to the antibiotic for 25 years.
Moran said that the bacteria responsible for foulbrood have also been found to be resistant to the antibiotics. Since oxytetracycline is a broad-spectrum antibiotic, it was likely capable of selecting for resistant genes in a wide range of bacteria.
“They carry tetL, which is one of the eight resistance genes we found,” she said. “It’s possible that the gene was transferred either from the gut bacteria to the pathogen or from the pathogen to the gut bacteria.”
The authors of the study point out the irony of fostering resistance and altering the bacteria that live in honeybee guts. In attempting to wipe out or prevent foulbrood, the decades of antibiotic doses may have actually been detrimental to honeybees’ overall health, which includes their microbiota.
Previous studies have suggested that these gut bacteria benefit the bees by neutralizing toxins in their diet, supplementing their nutrition, and defending them against pathogens. Therefore, a strategy designed to protect the bees from harm may actually have weakened their ability to fight off other pathogens.
Moran pointed out that the study could have implications for how honeybee diseases are prevented or treated. The antibiotic-resistance genes in the honeybee gut bacteria, however, don’t pose a direct risk to humans. These microbes “don’t actually live in the honey, they live in the bee,” Moran said.
“We’ve never actually detected them in the honey. When people are eating honey, they’re not eating these bacteria.”
Swimming Pool Exercise Provides Same Benefit As Regular Exercise
For anyone who has ever compared a 30-minute jog to a 30-minute swim, this next report may not come as a surprise.
There seems to be a preconception about exercise that, unless it’s a miserable experience in every way, it’s not really effective. Anything done in a pool, on the other hand, is often perceived as fun or play. As a result, the notion of exercising in a pool, hardly a wacky thought, is often seen as more play than work.
At least, that’s the attitude Dr. Martin Juneau assumes people have towards combining exercise and water.
In a study to be presented today at the Canadian Cardiovascular Congress, Dr. Juneau has found that those who exercise in a pool receive the same health benefits as those land-loving health nuts.
Specifically, those who spent time on an immersible ergo cycle (fancy talk for “underwater bike”) had a near equivalent workout experience as those who rode a traditional bike on land.
“If you can’t train on land, you can train in the water and have the same benefits in terms of improving aerobic fitness,” explained Dr. Juneau, director of prevention at the Montreal Heart Institute, in a press statement.
His new study flies in the face of the assumption that the resistance of water makes for an easier workout. To conduct the study, Dr. Juneau observed healthy participants as they biked on land and in the water, with the intensity level increasing each minute until they could no longer pump the pedals.
In both exercises, maximal oxygen consumption was the same across the board. Maximal oxygen consumption is a standard way to determine how effective a workout has been.
Dr. Juneau’s partner in the study, a clinical exercise physiologist at the Montreal Heart Institute, went so far as to suggest that water-based exercise might be even better than land-based exercise.
“Exercise during water immersion may be even more efficient from a cardiorespiratory standpoint,” adds Dr. Mathieu Gayda.
Dr. Juneau remains cautiously optimistic, however, noting that another study had shown that the heart rates between the exercises were different. Specifically, those who had pedaled the bike on dry ground had a slightly higher heart rate than those who pedaled underwater. This, says Dr. Juneau, could be a simple result of physics.
“You pump more blood for each beat, so don’t need as many heartbeats, because the pressure of the water on your legs and lower body makes the blood return more effectively to the heart. That’s interesting data that hasn’t been studied thoroughly before,” says Dr. Juneau.
Some people, such as those who have joint pain or are very overweight, have trouble with traditional exercises like cycling and running. This test shows that there are alternatives to these kinds of exercise, says Dr. Juneau, adding that while not everyone can swim, it remains one of the best forms of exercise available.
“This is a great alternative,” he says.
Dr. Beth Abramson, a spokesperson for the Heart and Stroke Foundation, is also pleased with these results, saying that any exercise is good exercise and more people need to be doing it.
“Inactive people who become physically active can reduce their heart attack risk by 35 to 55 per cent, plus lower their chance of developing several other conditions, cut stress levels and increase energy,” explains Dr. Abramson in the press release.
“Even if you have difficulty moving more, there are always solutions, as this study shows. This is encouraging given the aging population. It’s never too late or too difficult to make a lifestyle change,” noted Abramson.
Public Smoking Bans Are Driving Down Hospitalization Rates For Heart Attack
Despite a rise in ER visits due to obesity, diabetes and other common health issues, bans on smoking in restaurants and other public establishments have led to a sharp decline in hospitalizations for heart attacks, strokes and respiratory illnesses such as asthma and emphysema, according to a new analysis of several studies.
The analysis, published in the American Heart Association journal Circulation, covers 45 studies from more than 30 smoke-free laws at local and state levels in the US and from a number of countries including New Zealand and Germany.
The data show that comprehensive smoke-free laws were associated with a 15 percent decline in heart attack hospitalizations and a 16 percent decrease in stroke hospitalizations following the adoption of public smoking bans. The adopted laws were also rapidly followed by a 24 percent drop in hospitalizations for respiratory diseases. In fact, the regions with the most comprehensive smoke-free laws saw the greatest health benefits.
Senior study author Stanton Glantz, Ph.D., director of the Center for Tobacco Control Research and Education at the University of California, San Francisco (UCSF), said: “The public, health professionals and policy makers need to understand that including exemptions and loopholes in legislation — such as exempting casinos — condemns more people to end up in emergency rooms. These unnecessary hospitalizations are the real cost of failing to enact comprehensive smoke-free legislation.”
Findings of this analysis support the AHA’s stance that smoke-free laws must be comprehensive and apply to all workplaces and public environments. The analysis is also consistent with other studies that have found that smoke-free laws were followed by significant decreases in acute heart attack and other cardiac-related hospitalizations.
One such study stems from a 2002 law banning smoking only in restaurants in Olmsted County, Minnesota. The study, published Monday in the Archives of Internal Medicine, found that the law had no effect on heart attack hospitalization rates until it was extended in 2007 to cover all workplaces and public environments. After the comprehensive law took effect, heart attacks fell by 33 percent, according to data from Minnesota’s Mayo Clinic.
That drop was impressive, given that people in Minnesota were getting less healthy in the same time frame, with higher rates of diabetes and obesity. The study found rates of high blood pressure and unhealthy cholesterol levels stayed the same.
Raising taxes on tobacco and more smoking-cessation campaigns caused many in Minnesota to quit smoking during the study period, which lasted from 2002 to 2007, according to the authors of that study. But the trends did not fully explain the drop in heart attacks and sudden cardiac deaths (which fell by 17 percent).
While the Olmsted researchers said smoking bans in public environments may have helped lower the hospitalization rates, an accompanying editorial in the journal said people who continued to smoke did not smoke more at home to compensate for the public restrictions. In fact, a widespread establishment of no-smoking zones in homes appears to have followed the public smoking bans, suggesting that many smokers are smoking less than they once did.
In another study examining data from an Indiana Adult Tobacco Survey, researchers found nearly 73 percent of Hoosiers support a statewide workplace smoking ban.
The results of this study could be important in increasing focused public awareness strategies aimed at reducing exposure to secondhand smoke, said study leader Terrell Zollinger, professor of epidemiology at the Richard M. Fairbanks School of Public Health at Indiana University-Purdue University Indianapolis (IUPUI).
In the study, Zollinger found three variables that were the most important predictors of garnering support: People who never or formerly smoked were more supportive, as were females and those who were more aware of the health hazards of secondhand smoke.
About 32 percent of survey respondents who are current smokers support indoor workplace smoking bans; 68 percent of former smokers did so; and 85 percent of those who had never smoked said they support indoor workplace bans on smoking.
Zollinger said the results of his study suggest that efforts to gain additional support for smoke-free-air laws should focus on men, people unaware of the health hazards from secondhand smoke, smokers and former smokers.
Zollinger will present the findings of his study at 3:30 p.m. EDT today (Oct. 30).
Raymond Gibbons, a cardiologist and past president of AHA, said smoking bans are meant to protect non-smokers. Secondhand smoke can trigger heart attacks in non-smokers with underlying heart disease, he noted.
Secondhand smoke affects a non-smoker’s blood vessels in as little as five minutes, causing changes that increase the risk of heart attack, according to the Mayo Clinic.
About 46,000 non-smoking Americans die from secondhand smoke exposure each year, according to the National Cancer Institute.
Smoking bans also reduce health care costs for individuals, health plans and government payers, Glantz said. Total savings ranged from $302,000 in all health care costs in Starkville, Miss., to nearly $7 million just in heart attack-related hospitalizations in Germany, according to information in the Circulation study.
“If politicians are serious about cutting medical costs, they need to look at this,” Glantz said. “The best way to keep health care costs down is to not get sick. … There is nothing else you can do to have this big an effect on hospital admissions.”
Glantz said lawmakers should also consider the findings from past studies when voting to exempt certain facilities from smoke-free laws. “The politicians who put those exemptions in are condemning people to be put into the emergency room,” Glantz warned.
David Sutton, a spokesman for Philip Morris USA, the top cigarette maker in the country, noted that his company agrees that secondhand smoke is dangerous, but said smoking bans are not always necessary, and that businesses such as restaurants can accommodate non-smokers through separate rooms or ventilation.
“Reasonable ways exist to respect the comfort and choices of both non-smoking and smoking adults,” Sutton said. “Business owners — particularly owners of restaurants and bars — are most familiar with how to accommodate the needs of their patrons and should have the opportunity and flexibility to determine their own smoking policy. The public can then choose whether or not to frequent places where smoking is permitted.”
In conclusion, Glantz said: “Stronger legislation means immediate reductions in secondhand smoke-related health problems as a byproduct of reductions in secondhand smoke exposure and increases in smoking cessation that accompany these laws. Passage of these laws formalize and accelerate social change and the associated immediate health benefits.”
Stem Cell Study Probes Cartilage Injury and Osteoarthritis
Connie K. Ho for redOrbit.com — Your Universe Online
Researchers from Duke University Medical Center recently revealed that they have been able to engineer cartilage from pluripotent stem cells, which will help in studies regarding cartilage injury and osteoarthritis.
In particular, induced pluripotent stem cells were successfully developed for use in tissue repair. The scientists believe that these induced pluripotent stem cells (iPSCs) could become a source of patient-specific cartilage tissue.
“This technique of creating induced pluripotent stem cells — an achievement honored with this year’s Nobel Prize in medicine for Shinya Yamanaka of Kyoto University — is a way to take adult stem cells and convert them so they have the properties of embryonic stem cells,” noted the study’s senior author Farshid Guilak, a professor of orthopedic surgery at Duke, in a prepared statement.
The researchers explained that articular cartilage acts as a shock-absorbing tissue in the joints, helping people climb stairs, walk, jump and perform other activities without pain. Everyday joint use or an injury can degrade the tissue, and articular cartilage cannot easily repair itself. As a result, cartilage damage and osteoarthritis are believed to be among the main causes of impairment in older people, leading many to need joint replacement surgery.
“Adult stems cells are limited in what they can do, and embryonic stem cells have ethical issues,” continued Guilak in the statement. “What this research shows in a mouse model is the ability to create an unlimited supply of stem cells that can turn into any type of tissue — in this case cartilage, which has no ability to regenerate by itself.”
In the study, the scientists built on recent techniques that target adult stem cells taken from bone marrow or fat tissue. They worked to create a uniform, differentiated population of chondrocytes, the cells found in healthy cartilage, which produce collagen and maintain cartilage. To produce these cells, the researchers used iPSCs derived from adult mouse fibroblasts and grown in a treated culture medium. The cells were also tailored to express green fluorescent protein once they had developed into chondrocytes, so that as the iPSCs differentiated, the new chondrocytes could be identified quickly. The researchers found that the tailored cells produced larger amounts of cartilage components and showed promise for repairing cartilage defects in the body.
“This was a multi-step approach, with the initial differentiation, then sorting, and then proceeding to make the tissue,” explained Brian Diekman, a post-doctoral associate in orthopedic surgery, in the statement. “What this shows is that iPSCs can be used to make high quality cartilage, either for replacement tissue or as a way to study disease and potential treatments.”
Moving forward, the researchers will focus on how to utilize human iPSCS to examine the cartilage-growing method.
“The advantage of this technique is that we can grow a continuous supply of cartilage in a dish,” concluded Guilak in the statement. “In addition to cell-based therapies, iPSC technology can also provide patient-specific cell and tissue models that could be used to screen for drugs to treat osteoarthritis, which right now does not have a cure or an effective therapy to inhibit cartilage loss.”
The findings are published online in the journal Proceedings of the National Academy of Sciences (PNAS).
Migraines Affect Children’s School Testing Scores
Connie K. Ho for redOrbit.com — Your Universe Online
Researchers recently found that children who have migraines are more likely to have test scores below the average school performance of students without migraines.
According to the American Academy of Neurology (AAN), migraines can be described as throbbing pain or intense pulsing around the head.
“Studies have looked at the burden of migraine for adolescents, but less work has been done to determine the effect of migraine on younger children,” explained the study’s author Dr. Marcelo E. Bigal, a member of Merck & Co. and the American Academy of Neurology, in a prepared statement.
The study included 5,671 Brazilian children between the ages of five and 12. The students’ teachers provided the researchers with data on their students’ performance and completed a screening questionnaire on emotional and behavioral problems; parents were also interviewed about their children’s medical history of headaches and other symptoms.
Based on the findings, the scientists discovered that children with migraines had a 30 percent higher chance of having test scores below the average school performance of children who did not suffer from headaches. The researchers also found that 0.6 percent of the children suffered from chronic migraines, meaning migraines on 15 or more days per month. Nine percent of the children reported having episodic migraines. Furthermore, 17.6 percent had probable migraines, meeting all but one of the criteria for migraine while not matching the full criteria for another headache syndrome.
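The prevalence figures above can be translated into approximate headcounts within the 5,671-child sample. This is illustrative arithmetic only; the study reports percentages, and these counts are back-calculated rather than taken from the paper itself:

```python
# Approximate headcounts implied by the reported prevalence figures.
# Illustrative arithmetic only: the study reports percentages, and these
# counts are derived here, not quoted from the paper.
n_children = 5671  # Brazilian children aged 5-12 in the study

prevalence = {
    "chronic migraine (15+ days/month)": 0.006,  # 0.6 percent
    "episodic migraine": 0.09,                   # 9 percent
    "probable migraine": 0.176,                  # 17.6 percent
}

for label, rate in prevalence.items():
    print(f"{label}: ~{round(n_children * rate)} of {n_children} children")
```

By this arithmetic, roughly 34 children in the sample had chronic migraines, about 510 had episodic migraines, and nearly 1,000 had probable migraines.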
Overall, the connection between headaches and poor academic performance was strongest among children whose migraines lasted longer or were more painful, among those with chronic migraines, and among students who suffered from emotional or behavioral problems.
“With approximately one-fourth of school-age children having headaches with migraine features, this is a serious problem, especially for those with frequent, severe attacks that do not subside quickly,” continued Bigal in the statement. “Parents and teachers need to take these headaches seriously and make sure children get appropriate medical attention and treatment.”
AAN provides a number of resources for individuals who are interested in learning more about migraines. The organization reports that migraines are three times more common in women than in men, affecting over 10 percent of the population worldwide. A variety of factors can trigger migraines, including anxiety, hormonal changes, stress, lack of food or sleep, and dietary substances. Although there is no cure for migraines, and researchers are still working to understand the pathophysiology of headaches, experts recommend preventing attacks with medications or behavioral changes and working to lessen symptoms during attacks.
The findings were recently published in the October 30 edition of Neurology.
Mass Extinction In The Cretaceous Period Was Worsened By Ecosystems
Written By: editor
April Flowers for redOrbit.com – Your Universe Online
A mass extinction, wiping out numerous species including the dinosaurs, marked the end of the Cretaceous Period. A new study, published in Proceedings of the National Academy of Sciences (PNAS), reveals that the structure of North American ecosystems made the extinction worse than it might have been.
Mexico’s Yucatan Peninsula is home to the now-buried Chicxulub impact crater, caused by a mountain-sized asteroid. This impact is almost certainly the ultimate cause of the Cretaceous mass extinction, which occurred 65 million years ago.
Jonathan Mitchell, Ph.D. student at the University of Chicago’s Committee on Evolutionary Biology, said, “Our study suggests that the severity of the mass extinction in North America was greater because of the ecological structure of communities at the time.”
The research team, which included Peter Roopnarine of the California Academy of Sciences and Kenneth Angielczyk of the Field Museum, reconstructed the terrestrial food webs for 17 Cretaceous ecological communities, seven of which existed within two million years of the Chicxulub impact. The ten remaining food webs came from the preceding 13 million years.
The analysis comes from a computer model, developed by Roopnarine, showing how disturbances spread through the food web. The simulation's purpose was to predict how many animal species would become extinct following a plant die-off, a likely consequence of the impact.
“Our analyses show that more species became extinct for a given plant die-off in the youngest communities,” Mitchell said. “We can trace this difference in response to changes in a number of key ecological groups such as plant-eating dinosaurs like Triceratops and small mammals.”
The study’s results paint a picture of the late Cretaceous period in North America in which pre-extinction changes to food webs were likely driven by a combination of environmental and biological factors. These changes resulted in communities that were more fragile when faced with large disturbances.
“Besides shedding light on this ancient extinction, our findings imply that seemingly innocuous changes to ecosystems caused by humans might reduce the ecosystems’ abilities to withstand unexpected disturbances,” Roopnarine said.
The computer model describes all plausible diets for the animals in the study. For example, in one simulation run a Tyrannosaurus rex might eat only Triceratops, in a second run only duck-billed dinosaurs, while in a third it might eat a more varied diet. This variability stems from the uncertainty around what exactly Cretaceous animals ate. This uncertainty, however, actually worked to the study's benefit.
“Using modern food webs as guides, what we have discovered is that this uncertainty is far less important to understanding ecosystem functioning than is our general knowledge of the diets and the number of different species that would have had a particular diet,” Angielczyk said.
Modern food web data helped the simulations account for phenomena such as how specialized animals tend to be, or how body size relates to population size and their probability of extinction.
The team selected a large number of specific food webs from all the possible food webs in their general framework, then evaluated how this sample of webs responds to a perturbation such as the death of plants. The same relationships and assumptions were used to create food webs across all of the different sites. This means that differences between sites stem only from differences in the data rather than from the simulation itself, making the simulation a fundamentally comparative method.
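The comparative procedure described above, sampling plausible food webs and letting a plant die-off cascade into secondary extinctions, can be illustrated with a minimal sketch. This is not Roopnarine's published model; the toy web, the species names, and the all-prey-lost extinction rule are simplified assumptions for illustration only.

```python
import random

def cascade_extinctions(web, plant_loss_fraction, rng=random.Random(0)):
    """Propagate a plant die-off through a food web.

    `web` maps each species to the set of species it eats; plants are
    species that eat nothing. A consumer goes extinct once all of its
    prey are gone. Returns the set of secondarily extinct consumers.
    """
    plants = {s for s, prey in web.items() if not prey}
    lost = set(rng.sample(sorted(plants), int(len(plants) * plant_loss_fraction)))
    extinct = set(lost)
    changed = True
    while changed:  # iterate until the extinction cascade stabilizes
        changed = False
        for species, prey in web.items():
            if species not in extinct and prey and prey <= extinct:
                extinct.add(species)
                changed = True
    return extinct - lost

# A toy web: two plants, a herbivore that eats both, and a predator.
web = {
    "plant_a": set(), "plant_b": set(),
    "herbivore": {"plant_a", "plant_b"},
    "predator": {"herbivore"},
}
```

In this toy web a total plant die-off takes out the herbivore and then the predator, while losing only one plant kills nothing, the kind of fragility difference the real study compared across communities.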
“We aren’t trying to say that a given ecosystem was fragile, but instead that a given ecosystem was more or less fragile than another,” Roopnarine said.
According to the model, had the asteroid hit during the 13 million years preceding the latest Cretaceous communities, there would almost certainly still have been a mass extinction, but one less severe than what actually happened in North America. The team concluded that it was most likely a combination of changing climate and other environmental factors that caused some types of animals to become more or less diverse in the Cretaceous. They suggest that the drying up of a shallow sea in North America may have been one of the main factors leading to the observed changes in diversity. However, the study provides no evidence that the latest Cretaceous communities were on the verge of collapse before the asteroid hit.
“The ecosystems collapsed because of the asteroid impact, and nothing in our study suggests that they would not have otherwise continued on successfully,” Mitchell said. “Unusual circumstances, such as the after-effects of the asteroid impact, were needed for the vulnerability of the communities to become important.”
The findings of this study have implications for current day conservation efforts.
“Our study shows that the robustness or fragility of an ecosystem under duress depends very much on both the number of species present, as well as the types of species,” he said, referring to their ecological function. The study also shows that more is not necessarily better, because simply having many species does not insure against ecosystem collapse.
“What you have is also important,” Angielczyk said. “It is therefore critical that conservation efforts pay attention to ecosystem functioning and the roles of species in their communities as we continue to degrade our modern ecosystems.”
Image 2 (below): This illustration depicts the food web for ecological groups in the late Cretaceous Period as reported in a new paper published in the Proceedings of the National Academy of Sciences. Each ecological group includes a set of species that share the same set of potential predators and prey. Silhouettes show iconic members of each group. Arrows show who eats whom. Credit: Courtesy of Jonathan Mitchell, Peter Roopnarine and Kenneth Angielczyk
Sweden: Making Money And Energy Off Of Euro Trash
Written By: editor
Brett Smith for redOrbit.com — Your Universe Online
At 96 percent, Sweden has one of the highest rates of recycling. So when the country began generating heat and energy from its trash, it should have come as no surprise that it would eventually run out of garbage.
“We have more capacity than the production of waste in Sweden and that is usable for incineration,” Catarina Ostlund, senior advisor for the Swedish Environmental Protection Agency, told Public Radio International.
This situation has caused the Scandinavian country to begin importing 800,000 tons of trash from its neighbors and charging them for it, allowing the Swedes to truly claim that one man's trash is another man's treasure.
In addition to being a revenue stream, Sweden's waste-to-energy program generates about 20 percent of the country's district heating, which works by pumping heated water through pipes that run through residential and commercial buildings. The program also generates electricity for a quarter-million Swedish homes.
Of all the countries that pay to export their trash to Sweden, Norway contributes the most. Sweden imports the trash from their neighbors to the north, incinerates it to produce heat and energy, and then ships the byproducts of the process, mainly ashes containing dioxins and heavy metals, back to Norway to be buried in a landfill. The entire process is still more cost effective for the Norwegians than placing the trash directly into a landfill.
According to Ostlund, the Scandinavian country maximizes the energy released by the trash by capturing both heat and energy from the incineration process.
“So that's why we have the world's best incineration plants concerning energy efficiency. But I would say maybe in the future, this waste will be valued even more so maybe you could sell your waste because there will be a shortage of resources within the world,” Ostlund said.
One concern associated with the waste-to-energy process is the emissions produced by the power plants. An official statement from the Swedish EPA asserts the government has been steadily working to reduce the amount of harmful emissions released by these plants.
“Sweden has had strict standards limiting emissions from waste incineration since the mid-1980s,” the statement reads. “Most emissions have fallen by between 90 and 99 per cent since then thanks to ongoing technical development and better waste sorting.”
Ostlund also anticipated Sweden would begin looking to profit by removing trash from countries across Europe that currently rely heavily on landfills.
“I hope that we instead will get the waste from Italy or from Romania or Bulgaria or the Baltic countries because they landfill a lot in these countries. They don't have any incineration plants or recycling plants, so they need to find a solution for their waste,” Ostlund said.
If Sweden is to begin receiving both trash and revenues from other European countries, it must act soon, as waste-to-energy initiatives have been introduced in Italy, Romania, Bulgaria, and Lithuania.
Ostlund added the future of the Swedish waste program should continue to focus on reducing waste and not just generating power from it.
“This is not a long-term solution really, because we need to be better to reuse and recycle, but in the short perspective I think it's quite a good solution,” she said.
Apple Store Sells Wi-Fi Enabled Philips Light Bulbs
The idea of making ordinary household objects extraordinary with the addition of Wi-Fi isn't a new one. The Nest thermostat, for example, works with a home's Wi-Fi setup to enable wireless and remote control operations. Instead of getting up to turn the air off, you only have to reach as far as your iPhone.
The light bulb has also seen its day in the glow of Wi-Fi greatness. The LIFX bulb, one example of many, enables users to control energy efficient LED bulbs remotely and even change the bulb's color, all via a smartphone app.
Today, Philips is announcing its entrance into the Wi-Fi-enabled light bulb market and, just like the Nest thermostat before it, shoppers will be able to pick it up at any Apple store across the nation. In fact, the new Philips system will initially be available only at Apple stores, starting October 30th.
Called “Hue,” Philips' smart bulb system is built on top of the ZigBee Light Link standard. Philips' competitors, such as GE and Sylvania, are working with the same standard.
While placing such energy efficient LEDs in the Wi-Fi halo is already a fine idea, these bulbs are often only as good as the apps that run them. This is where Hue looks to excel. Each bulb can be controlled separately, meaning each can display a different color. What makes Hue stand out is the way these colors can be chosen. Rather than picking a color from a spectrum wheel, the Hue app lets you choose colors from pictures. It's a feature which Philips is touting as a way to recreate the special moments in your life. For instance, if you've taken a picture of a candle lighting ceremony, you'll be able to pick which oranges and yellows you'd like your bulbs to take on.
The app even allows users to place their bulbs on timers, dimming and lighting up at specific times. This feature is not only handy for those with kids, but it could also make waking up a little more pleasant. Imagine waking up to a soft and warm light as opposed to a loud ringing alarm and the harsh rays of a fluorescent bulb. These timers can also add another important piece to the home-automation puzzle, with lights turning on and off in perfect cycle. All you have to do is simply live in the house and move from room to room.
The app also lets you turn on the lights when you´re away from home, an added security feature for those who don´t want their houses to look particularly empty.
Finally, Philips has created four pre-programmed light settings which they call “LightRecipes.”
These settings are the result of research conducted by Philips to help facilitate different moods. For instance, users who have just come home from work can select “relax” for a soothing glow. Those who need to spend some time studying can choose “concentrate” to change their room into a well-lit area conducive for work.
Philips is even opening up Hue to the developer community and has created an open source platform to be used and integrated into other apps.
“Philips hue is a game-changer in lighting — a completely new way to experience and interact with light. In the way phones, media and entertainment have been revolutionized by digital technology, now we can also personalize light and enjoy limitless applications,” said Jeroen de Waal, Head of Marketing & Strategy at Philips Lighting, in a press release.
The starter kit is priced at $199 and comes with three bulbs and a Smart Bridge, a device which plugs into your Wi-Fi router and connects your smartphone with your bulbs. After that, the bulbs cost $59 apiece, a bit of a premium considering how much average LED bulbs cost.
RedOrbit Exclusive Interview: Dr. Matthew Longo, Birkbeck University Of London
Written By: editor
Jedidiah Becker for redOrbit.com — Your Universe Online
When it comes to providing an objective account of the world around us, the human brain can be a notoriously inaccurate and biased reporter. The scholarly literature in psychology and neuroscience is full of studies demonstrating how the brain ‘fudges’ the picture of reality it presents to our conscious mind, often favoring a useful or convenient interpretation of our surroundings over a strictly accurate one. And as a number of studies have shown in recent years, our perception of our environment is never more distorted than when we find ourselves in emotionally intense situations.
In a groundbreaking study recently published in the journal Current Biology, two psychologists have shown that a sense of fear and impending danger can actually alter our perception of space and distance when we are approached by threatening objects. Psychology researcher Dr. Matthew Longo of Birkbeck, University of London recently talked with redOrbit about his and his colleague's research into the effects of fear on spatial perception.
RO: Professor Longo, what attracted you and your colleague Dr. Lourenco to this research topic — the ability of fear to affect our spatial perception?
Longo: Stella Lourenco and I started collaborating when we were both doctoral students at the University of Chicago, using some seed funding from our department to start a project investigating how we perceive the ‘near space’ immediately surrounding the body differently from the space farther away. After we both graduated, we continued this research, first in Stella’s lab at Emory University, and now as well in my lab at Birkbeck.
Longo: Initially, we focused on the role of near space in guiding action. Our early results showed that altering people's ability to act produced flexible alterations of how much space around the body the brain codes as ‘near.’ For example, we showed that using a tool expands the size of near space, while putting heavy weights on the arms causes it to contract. More recently, we've become interested in the idea that near space might be involved in a quite different function, namely protecting the body against potentially threatening objects. Specifically, we speculated that near space might be involved in feelings of claustrophobia in constrained spaces. We reasoned that if stimuli in near space were coded as potentially threatening, then people with a larger near space would feel more anxious in any given enclosed environment, since more things would impinge on their near space. Indeed, we found that people with a larger near space around their body, measured using the methods we developed in our earlier research, reported more claustrophobic fear on a standard questionnaire than people with a smaller near space.
Our recent study emerged from that research. We were interested in how stimuli which many people find intrinsically threatening — like snakes or spiders — alter our perception. We used a phenomenon called ‘looming’ in which objects on a direct collision course with an observer produce a specific pattern of expansion on the retina. Traditionally, looming has been considered a purely optical cue specifying the time until objects will collide with an observer. In contrast to that view, we showed that people underestimate the time-to-collision of threatening stimuli (snakes and spiders) compared to less threatening stimuli (butterflies and rabbits). Further, this bias is larger in people who report more fear of snakes and spiders.
RO: Your colleague Professor Lourenco mentioned that our tendency to underestimate collision time when faced with a potentially threatening object probably had some clear survival advantages for our early ancestors. As anyone even casually interested in science knows, evolution and natural selection have been some of the most powerful tools in the advancement of the biological sciences in the last hundred years — from genetics, biochemistry and cell biology to ecology and physiology. However, we don't typically hear as much about the role of evolutionary theory in the field of psychology. In your estimation, how important is evolution in the study of modern psychology?
Longo: Discussion of evolution is perhaps less conspicuous in many areas of psychology than in other areas of the biological sciences, but forms the essential background against which almost all research in psychology is conducted. One of the central questions of psychology over the past century has been what changes in the evolution of human cognition led to the tremendous complexity and sophistication of human social organization, which appeared unprecedented elsewhere in the animal kingdom. This remains a major topic of research, with proposed answers including such things as language, the opposable thumb, and (more recently) ‘theory of mind.’ The past two decades have seen increasing prominence of more explicit discussion of evolutionary issues in psychology, notably in the fields of comparative psychology and what's become known as evolutionary psychology.
The study of looming visual stimuli is a good example of the importance of evolutionary considerations in shaping psychological research. Research on looming emerged from the tradition of ‘ecological optics’ developed by Professor James Gibson of Cornell University in the 1960s. Gibson argued forcefully that perception could only be understood by considering how animals actually use their senses in their actual environments. Thus, rather than present his participants with abstract shapes or other meaningless stimuli, Gibson gave detailed consideration to what cues in complex sensory environments provided information about aspects of the world critical for an organism's survival.
Gibson’s work on looming was motivated by his analysis of the optics of objects on a collision course with an observer. He and his colleagues showed that monkeys made repeated defensive responses when shown a shadow on a projection screen which increased in size in a specific pattern mimicking what happens to light projected on the eyes when an object moves directly towards us. Subsequent research showed similar responses in human infants and even in human adults whose attention was distracted.
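The optical cue at work here can be made concrete. For a small object approaching head-on, the ratio of its angular size to its rate of expansion, known in the ecological-optics literature as ‘tau,’ equals the time remaining until contact, with no need to know the object's true size, distance, or speed. The sketch below illustrates this relation; the numbers are an illustrative assumption, not data from the studies discussed.

```python
def time_to_contact(theta, theta_dot):
    """Gibson's 'tau': seconds until collision, estimated purely from
    the optical expansion rate of the approaching object's image."""
    return theta / theta_dot

# Hypothetical object: 0.5 m wide, 10 m away, closing at 2 m/s.
size, distance, speed = 0.5, 10.0, 2.0
theta = size / distance                  # angular size (small-angle approx., radians)
theta_dot = size * speed / distance**2   # rate of optical expansion, d(theta)/dt
tau = time_to_contact(theta, theta_dot)  # ~5.0 s, matching distance/speed
```

The point of the formula is that both quantities on the right are available directly at the retina, which is why a wide range of nervous systems can exploit the cue; the underestimation Longo describes amounts to threatening stimuli yielding a shorter perceived tau than this geometry predicts.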
In parallel, research in neurophysiology documented nerve cells in the eyes of species such as mice and wasps which appear specialized for detecting looming visual stimuli. Thus, converging evidence from studies of animal behavior and from physiology suggested that the nervous systems of a wide range of animals have been shaped by evolution to rapidly detect and respond to rapidly approaching stimuli. Our recent findings are consistent with this interpretation, but show further that perception of looming stimuli is also modulated by our emotional reactions to the specific type of stimulus which is approaching.
RO: The study mentions that we don't yet understand exactly how fear is affecting our spatial perception. You explained that it may be causing the brain to perceive an approaching danger as moving faster than it actually is, or that it may simply be causing the individual to experience an enlargement of their personal sphere — their ‘safety zone,’ if you will. Have you got any professional ‘hunches’ about which of these two mechanisms it might be, and what would an experiment to test this look like?
Longo: I suspect that both types of explanation may be true to some extent. For example, there is evidence that stressful situations can cause time to seem to slow down, which could lead to underestimation of when an object would collide with us. On the other hand, there is also evidence from neurophysiological studies suggesting that rapidly approaching objects can expand the size of the ‘near space’ immediately surrounding the body. There are well established methods in the literature for measuring both the speed of the mental ‘clock’ and the extent of near space. We are currently planning experiments using these methods to try to assess these two interpretations.
RO: You and Professor Lourenco mentioned that you see your research as having implications for how we understand clinical phobias. Could you elaborate a bit on this?
Longo: Our research, both on claustrophobic fear and on fear of snakes and spiders, has measured individual differences in fear using an unselected sample of the general population, rather than people with clinical-level phobias. Of course, these are things that almost all of us experience some level of anxiety about, though to different degrees. What our results show is that the person-to-person differences in how intensely we experience these fears relate to the magnitude of the perceptual distortions we measure: People who report more fear show larger perceptual effects. Our suspicion is that the perceptual biases we have described would be even larger in people with clinical phobias.
An important question about clinical phobias is whether they arise from the top-down as a result of mistaken beliefs or attitudes about the feared object, or from the bottom-up as a result of biases in low-level aspects of perception. By showing that person-to-person differences in fear relate systematically to differences in basic aspects of vision, our results provide some support for the ‘bottom-up’ perspective, suggesting that pathological fears might reflect differences in the basic organization of sensory processing. Of course, this interpretation remains speculative, but it could have important implications for understanding where phobias come from and how they might be effectively treated.
RO: In recent years, researchers like the neuroscientist David Eagleman have been drawing popular attention to just how much of our brain’s activity is completely outside of our conscious control as well as to the fact that our brains often present us with an ‘edited’ picture of reality that is not always objective or entirely accurate. Your specialty field of research focuses on how the brain creates and maintains distorted “body representations.” Could you tell us a bit more about this as well as where this recent study fits into your broader research interests?
Longo: My research has focused on understanding how the brain constructs representations of what our body is like and how these representations shape how we perceive the world. There is no mystery why the brain has distorted representations of the body at some level. Consider touch: it is obvious that some parts of the body have exquisite tactile sensitivity (such as the fingertips and lips), while other parts have much poorer sensitivity (such as the back).
It has long been known that maps in the brain which process tactile information devote many more resources to highly sensitive than to less sensitive skin regions. Every introductory psychology textbook shows a picture of the little man whose body is in the proportions of tactile maps with enormous hands and lips, commonly known as the ‘Penfield homunculus’ (after the eminent Canadian neurosurgeon Wilder Penfield who described this map in his patients). So it's not surprising that there are distorted representations of the body underlying basic processing of touch, as it is clearly advantageous to have a few body parts with exquisite sensitivity rather than having homogenously mediocre sensitivity all over the body.
What has been surprising in my recent research is that these distortions appear to be preserved (though in reduced form) in higher-level representations of the body underlying more complex aspects of perception. For example, when we cover up a participant’s hand and ask them to indicate where they perceive the tip and knuckle of each finger as being, they place the knuckles much too far apart, as if the hand were represented as much wider and fatter than it actually is. Similarly, people overestimate how far apart two points touching the skin are when they are oriented across the width of the hand, compared to when they are oriented along the length of the hand.
Our recent research on looming emerges from a longstanding collaboration with Stella Lourenco, who I mentioned earlier. The main focus of our research has been on understanding how the brain represents the space immediately surrounding the body differently from the space farther away. As I described previously, our initial research on this issue was centered on how the body shapes space perception, both by comparing people with longer and shorter arms or by manipulating the body’s ability to act, such as with tool use or putting heavy weights on the arms. We have gradually become increasingly interested, though, in how emotion affects our perception of space, and vice versa, which was the focus of our present study on looming.
RO: Judging from your list of publications, you’ve been a very busy researcher. Do you already have your eyes on your next research project, and would you care to give us a sneak peek?
Longo: As I described before, our recent work on space perception has shown intimate links between space perception and emotions, such as fear. One of the things I’m most excited about is extending this line of research to understand the relationship between emotion and how we represent our body.
One of the key aspects of clinical conditions such as body dysmorphia and some eating disorders is the abnormal emotions and attitudes that patients have towards their own body. I suspect that the various distortions of body representations that I’ve described may have implications for understanding emotional attitudes about the body and the factors that might alter them. This line of research remains at an early stage, but I’m optimistic that it will provide important insights into the relation between emotion and body representation.
RO: Dr. Longo, thanks very much for taking the time to have a chat with us. On behalf of the redOrbit team and our readership, we wish you the best of luck on your future research and look forward to reading about your next research project.
Biography
Matthew Longo is a Lecturer in the Department of Psychological Sciences at Birkbeck, University of London where he directs the Body Representation Laboratory. He received his B.A. in Cognitive Science from the University of California at Berkeley in 2000 before completing his PhD in Psychology at the University of Chicago in 2006. After his PhD he moved to London to conduct Postdoctoral Research at the Institute of Cognitive Neuroscience at University College London. He has been at Birkbeck since 2010.
His research investigates how the brain constructs representations of the body and how these affect how we perceive the world around us. By combining a range of methods from cognitive neuroscience and perceptual psychology, he has shown that the brain maintains a diverse set of models of the body which have pervasive influences on perceptual abilities including the perception of touch, proprioception, pain, and visual space perception. He is an author of more than 50 scientific papers. His research has been supported by awards from the National Science Foundation (NSF) and the Royal Society of London.
Rainfall In The South Pacific Could Become Difficult To Predict
Written By: editor
redOrbit Staff & Wire Reports – Your Universe Online
Two different competing climatic effects — increasing temperatures and changes in atmospheric water transport — will determine how much (or how little) rainfall the South Pacific islands will receive in the future, claims a new study published in Sunday’s online issue of the journal Nature Climate Change.
According to the study, which was written by Matthew Widlansky and Axel Timmermann of the International Pacific Research Center and the University of Hawaii at Manoa and an international team of colleagues, those two phenomena occasionally counteract each other, meaning that future rainfall will be extremely difficult to predict.
The island nations in the South Pacific rely upon the 8,000-kilometer-long South Pacific Convergence Zone (SPCZ), the biggest rainband in the Southern Hemisphere, for the bulk of their precipitation, the researchers said. Any changes that occur in that cloud and precipitation formation would have “severe consequences” for the “vulnerable island nations already having to adapt to accelerating sea level rise.”
However, researchers know little about how the SPCZ will react to climate change created as a result of increasing greenhouse gas emissions.
The reason for that, according to Widlansky, a postdoctoral fellow at the International Pacific Research Center, is that many existing climate models are “notoriously poor” when it comes to simulating the band.
In order to try and make South Pacific climate simulations more accurate, he and his colleagues removed some deviations in observed sea surface temperature. In doing so, they were able to pinpoint a pair of competing climatic mechanisms which impacted precipitation trends in the region.
“We have known for some time that rising tropical temperatures will lead to more water vapor in the atmosphere,” Timmermann, a professor of oceanography at the University of Hawaii at Manoa, explained. “Abundant moisture tends to bring about heavier rainfall in regions of converging winds such as the SPCZ.”
“Nearly all climate change model simulations, however, suggest the equatorial Pacific will warm faster than the SPCZ region. This uneven warming is likely to pull the rainband away from its normal position, causing drying in the Southwest Pacific and more equatorial rainfall,” he added.
The two competing mechanisms are dubbed the “wet gets wetter” and the “warmest gets wetter” climate change mechanisms, respectively, the scientists said, and they are said to be the cause of the uncertainty in SPCZ rainfall predictions.
“The scientists found that depending upon the degree of tropical warming expected this century, one or the other mechanism is more likely to win out,” the university said. “With moderate warming, weaker sea surface temperature gradients are likely to shift the rainband towards the equator, potentially causing drying during summer for most Southwest Pacific island nations. For much higher warming possible by the end of this century, the net effect of the opposing mechanisms is likely a shift towards more rainfall for the South Pacific islands.”
“To be more definite in our projections, however, we need more extensive observations in the South Pacific of how clouds and rainfall form and how they respond to such climate phenomena as El Niño,” Timmermann added. “Before we have more confidence in our calculations of the delicate balance between the two climate change mechanisms, we need to be able to simulate cloud formations more realistically.”
Tsunami Hits Hawaii Following Earthquake In British Columbia
Written By: editor
redOrbit Staff & Wire Reports – Your Universe Online
A magnitude 7.7 earthquake struck the Canadian province of British Columbia late Saturday night, leading to short-lived tsunami warnings and mass evacuations as far away as Hawaii, according to various media reports.
The earthquake was centered in the Queen Charlotte Islands area (also known as Haida Gwaii) of British Columbia, US Geological Survey (USGS) officials told Mark Thiessen and Oskar Garcia of the Associated Press (AP), and was followed by a 5.8-magnitude aftershock shortly thereafter. There were no immediate reports of damage, although the seismic event was felt as far away as southeastern Alaska.
“Residents near the center of the quake said the violent jolting lasted for up to a minute, but no injuries or major damage had been reported,” the Canadian Press confirmed early Sunday morning. One resident in the vicinity told the news agency that the quake lasted approximately 40 seconds, and Simon Fraser University earth scientist Brent Ward called it the second largest to hit Canada since 1949.
The earthquake, which CBC News reported hit at approximately 8pm PT, triggered at least three tsunami waves spotted off the coast of the province, and led to evacuations and tsunami warnings in Haida Gwaii and Port Edward, near Prince Rupert. The Canadian Press also reported that similar warnings had been issued in Alaska and Hawaii.
Shortly before 6am ET Sunday morning, NBC News confirmed that Hawaii had been hit by a tsunami, and that at least 100,000 people had been evacuated and directed to higher elevations. According to their reports, the Pacific Tsunami Warning Center said that the first wave was “three feet high and less forceful than expected. Some forecasts had predicted a wave of up to six feet high.”
“The tsunami hit with little warning and an alert, issued at short notice due to initial confusion among scientists about the quake’s undersea epicenter, caused massive traffic congestion as motorists made a mass exodus from low-lying areas,” NBC News reported.
Tsunami waves were spotted in other areas as well, according to the CBC.
“Dennis Sinnott of the Canadian Institute of Ocean Science said a 69-centimeter wave was recorded off Langara Island on the northeast tip of Haida Gwaii, formerly called the Queen Charlotte Islands,” they said. “Another 55-centimeter wave hit Winter Harbour on the northeast coast of Vancouver Island, while a 12-centimeter wave was recorded in Tofino, on Vancouver Island’s west coast.”
Additional tsunami waves were believed possible in Hawaii. However, in southern Alaska and British Columbia, the warning was downgraded to an advisory shortly after 5am ET Sunday morning, Thiessen and Garcia said. They also noted that new advisories had been issued for parts of northern California and southern Oregon.
Ward told the Canadian Press that he was not surprised that the tsunami warning was short-lived in most areas, adding that tsunamis are not typically triggered by strike-slip movements along faults. He said that “a vertical movement of the sea floor” is required to displace water and create the massive waves. “Because it’s sliding across each other” in a strike-slip movement, “you’re not generally moving the water,” he added.
Brain Scans Used To Determine Content Of Dreams
Written By: editor
redOrbit Staff & Wire Reports – Your Universe Online
Can’t remember what you dreamt about last night? Never fear, because a team of Japanese researchers has reportedly discovered a way to determine what thoughts are going through a person’s mind while they sleep.
Yukiyasu Kamitani, a member of the ATR Computational Neuroscience Laboratories in Kyoto, and colleagues recruited three male volunteers and monitored them while they slept, using electroencephalography to record their brain waves and studying the results in search of “changes in activity which could be related to the content of their dreams,” explained Telegraph Science Correspondent Nick Collins.
When the researchers detected changes in the brain waves of the subjects — a sign that they had started dreaming — they woke up the subject and asked him what he had been dreaming about. The subject was then allowed to return to sleep. The process was repeated seven to 10 times per day, in three-hour blocks, for each participant, Mo Costandi of Nature added.
The researchers compiled approximately 200 dream-related reports from their volunteers, he added.
“Researchers reported that while some of the dreams were out of the ordinary — for example a discussion with a famous actor — most involved more mundane experiences from everyday life,” Collins said. “From the dream accounts they picked out 20 of the most commonly occurring themes, such as ‘car’, ‘man’, ‘woman’ and ‘computer’, and gathered pictures which represented each category.”
“The participants were then asked to view the images while their brains were scanned a second time,” he added. “By comparing the second set of brain activity data with the recordings made just before the volunteers had been woken up, the researchers were able to identify distinctive patterns in three key brain regions which help us process what our eyes see. They also found that activity in a number of other brain regions with more specialized roles in visual processing, for example in helping us recognize objects, varied depending on the content of the dreams.”
Those three brain areas — V1, V2, and V3 — are involved in the primary stages of visual processing, Costandi explained. They encode contrast, edge orientation, and other essential features of visual scenes.
Four years ago, Kamitani and his associates reported that they had figured out a way to decode the brain activity linked to the earliest stages of visual processing, and used that information to recreate images that they then showed to volunteers, the Nature writer said. Now, they’ve built upon those findings and adapted them to dreams.
Their work was presented earlier this month at the Society for Neuroscience’s annual meeting, held this year in New Orleans.
“We built a model to predict whether each category of content was present in the dreams,” Kamitani told Costandi. “By analyzing the brain activity during the nine seconds before we woke the subjects, we could predict whether a man is in the dream or not, for instance, with an accuracy of 75-80%.”
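The decoding approach Kamitani describes — predicting whether a content category appears in a dream from the preceding brain activity — can be illustrated with a toy nearest-centroid classifier. This is a deliberately simplified sketch: the 20-dimensional feature vectors below are synthetic stand-ins for real fMRI voxel patterns, and the study’s actual model was more sophisticated.

```python
import math
import random

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(samples):
    """samples: list of (feature_vector, label) pairs, label in {0, 1}
    (e.g. 1 = 'a man appeared in the dream'). Returns one centroid per class."""
    by_label = {0: [], 1: []}
    for vec, label in samples:
        by_label[label].append(vec)
    return {lbl: centroid(vecs) for lbl, vecs in by_label.items()}

def predict(model, vec):
    """Assign the label of the nearest class centroid."""
    return min(model, key=lambda lbl: distance(model[lbl], vec))

# Synthetic 'voxel activity': class 1 clusters near 1.0, class 0 near 0.0.
random.seed(42)
def sample(center):
    return [center + random.gauss(0, 0.3) for _ in range(20)]

train_set = [(sample(0.0), 0) for _ in range(50)] + [(sample(1.0), 1) for _ in range(50)]
test_set = [(sample(0.0), 0) for _ in range(25)] + [(sample(1.0), 1) for _ in range(25)]

model = train(train_set)
accuracy = sum(predict(model, v) == y for v, y in test_set) / len(test_set)
print(f"toy decoding accuracy: {accuracy:.2f}")
```

On cleanly separated synthetic clusters the toy classifier scores near perfectly; the study’s reported 75-80% accuracy reflects how much noisier real brain data is.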
“This is an interesting and exciting piece of work. It suggests that dreaming involves some of the same higher level visual brain areas that are involved in visual imagery,” added University of California, Berkeley, neuroscientist Jack Gallant, who was not involved in the study. “It also seems to suggest that our recall of dreams is based on short-term memory, because dream decoding was most accurate in the tens of seconds before waking.”
The Java mouse deer (Tragulus javanicus) is an even-toed ungulate found in forested areas of Java and possibly Bali. Its native range includes areas in Indonesia and Malaysia. It prefers higher elevations, although it does appear at lower elevations, between 1,312 and 2,296 feet. Because other chevrotains may occur on the edges of this mouse deer’s habitat, it is a logical assumption that it lives in thick undergrowth like other mouse deer.
The taxonomic status of the Java mouse deer is debatable, although it is currently classified as a mouse deer. Its scientific name was assigned by Grubb in 2005, although some experts suggest it is not accurate. Because there is no verified physical data with which to study the species, it has been placed in the genus Tragulus with the other mouse deer, or chevrotains. The name was long used to refer to large chevrotains, but these most likely do not occur on Java and therefore cannot be conflated with the Java mouse deer. Meijaard and Groves suggested in 2004 that this smaller chevrotain is unique to the island of Java, making it a species distinct from all other chevrotains, but this has not affected its current classification.
Future review on the classification of this species is needed, because there are many species of chevrotain, or mouse deer, and it is not known how many exist alongside the Java mouse deer. There have been reports of two smaller subspecies of the Java mouse deer, which are related in appearance. However, this is not enough information to classify any subspecies under the Java species. There have also been findings of larger chevrotains on the island, making it even more difficult to properly classify the Java mouse deer.
There have been no field studies pertaining to the Java mouse deer specifically, but populations are thought to be small due to a lack of sightings compared with other species. On the Dieng Plateau in Java, individuals were sighted in only five of ten study sites in 2008, leading experts to believe that the Java mouse deer is very shy in demeanor. One expert previously noted that the species was relatively abundant, so the lack of sightings could also reflect a decline in population numbers. Although a population decline is not the only possible explanation, the species is also becoming rare to find in some markets in Java.
Natural habitats in Java have become highly fragmented due to growing human populations. When the Dutch settled in Java, many protected areas were created, but after Java gained independence these went into decline until the 1970s. Beginning in 1982, after Indonesia hosted the World Parks Conference, many more protected areas were designated, and conservation efforts increased thanks to funding from donors including the World Bank. Other protected areas, such as game and nature reserves that did not receive much funding, remain in poor condition. Between the 1980s and 1990s, gun regulation caused an increase in snare and trap hunting, which may have negatively affected chevrotain populations. Habitat loss is not a major threat to this species, although some loss did occur due to illegal logging and agriculture. In 1997, after a large socio-political change affected citizens’ view of police and law, hunting and other illegal practices increased in protected areas.
The Java mouse deer is often found in markets in areas like Jakarta, Malang, Yogyakarta, and Surabaya. It is difficult to overlook the animals in markets, because they are typically kept in small cages. The number of individuals found in most markets is relatively high, conflicting with the findings that they are rare in the wild. The species has been trapped and hunted for many years to be sold as pets and game meat. In recent years, the Java mouse deer has not been spotted as often as in the past, leading some to believe that the species is becoming harder to trap and may therefore be in decline. Hunting is thought to be a possibly major threat to the species. Because the Java mouse deer is still considered abundant in markets, it is thought that large populations must still exist in Java, or that the individuals found are imports from other areas of Indonesia.
The Java mouse deer has been protected by law since 1931, but hunting does occur. Conservation efforts must begin with a clearing up of any taxonomic confusion associated with this species, including information about the origin of each studied specimen. It is thought that efforts to understand this species must include searching through secondary forests and forest edges, where many species are not sought. This mouse deer appears on the IUCN Red List with a conservation status of “Data Deficient” and more information is needed about its habits, range, and population before any changes to its status can occur.
Image Caption: Tragulus javanicus (Lesser Malay Mouse-deer) in the Jerusalem Zoo. Credit: Levg/Wikipedia (CC BY-SA 3.0)
Eye Drops And Nose Sprays Can Be Deadly To Children If Ingested
Written By: editor
redOrbit Staff & Wire Reports – Your Universe Online
The US Food and Drug Administration (FDA) is warning that some types of eye drops and nasal decongestant sprays could be seriously harmful to children if swallowed.
The offending over-the-counter health products can be poisonous if misused by young children, and can cause “serious health consequences” if they are consumed, FDA pharmacist Yelena Maslov said in a statement, according to CBS News writer Ryan Jaslow.
The federal agency said that it has received several reports of “serious health issues from kids who ingested products containing tetrahydrozoline, naphazoline and oxymetazoline,” Jaslow added. “Tetrahydrozoline is found in Visine Original, Walgreens Redness Reliever Advanced Eye Ophthalmic Solution and other products, while naphazoline is found in All Clear Ophthalmic Solution, Naphcon A Ophthalmic Solution and other products. Oxymetazoline is found in nasal spray brands including Afrin, Dristan and Sudafed sprays.”
In fact, Daniel J. DeNoon of WebMD Health News said that more than 4,500 children under the age of five had been injured by eyedrops between 1997 and 2009. During that same period, over 1,100 kids were injured by nasal sprays, according to statistics originating from the US Consumer Product Safety Commission (CPSC).
The medications, which DeNoon described as “surprisingly powerful,” can be consumed by children because they do not come in child-resistant packaging. The FDA told WebMD Health News that swallowing less than one-fifth of a teaspoon can cause serious harm to a child, including breathing issues, a decrease in heart rate, and a loss of consciousness. The CPSC said that symptoms can begin in as little as one hour.
A complete list of products that fall under the FDA’s warning can be found at the agency’s website.
According to HealthDay News, earlier this year the CPSC proposed a rule requiring child-resistant packaging for any product containing at least 0.08 milligrams of an imidazoline derivative. The rule has not yet been approved, though, and officials are urging parents to call the toll-free Poison Help Line at 1-800-222-1222 if they believe their children have accidentally ingested eye drops or nasal spray.
The agency says “parents should also practice safety when storing medications and potentially harmful substances,” Jaslow said. “Tips to reduce risk include storing medications in safe locations too high for children to reach, never leaving pills or vitamins out on counters, re-locking safety caps, not taking medication in front of children … and reminding guests to put purses, bags or coats away and out of sight when visiting homes with children.”
Research: More Exercise Means A More Satisfied Life
Written By: editor
Brett Smith for redOrbit.com — Your Universe Online
Feeling down in the dumps? Go for a run. Already run every day? Go for a longer run than usual.
That’s what a group of Penn State researchers are saying after conducting a study that examines the psychological benefits of exercise.
“We found that people’s satisfaction with life was directly impacted by their daily physical activity,” said Jaclyn Maher, Penn State graduate student in kinesiology. “The findings reinforce the idea that physical activity is a health behavior with important consequences for daily well-being and should be considered when developing national policies to enhance satisfaction with life.”
In the study, which was published in the journal Health Psychology, the Penn State team focused on young adults, ages 18 to 25, because these are the ages when people tend to experience the most life transitions and uncertainty.
“Emerging adults are going through a lot of changes; they are leaving home for the first time and attending college or starting jobs,” said Maher. “As a result, their satisfaction with life can plummet. We decided to focus on emerging adults because they stand to benefit the most from strategies to enhance satisfaction with life.”
Study participants were divided into two different groups. The larger first group, of 190 volunteers, was told to enter information into a diary every day for 8 days. The smaller second group, consisting of 63 participants, entered information online every day for 14 days.
Both groups were asked to answer questions about their satisfaction with life, amount of regular physical activity, and self-esteem. To establish a baseline reading, all participants in the first group were assessed at the outset of the study to determine their personal dispositions.
The second group was studied to see if different amounts of physical activity had an effect on participants’ satisfaction with life. Researchers also measured the group’s relative mental health, fatigue and Body Mass Index.
“Shifts in depression, anxiety and stress would be expected to influence a person’s satisfaction with life at any given point in time,” said David Conroy, professor of kinesiology at the university. “In addition, fatigue can be a barrier to engaging in physical activity, and a high Body Mass Index associated with being overweight may cause a person to be less satisfied in a variety of ways.”
Based on their findings, the researchers confirmed that a greater amount of physical activity improves satisfaction with life.
“Based on these findings, we recommend that people exercise a little longer or a little harder than usual as a way to boost satisfaction with life,” said Conroy.
The Penn State study coincides with two other new studies that illustrate the benefits of exercise. According to a report in the Journal of Applied Physiology, Taiwanese researchers showed that regular physical activity could mitigate the effects of aging on the animal brain by studying laboratory mice that spent regular time on a moving treadmill.
Another study from University of Georgia researchers showed regular exercise can stimulate the development of new mitochondria within the body’s cells, resulting in increased energy levels over time.
Anesthesia More Similar To Sleep Than Originally Understood
Written By: editor
Connie K. Ho for redOrbit.com — Your Universe Online
Close your eyes, inhale slowly, and then exhale. Lie back and feel your body slowly relax as you melt into the cushion. Everything fades away as darkness slowly takes over and a feeling of peace passes over.
For those who love sleep, the aforementioned sentiments might seem familiar. They might even seem familiar to those who have been anesthetized for surgery, where the brain turns off and tunes into sleep mode.
In particular, a new study published in the current edition of Current Biology, a Cell Press publication, found that anesthetic drugs not only switch wakefulness “off” but also actively switch the brain’s sleep circuits “on.”
Max Kelz, an anesthesiologist, was interested in finding out about the state of sleep for patients who are put under anesthesia.
“Despite more than 160 years of continuous use in humans, we still do not understand how anesthetic drugs work to produce the state of general anesthesia,” explained Kelz, a researcher at the University of Pennsylvania, in a press release. “We show that a commonly used inhaled anesthetic drug directly causes sleep-promoting neurons to fire. We believe that this result is not simply a coincidence. Rather, our view is that many general anesthetics work to cause unconsciousness in part by recruiting the brain’s natural sleep circuitry, which initiates our nightly journey into unconsciousness.”
Anesthetic unconsciousness does differ from natural sleep, however: a person in deep natural sleep on any given night can be aroused, while a patient under anesthesia remains deeply unconscious through even the most invasive surgery.
“General anesthetics have been used to manipulate consciousness in patients for nearly 170 years, but it is still not known how these drugs impart hypnosis. At the molecular level, the number of possible effector sites is staggering: dozens of molecules are known to be sensitive to anesthetic agents,” wrote the researchers in the introduction of the article.
The researchers studied a region deep inside the hypothalamus whose activity increases when an individual falls asleep. They discovered that isoflurane, a type of anesthetic drug, was able to increase activity in this sleep-promoting brain area in mice. Using a mix of direct electrical recording and other methods, they determined that animals that lacked the function of those specific neurons became more resistant to falling into a state of sleep under anesthesia.
“This work demonstrates that anesthetics are capable of directly activating endogenous sleep-promoting networks and that such actions contribute to their hypnotic properties,” noted the researchers in the article.
The findings, the researchers believe, bring the workings of anesthesia into clearer focus, and they intend to study the topic further.
“The development of anesthetic drugs has been hailed as one of humankind’s greatest discoveries in the last thousand years,” concluded Kelz in the statement. “Anesthetics are annually given to over 230 million patients worldwide. Yet as a society, and even within the anesthesia community, we seem to have lost our curiosity for how and why they work.”
Adding Microbes To Water To Benefit Consumers
Written By: editor
Brett Smith for redOrbit.com – Your Universe Online
The award-winning Global Challenges/Chemistry Solutions podcast produced by the American Chemical Society (ACS) has consistently been putting forth groundbreaking, research-based solutions to problems facing people around the world.
The latest episode, based on a paper recently published in the ACS journal Environmental Science & Technology, explains how water filtration systems might be used to encourage the growth of beneficial microbes in “purified” drinking water that would benefit consumers and outcompete harmful bacteria.
“Municipal drinking water treatment plants also add chlorine or other disinfectants to kill bacteria and prevent them from thriving in water distribution pipes,” Lutgarde Raskin, a co-author of the journal report and professor of environmental engineering at the University of Michigan, explained in the podcast.
“Even with disinfection, it’s not possible to totally eliminate bacteria, which makes it important to determine how (filtration) and other water treatment steps impact the types and amounts of bacteria that remain,” Raskin said. “That’s why we set out to do this in a study at a drinking water treatment plant in Ann Arbor, Michigan.”
In their study, Raskin and her team produced some surprising results they said could be used to improve how drinking water is filtered.
“We found that certain types of bacteria attach to the filters when they form biofilms, from which small clumps can break off and make it into the drinking water supply,” Raskin said. “But what’s surprising in our results is that the majority of the bacteria that ended up in the finished water originated from the filter and not from the river and the well waters that were used as source waters.”
“This finding provides us with the opportunity to select for beneficial bacteria in drinking water,” she added.
The podcast, which incidentally coincides with the 40th anniversary of the Clean Water Act, went on to assert that pH modulation could play a role in selecting which bacteria end up in filtered water, since pH was the strongest factor affecting the bacterial populations.
The results of the study could have massive ramifications not only for American water plant facilities, but also for those in developing nations where access to clean water still remains a problem.
Jacques Morisset, an economist with the World Bank, wrote in a recent blog post that this problem is particularly pronounced in certain parts of Tanzania, even though it has “three times more renewable water resources than Kenya.”
“Few households have access to clean drinking water from a piped source,” he wrote. “Only a small fraction of rural households can access water to irrigate their farms.”
The World Health Organization (WHO) is also sounding the alarm with regard to clean water access. According to the organization, 80 percent of all illnesses in developing countries are the result of unsafe drinking water.
“At any given time, one-half of all people in the developing world are suffering from one, or more of the six main diseases (diarrhea, ascaris, dracunculiasis, hookworm, schistosomiasis, trachoma),” according to a recent statement from the WHO.
Is Moderate Drinking As Bad For Brain As Binge Drinking?
Written By: editor
Lee Rannals for redOrbit.com — Your Universe Online
There have been numerous scientific studies that show how a glass or two of red wine can help promote cardiovascular and brain health. However, a new study by Rutgers University indicates that there is a fine line between moderate and binge drinking.
The study published in the journal Neuroscience found that moderate to binge drinking significantly reduces the structural integrity of the adult brain.
“Moderate drinking can become binge drinking without the person realizing it,” lead author Megan Anderson said in a press release. “In the short term there may not be any noticeable motor skills or overall functioning problems, but in the long term this type of behavior could have an adverse effect on learning and memory.”
The team modeled moderate to heavy drinking in humans using rodents that reached a blood alcohol level of 0.08 percent and found that brain cell production was affected negatively.
They discovered that at this level of intoxication, the number of nerve cells in the hippocampus of the brain was reduced by about 40 percent compared with those in the abstinent group of rodents. This part of the brain is known to be necessary for some types of new learning.
Alcohol intake was not enough to impair the motor skills of either male or female rats, or prevent them from associative learning in the short-term. However, Anderson said that this substantial decrease in brain cell numbers over time could have profound effects on the structural plasticity of the adult brain because these new cells communicate with other neurons to regulate brain health.
“If this area of your brain was affected every day over many months and years, eventually you might not be able to learn how to get somewhere new or to learn something new about your life,” Anderson, a graduate fellow in the Department of Neuroscience and Cell Biology, said in the press release. “It’s something that you might not even be aware is occurring.”
Men who drink 14 drinks a week and women who drink seven are considered at-risk drinkers, according to the National Institute on Alcohol Abuse and Alcoholism (NIAAA). This group says that 70 percent of binge drinking episodes involved adults who were 26 years old or older.
“This research indicates that social or daily drinking may be more harmful to brain health than what is now believed by the general public,” she said in the release.
Connie K. Ho for redOrbit.com — Your Universe Online
Researchers from the University of South Florida (USF) recently identified a gene related to hearing loss in older Americans. Thirty million individuals in the U.S. suffer from hearing loss, and the new findings will assist in the development of preventive efforts for this population.
The discovery of a genetic biomarker for age-related hearing loss is nine years in the making. Robert Frisina Sr. and Robert Frisina Jr., both of USF, were interested in finding out one of the causes of long-term permanent hearing loss. As such, the study became a collaborative effort between USF’s Global Center for Hearing and Speech Research and Rochester Institute of Technology’s (RIT) National Technical Institute for the Deaf.
In particular, the scientists identified a genetic biomarker for presbycusis; the genetic mutation related to hearing loss can eventually affect a person’s ability to process speech. The researchers from USF and RIT worked with the House Ear Institute to find a gene that creates an important protein in the cochlea, the inner ear. The protein, known as glutamate receptor metabotropic 7 (GRM7), helps convert sound into the code used by the nervous system. The brain then utilizes that code for hearing and speech processing purposes.
“This gene is the first genetic biomarker for human age related hearing loss, meaning if you had certain configurations of this gene you would know that you are probably going to lose your hearing faster than someone who might have another configuration,” explained Robert Frisina Jr., a professor at the USF College of Engineering, in a prepared statement.
The study included DNA analyses by the University of Rochester Medical School and RIT. A total of 687 people participated in the study and completed three hours of examinations regarding their hearing abilities. Testing included observations of speech processing and analyses of genetic material.
The gene appeared to have different effects in males and females. It had a negative impact for men, but a positive one for women, who reported better-than-average hearing in their later years. The difference between males and females relates to a 2006 finding by the Frisina research group that the hormone aldosterone affects hearing capabilities.
The researchers believe that the gene can help people understand how to protect their hearing. They noted that people can stave off hearing loss with simple measures such as avoiding loud noises, avoiding particular medications known to cause hearing damage, and wearing ear protection. The scientists now understand that presbycusis is caused by a number of different genetic and environmental factors.
“Age-related hearing loss is a very prevalent problem in our society. It costs billions of dollars every year to manage and deal with it. It’s right up there with heart disease and arthritis as far as being one of the top three chronic medical conditions of the aged,” noted Robert Frisina Jr., who also helped found the Global Center for Hearing and Speech Research, in the statement.
The results of the project were recently featured in the journal Hearing Research.
CPR Survival More Likely In Wealthy White Neighborhoods
In a large, first-of-its-kind US study, conducted by the Centers for Disease Control and Prevention (CDC), it has been discovered that people who suffer cardiac arrest in poorer, predominantly black neighborhoods are half as likely to receive CPR from a bystander as those in richer, mostly white neighborhoods.
The study, published in Thursday's edition of the New England Journal of Medicine (NEJM), shows that even in well-to-do black neighborhoods, cardiac arrest victims are 23 percent less likely to receive CPR from a bystander. The study also found that, overall, blacks and Latinos were less likely to receive aid, regardless of the neighborhood's economic status.
Study author Comilla Sasson, MD, an emergency room physician at the University of Colorado Hospital said if a person has a heart attack “in a neighborhood that is 80 percent white with a median income over $40,000 a year, [that victim has] a 55 percent chance of getting CPR.” But in a predominantly black and poor neighborhood, the victim has only “a 35 percent chance.”
“Life or death can literally be determined by what side of the street you drop on,” said Sasson in a press release.
Although race is a factor, Sasson believes “it's socioeconomic status that matters more than racial composition” when it comes to saving a life.
Close to a third of a million people collapse from cardiac arrest each year in the US, and previous research has suggested that ethnic or socioeconomic conditions influence the chance that a bystander will perform CPR.
“We’ve seen for many years that certain communities have a higher likelihood of a patient getting CPR,” study coauthor Dr. Bryan McNally of Emory University in Atlanta, told Reuters Health. “This is pointing out that within communities there is variation in the local or neighborhood area.”
The findings of the study are based on 14,225 cases of cardiac arrest from 29 non-rural parts of the US. Using census data, the researchers separated these sites into two categories: high income, where household income was $40,000 and up; and low income. Each neighborhood was given an ethnic classification if that group made up more than 80 percent of the population.
Once baselines for the study were set, Sasson and her colleagues teased out the results. They found the overall chance of a victim receiving CPR from a bystander was about 29 percent. The percentage was highest (55 percent) in wealthy, predominantly white neighborhoods. However, if that same person crossed the street into a poorer, mostly black neighborhood, his or her chance of receiving CPR dropped to 35 percent.
Sasson said information received from focus groups suggests the reason your survival odds are lower in poorer neighborhoods is the cost of learning CPR. Most classes cost $250, which is a large chunk out of someone's monthly budget, especially if they are making less than $20,000 a year, she noted. “A lot of folks would love to learn it, but they can't.”
The study further found that cardiac arrest victims “who received bystander CPR were more likely to be male and white. Black and Latino patients were less likely to receive CPR,” said McNally. “The association was most apparent in low-income black neighborhoods where the odds of receiving bystander CPR was 50 percent lower than that of a high-income non-black neighborhood.”
Because of their findings, Sasson, McNally, and their colleagues say there needs to be a commitment to increase CPR training efforts for all people. In the past, CPR training required multiple lessons, was intimidating, and was offered only in conventional settings. Nowadays, it is much faster, simpler and easier to learn and remember.
“Rather than widely blanketing the entire U.S. with CPR training, a targeted, tailored approach in these ‘high-risk’ neighborhoods may be a more efficient method, given limited resources,” said McNally.
And the call for more training is well founded: CPR given in the first few minutes after a cardiac arrest is crucial for survival, and EMTs often take several minutes to arrive on scene, making it all the more important for bystanders trained in CPR to step up and potentially save a life.
Currently, survival rates vary greatly from city to city. In Seattle, the cardiac arrest survival rate is 16 percent. Yet in Detroit, the survival rate is only 0.2 percent.
It isn't exactly clear why differences like these exist, noted Sasson, although Seattle's widespread culture of CPR training most likely helps, along with the fact that its citizens are predominantly white and more well-to-do than those in Detroit.
As an ER physician, Sasson said she sees her share of cardiac arrest victims each year. “I would see African-Americans coming in and dying from cardiac arrests after having laid there for 10 minutes with no one delivering CPR.”
“There is no reason in 2012 that this kind of disparity exists – that you live or die depending on what side of the street you drop on. It is simply unacceptable,” she concluded.
Opposing Ice: Antarctic Grows While Arctic Ice Cap Shrinks
Brett Smith for redOrbit.com — Your Universe Online
Over the past few years, researchers have consistently shown an overall decrease in the size of the Arctic ice cap–particularly during the summer months when the most melting occurs.
However, a new study from NASA scientists has shown that this melt-off around the Arctic is accompanied by record expansion of sea ice around Antarctica, which occurs when the Southern Hemisphere is experiencing its coldest temperatures of the year.
According to the study, from 1978 to 2010 the total amount of sea ice around the southernmost continent grew by about 6,600 square miles every year. Previous studies from the same research group demonstrated that this rate of expansion has increased in recent years, up from 4,300 square miles per year between 1978 and 2006.
“There’s been an overall increase in the sea ice cover in the Antarctic, which is the opposite of what is happening in the Arctic,” said lead author Claire Parkinson, a climate scientist with NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “However, this growth rate is not nearly as large as the decrease in the Arctic.”
Part of this bi-polar story can be told by the possible changes in atmospheric circulation being driven by the hole in the ozone layer above Antarctica. Recent research has suggested that a lack of ozone, which absorbs solar energy, leads to a cooler stratosphere over the South Pole. As this effect works in combination with increasingly warmer temperate zones, it could drive stronger winds across the Ross Ice Shelf, where the biggest growth in Antarctic sea ice has been observed.
“Winds off the Ross Ice Shelf are getting stronger and stronger, and that causes the sea ice to be pushed off the coast, which generates areas of open water, polynyas,” said Josefino Comiso, a senior scientist at Goddard. “The larger the coastal polynya, the more ice it produces, because in polynyas the water is in direct contact with the very cold winter atmosphere and rapidly freezes. As the wind keeps blowing, the ice expands further to the north.”
While the sea ice during the austral winter appears to be expanding at a previously unseen rate, this does not disprove global warming, according to Parkinson.
“Climate does not change uniformly: The Earth is very large and the expectation definitely would be that there would be different changes in different regions of the world,” Parkinson said. “That’s true even if overall the system is warming.”
Another recent NASA study supports Parkinson's assertion, showing that Antarctic sea ice slightly thinned from 2003 to 2008. However, this thinning was taken into account during the latest study as increases in the extent of the ice more than compensated for the loss in thickness and led to an overall gain in volume.
The new study was the first to incorporate laser altimetry from the Ice, Cloud, and land Elevation Satellite (ICESat). The satellite could prove extremely useful in future Antarctic research, as the logistics of analyzing sea ice on location have obvious complications. Currently, icebreakers are used to gather sea ice thickness information during the warmer months, when sailing conditions are optimal.
There are some wild diets out there. As many of us attempt to either lose weight or simply eat healthier, we'll try our hands at diets of all kinds: all meat, no meat; all vegetables, only specific vegetables; foods in a specific order, foods in a specific season.
There's a diet for any kind of person and personality, no matter how odd or strange. One such diet, the raw food diet, has come under plenty of scrutiny for its strict regulations. While there are different varieties of the raw food diet (including vegan, vegetarian, and even animal-based versions), the basic rule is the same: food is not to be cooked to a temperature above 115 degrees Fahrenheit. Not only does this diet sound like a giant bummer to many cheeseburger- and taco-loving humans, a new study this week suggests it could potentially starve a dieter's brain. What's more, the study also found that Homo erectus began to develop more rapidly once cooking was finally mastered. According to Suzana Herculano-Houzel, a neuroscientist at the Federal University of Rio de Janeiro in Brazil and co-author of the report, a human eating the raw diet would have to eat for nine hours straight just to consume enough energy to feed the brain.
“If you eat only raw food, there are not enough hours in the day to get enough calories to build such a large brain,” said Herculano-Houzel, speaking to Science Magazine.
“We can afford more neurons, thanks to cooking.”
One of the many ways in which we are different from our primate cousins is the number of neurons found in our brains. According to the study, humans have nearly 86 billion neurons, on average, in their brains. Gorillas, on the other hand, have roughly 33 billion neurons, with chimps falling slightly behind at 28 billion. As the saying goes, there's no such thing as a free lunch, and while humans have larger brains packed with more neurons, our brains consume 20 percent of our body's energy even when we are sitting still.
In order to test this theory about raw food diets, Herculano-Houzel and her graduate student, Karina Fonseca-Azevedo, now a neuroscientist at the National Institute of Translational Neuroscience in São Paulo, Brazil, began by measuring the number of neurons in 13 species of primates and more than 30 species of other mammals. To begin with, the team confirmed that brain size is directly linked to the number of neurons.
The team also discovered that a brain with more neurons needed more calories to sustain itself than a brain with fewer neurons.
Setting to work, Herculano-Houzel and Fonseca-Azevedo crunched the numbers to determine how long a mammal would need to eat a raw food diet each day in order to feed its brain: gorillas would have to eat uncooked food for 8.8 hours to sustain their brains, orangutans for 7.8 hours, and chimps for nearly 7.3 hours.
While humans on the raw food diet often add supplements, such as proteins and other nutrients, in order to live a healthier lifestyle, primates don't have the same luxury. Therefore, unless primates eat raw food for hours at a time or begin to cook their own food, they'll never be able to grow their brains to our size.
“The reason we have more neurons than any other animal alive is that cooking allowed this qualitative change–this step increase in brain size,” concludes Herculano-Houzel.
“By cooking, we managed to circumvent the limitation of how much we can eat in a day.”
This study can be seen as good news for anyone worried about a primate takeover, or for anyone looking for a new counterpoint to someone pushing a raw food diet.
Sure, their ads promote a certain “extreme” lifestyle, full of rock and roll and motorbikes, but new reports issued to the US Food and Drug Administration (FDA) now link Monster high-energy drinks to the deaths of five people in the last year alone, according to a report from Bloomberg.
According to the reports, each of the five victims had consumed Monster energy drinks just before their deaths. Shelly Burgess, a spokesperson for the FDA, told Bloomberg that each of these reports was submitted voluntarily and will be considered an allegation until the agency can investigate further.
Now, these reports are set to be used in a lawsuit against Monster by the parents of a 14-year-old girl who claim their daughter died of caffeine toxicity after drinking too many of the high-energy drinks.
This issue isn't a new one: cases involving such high-energy drinks jumped tenfold from 2005 to 2009.
“FDA continues to evaluate the emerging science on a variety of ingredients, including caffeine,” said Burgess in an emailed statement to Bloomberg.
According to Mail Online, Anais Fournier, a Maryland teen, had consumed two 24-ounce cans of Monster energy drink last year, just two days before Christmas. Fournier later went into cardiac arrest, and her autopsy revealed that the 14-year-old had died of cardiac arrhythmia due to caffeine toxicity, which had limited her heart's ability to pump blood.
A medical examiner later discovered that Fournier also had a preexisting condition which made her blood vessels weaker than normal.
Ms. Fournier's parents, Wendy Crossland and Richard Fournier, spoke with the Record Herald about their California lawsuit against Monster.
“I was shocked to learn the FDA can regulate caffeine in a can of soda, but not these huge energy drinks,” said Crossland.
“With their bright colors and names like Monster, Rockstar, and Full Throttle, these drinks are targeting teenagers with no oversight or accountability. These drinks are death traps for young, developing girls and boys, like my daughter, Anais,” she added.
Each 24-ounce can (the smallest size offered) contains not only twice as much liquid as a normal soda can but also 240 milligrams of caffeine, seven times the amount found in a normal 12-ounce can of soda.
While there are other energy drinks available, like the aforementioned Full Throttle and Rockstar, Monster sells the lion's share of energy drinks on the market.
According to Bloomberg, Monster sold $1.6 billion worth of highly caffeinated beverages last year. Business has been booming for the company as well: its sales have tripled since 2006.
“Over the past 16 years Monster has sold more than 8 billion energy drinks, which have been safely consumed worldwide,” explained a Monster spokesperson in an emailed statement to Bloomberg.
“Monster does not believe that its beverages are in any way responsible for the death of Ms. Fournier. Monster is unaware of any fatality anywhere that has been caused by its drinks.”
A trial date for the case between Crossland, Fournier and Monster has yet to be set, as lawyers for the Maryland parents continue to investigate similar cases.
Older People Who Exercise Have Less Brain Shrinkage
Physical exercise at any age is good for the body. But those who are later in their years may have a better reason to continue exercising than worrying about their figure. According to a study published in the latest issue of the journal Neurology, researchers have found that people over 70 who remain physically active have less brain shrinkage than those who do little or no exercise.
Experts from Edinburgh University say that people who exercise in their 70s may not only be slowing their brains' shrinkage, but also lowering their risk of dementia. And the good news is that exercise doesn't have to be strenuous, they noted–just going for a walk several times per week suffices.
However, the experts found no evidence that participation in social and mentally stimulating activities, such as playing a game with a friend, or doing a tricky crossword, contribute to overall brain health in older people.
“People in their seventies who participated in more physical exercise, including walking several times a week, had less brain shrinkage and other signs of aging in the brain than those who were less physically active,” said study author Alan J. Gow, PhD. “On the other hand, our study showed no real benefit to participating in mentally and socially stimulating activities on brain size, as seen on MRI scans.”
Gow and his colleagues studied brain scans over a three-year period of 638 people born before 1936. The group gave the researchers details about their exercise habits, ranging from just completing necessary household chores to keeping fit with heavy exercise and competitive sports several times per week.
Previous research has shown that exercise helps reduce the risk of dementia and can slow its onset. But scientists are still unclear why. Exercise does increase blood flow to the brain, delivering doses of oxygen and nutrients to brain cells, which could be an important factor in keeping dementia away.
Researchers are also not clear on another aspect of the study: Are people's brains shrinking because they are not exercising, or are people less inclined to exercise due to brain shrinkage?
One thing's for sure: exercising is an easy thing to do, and it is beneficial not just for your body, but for your mind as well.
“This study links physical exercise to fewer signs of ageing in the brain, suggesting that it may be a way of protecting our cognitive health,” noted Dr Simon Ridley, head of research at Alzheimer’s Research UK. “While we can’t say that exercise is the causal factor in this study, we do know that exercise in middle age can lower the risk of dementia later in life.”
“It will be important to follow these volunteers to see whether these structural features are associated with greater cognitive decline over the coming years. More research is also needed to tease out how physical activity might be having a beneficial effect,” he told BBC's Michelle Roberts.
“This research is exciting as it provides vital clues as to what impacts the way our brain ages and how we could tackle mental decline. If we can establish definitively that exercise provides protection against mental decline, it could open the door to exercise programs tailored to the needs of people as they age,” added Professor James Goodwin, Head of Research at Age UK.
“We already know that exercise is important in reducing our risk of some illnesses that come with ageing, such as cardiovascular disease and cancer. This research reemphasizes that it really is never too late to benefit from exercise, so whether it’s a brisk walk to the shops, gardening or competing in a fun run it is crucial that, those of us who can, get active as we grow older,” said Goodwin.
Health Organization Recommends Safety Policies For Cheerleaders
Written By: editor
Connie K. Ho for redOrbit.com — Your Universe Online
On a Friday night, the local high school stadium is filled to the brim with students, parents, and teachers watching a football game. When the referee blows the whistle, the players head back to the sidelines to take a break. As the jocks trickle in from the field, a group of girls dressed in short skirts and waving pom poms runs onto the field, shouting cheers, completing back flips, and even being tossed in the air. The moves, a mix of dance and gymnastics elements, are so difficult that a girl could be injured at any moment. Because of this risk, the American Academy of Pediatrics (AAP) recently published a statement encouraging coaches, parents, and school administrators to limit injuries by following the organization's safety guidelines and by creating emergency plans that give cheerleaders access to coaches, medical care, and injury surveillance should an issue arise.
What started as a way to lead audiences in cheers at football games has evolved into a competitive, year-round sport.
“Not everyone is fully aware of how cheerleading has evolved over the last couple of decades. It used to be just standing on the sidelines and doing cheers and maybe a few jumps,” Dr. Cynthia LaBella, a co-author of the new policy who works as a sports medicine specialist at Chicago’s Lurie Children’s Hospital, told ABC News.
Despite a surge in the popularity of the sport, there have been a number of serious injuries that have occurred. The AAP reports that, since 2007, 26,000 injuries related to cheerleading have occurred in the U.S. every year. As well, over the past 25 years, cheerleading has caused 66 percent of all “catastrophic injuries” for female high school athletes. The AAP released the statement at the AAP National Conference and Exhibition in New Orleans as well as published it in the November issue of Pediatrics.
“Cheerleading has become extremely competitive in the past few years, incorporating more complex skills than ever before,” commented LaBella, who serves as a member of the AAP Council on Sports Medicine and Fitness, in a prepared statement. “Relatively speaking, the injury rate is low compared to other sports, but despite the overall lower rate, the number of catastrophic injuries continues to climb. That is an area of concern and needs attention for improving safety.”
According to the AAP, cheerleading should be considered a sport so that cheerleaders can receive necessary protections like qualified coaches, facilities maintained to a certain standard, mandated sports physicals, surveillance of injuries, and consultation with certified athletic trainers. Currently, only 29 U.S. states recognize cheerleading as a high school sport, and it is not included under the National Collegiate Athletic Association.
“The reason we promote having all the states recognizing it as a sport is because it would be a simple way to provide these services for the cheerleaders,” LaBella told Reuters.
In addition, the AAP reports that a majority of the injuries are sprains and strains of the lower extremities, followed by injuries to the head and neck. Ninety-six percent of concussions and 42 to 60 percent of all injuries are related to the physically demanding skills of cheerleading, which feature stunts like pyramid building as well as lifting, catching, and tossing cheerleaders. Injuries also tend to increase as cheerleaders get older: cheerleaders at the collegiate level have the highest rate of injury compared to cheerleaders in middle school and high school. Risk factors that can increase the chance of injury include performance on hard surfaces, insufficient coaching, previous injury, complicated stunts, and higher body mass index.
“Most serious injuries, including catastrophic ones, occur while performing complex stunts such as pyramids,” noted Dr. Jeffrey Mjaanes, a member of the AAP Council on Sports Medicine & Fitness who helped co-author the new guidelines, in the prepared statement. “Simple steps to improve safety during these stunts could significantly decrease the injury rate and protect young cheerleaders.”
Others disagree with some points made by the AAP. Jim Lord, executive director of the American Association of Cheerleading Coaches and Administrators (AACCA), told Reuters that he believes that cheerleading should not be listed as a sport because the athletes do not compete regularly, a requirement for some states and organizations. As well, while the AAP recommended that pyramids and stunts only be performed with foam or spring boards, Lord stated that it would be difficult to do so. However, Lord believes that it should be possible to come to an agreement with the AAP.
“Most of this is actually stuff we’ve recommended for quite a while,” Lord told Reuters.
Furniture Maker IKEA Invests In Renewables To Become Self-Sufficient By 2020
As energy costs soar, many businesses are looking for ways to become more energy efficient. One company is looking to become completely self-sufficient by 2020, relying only on renewable resources to power its factories, stores and offices.
IKEA, the world's largest home furnishings retailer, plans to invest as much as $2 billion in wind farms and solar parks to increase its reliance on renewables and steer away from volatile fossil fuels. The move should help the Swedish furniture maker get 70 percent of its energy from renewable resources by 2015, the company's chief sustainability officer (CSO), Steve Howard, said in a phone interview with Business Week's Alex Morales.
IKEA said it also wants to limit sales to energy-efficient products by 2016, to meet the growing demand of customers who want more green options. IKEA said it will focus on green products such as induction cookers and LED light bulbs.
In embracing the renewable aspect, IKEA said it plans to install rooftop solar panels, develop wind farms, replace incandescent light bulbs with more efficient LEDs, and vowed to plant at least as many trees as it uses in the production of its furniture by 2020.
Howard said “each roof is a power station in the making,” referring to rooftop solar panels. The company has so far installed solar panels on 34 of its 38 US stores and distribution centers. The US has “fantastic sun … as good as anywhere in the world,” added Howard.
Howard also noted that many regions in the US have great wind potential, and IKEA would like to incorporate such resources there as well. However, the policy environment in the US is somewhat shaky at the moment, he said, adding that production tax credits for the US wind industry are set to expire in December unless Congress renews them.
IKEA is not the only company with renewable goals, however. Wal-Mart set similar goals for its stores and distribution centers in the US. Although there is no set timeline for when it will become self-sufficient, it ranks first among US companies for solar power generation, according to a recent survey by the Solar Energy Industries Association (SEIA). Costco ranks second, followed by Kohl's and then IKEA; Macy's ranks fifth.
“Efficiency makes sense and it makes more sense now than ever before,” Howard said.
Other companies are also joining in the fight to end reliance on fossil fuels. Sportswear maker Puma SE and PepsiCo are also expanding efforts to go green. As prices for wind turbines and solar panels have declined in recent years, moving to greener pastures is becoming extremely cost-effective, according to Bloomberg New Energy Finance.
“By producing as much renewable energy as we use through the system, we contribute to development in society and make ourselves even more competitive,” said IKEA CEO Mikael Ohlsson. “This will be a great driver of innovation.”
He told Reuters he had no doubt the company's “People & Planet Positive” strategy would save money both for IKEA and its clients, although he declined to estimate total savings.
Under this plan, IKEA plans to be 70 percent self-sufficient by 2015, and by 2020 it would produce as much renewable energy as it consumes.
IKEA already owns wind farms in six European countries and has more than 342,000 solar panels on its buildings, which already generate 27 percent of the company's electricity.
Howard said the company is a little under halfway to its 2015 goal in terms of investments. The company will also halve the greenhouse gas emissions from its operations by 2015, from 2010 levels.
Joel Makower, executive editor of GreenBiz.com, which covers corporate sustainability efforts, told USA Today's Wendy Koch it's really good that “a company is trying to get its own house in order, but its house is more than its buildings.”
He added that more than 90 percent of the total energy that retailers use is embedded in the supply chain–the making and delivering of parts and products. Renewable energy is no longer just about scoring points with the consumer. “It’s about mitigating risks,” he said, referring to the uncertainties of energy prices and supplies.
Howard agreed. He said IKEA believes energy independence is “the right thing to do,” not only because it’s concerned about climate change but also because it wants to protect itself against higher energy prices in the future. “Sustainability will decide the winners and losers in the business community,” he said.
Environmentalists are backing IKEA´s ambitious goals.
John Sauven, head of Greenpeace UK, said IKEA is at the “forefront of leading companies” trying to become self-sufficient in the face of environmental concerns.
The furniture maker´s plan is a roadmap to a “clean industrial revolution,” added Mark Kenber, head of the UK-based think-tank Climate Group. He urges that other businesses follow in IKEA´s footsteps.
IKEA noted that other environmental experts were also praising its renewable efforts, including the World Wildlife Fund.
What sets IKEA apart from other companies moving in similar directions is the fact that the furniture giant has the freedom to act on its plans without investor backlash. IKEA is owned by a private foundation and is not listed on the stock market. What this means, said Ohlsson, is “our whole focus is customers throughout the chain and not stock exchange and owners.”
Increase In Obesity, Decrease In Sleep Due To TV And Other Devices In Kids’ Rooms
Written By: editor
Connie K. Ho for redOrbit.com — Your Universe Online
A study by researchers from the University of Alberta has found that children lose sleep and develop bad lifestyle habits when there are electronic devices in the bedroom.
The study followed a group of Grade 5 students in Alberta, and the researchers discovered that as little as one more hour of sleep decreased the likelihood of a child being overweight or obese. The researchers defined the electronic devices that affected children in the bedroom as items like computers, cell phones, televisions and video games; those who had one or more of these items had a greater likelihood of being overweight or obese.
“If you want your kids to sleep better and live a healthier lifestyle, get the technology out of the bedroom,” explained the study's co-author Paul Veugelers, a professor in the School of Public Health, in a prepared statement.
The scientists believe that this is the first study to look at the connection between diet, physical activity and sleep among children in relation to electronic devices. Almost 3,400 students participated in the study, detailing their nighttime sleep routine as well as the number of opportunities available to use electronics. The results showed that half of the students had access to a DVD player, a television or a video game console, while five percent of kids had all three items. As well, 21 percent had a computer and 17 percent owned a cell phone.
Based on the findings, the team of investigators discovered that children who could use at least one of the electronic devices were 1.47 times as likely to be overweight as kids who had no access to any electronic devices. For children who had all three devices, the likelihood increased to 2.57 times. Increased sleep was also associated with a heightened level of physical activity and healthier diet choices.
The researchers believe that children today are not sleeping as much as kids in previous generations; specifically, two-thirds of children are not getting the recommended number of hours of sleep. A healthy amount of sleep has been linked to a healthier lifestyle, more academic success and fewer problems related to moodiness.
“It's important to teach these children at an earlier age and teach them healthy habits when they are younger,” commented the study's co-author Christina Fung. The study was published in the September edition of the journal Pediatric Obesity.
The research addresses concerns plaguing the health system in the United States, where rates of childhood obesity are skyrocketing. According to the U.S. Centers for Disease Control and Prevention (CDC), the numbers have tripled since 1980. Approximately 17 percent of children and adolescents between the ages of two and 19 (some 12.5 million) are considered obese. Along with factors like sleep, the CDC pinpoints racial and ethnic differences as contributing to the obesity issue. From 2007 to 2008, more Hispanic boys between the ages of two and 19 were found to be obese than non-Hispanic white boys; similarly, non-Hispanic black girls had a higher likelihood of being obese than non-Hispanic white girls.
Cyberbullying Rarely Sole Factor In Teen Suicides
Written By: editor
Lee Rannals for redOrbit.com — Your Universe Online
New research reported at the American Academy of Pediatrics (AAP) National Conference and Exhibition shows that cyberbullying is rarely the sole factor in teen suicides.
The team searched for reports of youth suicides where cyberbullying was a reported factor, logging information about demographics and the event itself through online news media and social networks.
They also used descriptive statistics to assess the rate of pre-existing mental illness, the occurrence of other forms of bullying, and the characteristics of the electronic media associated with each suicide case.
The team identified 41 suicide cases from the U.S., Canada, the U.K. and Australia, involving 24 females and 17 males aged 13 to 18.
They found 24 percent of the teens were victims of homophobic bullying, including 12 percent who identified as homosexual and another 12 percent who identified as heterosexual.
Suicides most frequently occurred in September and January, but the authors warned these higher rates may have occurred by chance.
The incidence of suicide cases increased over time, with 56 percent occurring from 2003 to 2010, compared to 44 percent from January 2011 through April 2012.
They found that 78 percent of adolescents who committed suicide were bullied at school and online, and only 17 percent were targeted online only.
A mood disorder was reported in 32 percent of the teens, and depression symptoms in an additional 15 percent.
“Cyberbullying is a factor in some suicides, but almost always there are other factors such as mental illness or face-to-face bullying,” study author John C. LeBlanc said in a press release. “Cyberbullying usually occurs in the context of regular bullying.”
Cyberbullying occurred through social media sites like Facebook and Formspring, both specifically mentioned in 21 cases. Text or video messaging was noted in 14 cases.
“Certain social media, by virtue of allowing anonymity, may encourage cyberbullying,” Dr. LeBlanc said in the release. “It is difficult to prove a cause and effect relationship, but I believe there is little justification for anonymity.”
Researchers Show How Spatial Perception Is Influenced By Fear
Jedidiah Becker for redOrbit.com — Your Universe Online
One of the many mind-bending lessons that neuroscience has taught us in recent years is that our brain often 'fudges' the picture of reality that it gives us. For a very simple at-home demonstration of this cognitive trickery, stand in front of a mirror and alternate back and forth between looking at your left eye, then your right eye. No matter how hard you try, you won't be able to see your eyes actually moving.
And it gets even weirder: Not only can you not see your eyes move, but there doesn't appear to be any gap in your perception during the time it takes for your eyes to change their focus from point A (your left eye) to point B (your right eye). One moment you're staring at your left eye and the next at your right, and the split second in between seems to simply vanish.
What has essentially happened is that your brain has taken a rather complex little scene and 'edited' it in order to present you with a more simplified picture of reality. In essence, your brain is not presenting you with the 'whole truth' but rather with a version of it that is somehow more useful or easier to manage. And modern neuroscience and psychology continue to expose more and more of the human brain's little white lies.
In the results of a recent study published in the journal Current Biology, two psychologists have shown that a sense of fear and impending danger can actually change our spatial perception of an approaching object.
In general, the human brain is amazingly accurate in its ability to estimate the distance of approaching objects and predict the moment of impact with them. That is, amazingly accurate so long as that approaching object does not appear to pose a threat.
Participants in the study were placed in front of a computer screen and shown images of different objects that expanded in size over a period of one second before they disappeared. The images increased in size in order to simulate a phenomenon known as “looming,” an optical pattern that the brain uses to predict the amount of time until a collision with an approaching object. Study participants were asked to press a button to predict the moment of impact with each of the objects on the screen.
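The "looming" cue described above can be written down compactly: for a small object approaching at constant speed, the ratio of its angular size to its rate of expansion equals the time remaining until contact (the optical variable often called tau). A minimal sketch, with invented numbers, of why the retinal image alone is enough to predict impact:

```python
def time_to_collision(distance, speed):
    """Actual time to contact for an object approaching at constant speed."""
    return distance / speed

def tau_from_looming(angular_size, expansion_rate):
    """Optical tau: time-to-contact estimated purely from the image,
    tau = theta / (d theta / dt). Exact under the small-angle approximation."""
    return angular_size / expansion_rate

# An object of half-width r at distance d subtends roughly theta = 2*r/d.
r, d, v = 0.05, 2.0, 1.0            # metres, metres, metres/second
theta = 2 * r / d                   # current angular size (radians)
theta_dot = 2 * r * v / d ** 2      # its growth rate as the object nears
print(tau_from_looming(theta, theta_dot))  # ~2.0 s, matching d / v
```

The study's finding is that fear biases this otherwise accurate estimate downward: threatening images feel like they will arrive sooner than tau says they will.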
And the experiment had one more twist: Some of the images displayed on the screen were harmless objects such as butterflies or rabbits, while other images were of 'threatening' creatures like snakes and spiders.
The results showed that the individuals consistently underestimated the collision time for threatening images — that is, people who were afraid of snakes or spiders repeatedly believed that they would come into contact with those objects sooner than with non-threatening objects, even when both types of images were “looming” at the same speed.
The team´s results call into question the traditional understanding of “looming” as being a mere optical phenomenon that is more or less detached from other brain processes.
“We're showing that what the object is affects how we perceive looming. If we're afraid of something, we perceive it as making contact sooner,” explained Longo.
At the crux of the study, Lourenco highlighted, is the idea that perception and emotion may be more entangled in the human mind than we currently understand.
“Fear can alter even basic aspects of how we perceive the world around us,” said Lourenco.
What's more, says Lourenco, they were able to gauge how much a participant would underestimate the collision time with an object based on how afraid they were of it: “The more fearful someone reported feeling of spiders, for example, the more they underestimated time-to-collision for a looming spider.”
Viewed from an evolutionary perspective, it's easy to see how this kind of automatic brain response to dangerous objects could have offered a survival advantage to our early ancestors living in the wild. As Lourenco noted: “If an [approaching] object is dangerous, it's better to swerve a half-second too soon than a half-second too late.” When a primitive man on the savanna plains looked up in time to see a lion barreling towards him, it wasn't necessarily a bad thing for his brain to 'trick' him into believing that the approaching danger was a bit more imminent than it really was.
What the researchers do not yet understand is the exact mechanism behind this underestimation of collision time with dangerous objects. For instance, it could be that the fear of a dangerous object causes the brain to perceive the object as moving faster than it actually is. On the other hand, it could also be that a sense of fear causes the threatened individual to experience an enlargement of their personal space, or the sphere in which they feel safe.
“We'd like to distinguish between these two possibilities in future research,” says Lourenco. “Doing so will allow us to shed insight on the mechanics of basic aspects of spatial perception and the mechanisms underlying particular phobias.”
The team believes that the results of their study will have a number of implications for the research and understanding of clinical phobias.
Bull Sharks, Not Great Whites, Have The Most Powerful Bite
Written By: editor
redOrbit Staff & Wire Reports — Your Universe Online
The shark with the most powerful bite isn’t the Great White or the Hammerhead, scientists have discovered — rather, it is the Bull shark that bites with the most force relative to its size, according to a new study.
Marine biologists, including Philip Motta and Maria Habegger of the University of South Florida (USF), measured the bite strength from 13 different shark species, according to Dan Vergano of USA Today.
They discovered that a nine-foot-long Bull shark (Carcharhinus leucas) had a bite force of 478 pounds, while a Great White of similar size had a bite force of only 360 pounds.
“An 18-foot-long great white will still have a more powerful bite than an 11-foot bull shark, just by virtue of its size. But pound-for-pound, a bull shark of the same size would have a stronger bite,” Motta told Vergano on Friday. “It’s all about the width of the jaws, and bull sharks have very wide heads. We’ve seen sea turtles bitten in half, and that takes quite a lot of force.”
“We expect strong bite force values in the larger sharks that occupy top positions in the food chain, for example, the great hammerhead, great white shark, tigers and bull sharks,” Habegger added in an interview with BBC Nature Editor Matt Walker. “[But] sometimes size is misleading. Although larger size sharks will exert higher values of bite force, the relative value of bite force is what matters, pound per pound, how strong is the bite?”
Habegger, Motta, and colleagues from the U.S. and Germany tested a variety of sharks and shark-like creatures, including the one-meter-long ratfish and the Great White, Walker explained.
They dissected specimens of the 13 creatures studied to analyze their jaws and jaw muscles, then calculated the amount of force that those muscles put forth when the jaws of each creature closed. The effect that the size of each shark’s body had on bite strength was then mathematically removed from the equation, in order to ensure a level playing field for the smaller specimens, he added.
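One standard way to "mathematically remove" body size, which is assumed here purely for illustration (the paper's own statistics may differ), is to fit a log-log regression of bite force against body mass and compare the residuals: a species sitting above the fitted line bites harder than expected for its size.

```python
import math

# Invented (mass_kg, bite_force_N) pairs standing in for the 13 species;
# the real study's data and model are different.
data = [(2, 25), (10, 110), (50, 420), (130, 900), (240, 1300), (400, 2100)]

# Ordinary least squares on the log-log scale (allometric fit).
logs = [(math.log(m), math.log(f)) for m, f in data]
n = len(logs)
mx = sum(x for x, _ in logs) / n
my = sum(y for _, y in logs) / n
slope = sum((x - mx) * (y - my) for x, y in logs) / sum((x - mx) ** 2 for x, _ in logs)
intercept = my - slope * mx

# Residuals are the size-corrected bite performance: positive means
# "bites harder than expected for its body mass" (pound for pound).
residuals = [y - (intercept + slope * x) for x, y in logs]
strongest = max(range(n), key=lambda i: residuals[i])
print(data[strongest], round(residuals[strongest], 3))
```

On this view, a bull shark can out-bite a great white "pound for pound" even though the larger animal's absolute bite force is higher.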
So why do Bull sharks possess such ferocious jaw strength? The researchers admit they aren’t sure.
“From our knowledge there is no need of such massive values to break fish skin or even to puncture bone,” Habegger, a doctoral student at USF, said.
“One idea is that this ability gives young bull sharks an advantage over other competing species; allowing them to eat more diverse prey earlier in their lives,” Walker added. “But overall, bull sharks, which the research shows can bite with a force of almost 6,000N at the back of the jaw and more than 2,000N at the front, seem to have bites that are too powerful.”
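The gap between the roughly 6,000 N rear and 2,000 N front figures follows from simple lever mechanics: the jaw muscles produce a fixed torque about the jaw joint, so bite force falls off with distance from that joint. A toy illustration (the torque and distances below are invented, chosen only to reproduce the reported ratio):

```python
def bite_force_at(distance_from_joint_m, muscle_torque_nm):
    """Jaw-as-lever sketch: with constant muscle torque about the joint,
    bite force is inversely proportional to distance from the joint."""
    return muscle_torque_nm / distance_from_joint_m

torque = 600.0  # N*m, hypothetical
print(bite_force_at(0.10, torque))  # near the back of the jaw: 6000.0 N
print(bite_force_at(0.30, torque))  # at the front of the jaw: ~2000 N
```

This is why measurements of a "back of the jaw" bite are always higher than at the teeth near the tip, for any animal with this jaw geometry.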
Their findings have been published in the journal Zoology.
Genetic Tumor Testing Could Lead To Better Breast Cancer Treatment
Written By: editor
redOrbit staff & Wire Reports — Your Universe Online
Experts at the Baylor College of Medicine (BCM) have started using a new DNA sequencing technique they say will help devise individualized, more effective methods of treating breast cancer.
According to oncologists at the BCM Lester and Sue Smith Breast Center, the methods used to treat a patient’s cancer are dependent upon several factors, including the size of the tumor, its biological state, the stage the illness is currently in, and whether or not it has spread to other parts of the body.
Now, they have developed a way to take what they call a “detailed genetic snapshot” of the patient’s tumor, which will help them better understand the specific characteristics of that tumor and help create a unique course of treatment for that patient, the school announced in an October 17 statement.
“It is important to distinguish between tumor-specific changes that we look for in tumor tissue and heritable mutations, such as the BRCA1 and BRCA2 mutations that are associated with breast and ovarian cancers,” Dr. Mothaffar Rimawi, medical director of the Smith Breast Center, said.
“Testing is available to look specifically for those mutations, which greatly increase a woman’s chance of developing these cancers and may require a more aggressive approach to prevention,” he added. “We are able to take tissue from a patient’s tumor and send it to the Cancer Genetics Lab and within two weeks, we receive a comprehensive genetic analysis of that tumor.”
Doctors have just recently started using the tumor genomic sequencing method in clinical environments, Rimawi said, but it has already helped them consider using alternative, and possibly more effective, methods for treating breast cancer.
“There is a good chance that the testing will identify what we call an 'actionable mutation,' meaning something that we think is driving the tumor and that we currently have an approved (by the U.S. Food and Drug Administration) drug that we know works against it,” he said. “In situations where the cancer has spread or is resistant to treatment, this resource provides us with another opportunity to attack the cancer that we would not have had.”
He said the testing has already had an impact on the treatment methods of a “considerable” number of patients, and in most cases, the genetic sequencing techniques are covered by health insurance.
According to Dayton Daily News staff writer Peggy O’Farrell, the American Cancer Society said an estimated 226,000-plus women in the U.S. would be diagnosed with some form of breast cancer this year. From 1977 to 2007, advances in the treatment of the disease helped five-year survival rates increase from 75% to 90%, she added.
Banana Boat Sunscreen Could Burst Into Flames, Burn Skin
Written By: editor
April Flowers for redOrbit.com — Your Universe Online
Remember those adorable, and slightly creepy, Coppertone ads that started in the 1950s? They starred a little girl with a dog pulling her bathing suit to reveal a starkly white bottom. Would they have been as cute, or effective, if the little girl was on fire?
Consumers are asking that question about the Banana Boat product recall for suntan lotions. Energizer Holdings (ENR), the parent company of Banana Boat, announced Friday that certain sunscreen sprays may potentially burst into flames on the user’s skin if they come into contact with a flame before the spray is completely dry.
The recall is voluntary and Energizer Holdings has ordered retailers not to sell the sprays. They have also notified the Food and Drug Administration of the problem.
Energizer reports it has received five complaints of “adverse effects” caused by the sprays. The products have caused burns in five cases: four in the U.S. and one in Canada. The problem stems from the fact that the spray nozzles deliver more product than is typical in the industry, which means it takes longer for the spray to dry.
“If a consumer comes into contact with a flame or spark prior to complete drying of the product on the skin, there is a potential for the product to ignite,” Energizer said in a statement.
Consumers who have recently purchased Banana Boat products are being advised not to use them. The recall affects 23 varieties of UltraMist sunscreen, including UltraMist Sport, UltraMist Ultra Defense and UltraMist Kids. Some 20 million units of UltraMist have been sold since the line launched in 2010.
The product label already contains a warning about proximity to open flames, but Dr. Michele Green, a Lenox Hill dermatologist, says most people don’t read the labels.
“So many people put this on outside, while they’re on their way to activities, so I just don’t think people are aware of that,” said Green.
The problem seems to be extremely rare, according to burn experts.
“We’ve found no evidence of this happening before the incidents that came to our attention,” said Dan Dillard, executive director of the Burn Prevention Network. Two incidents were reported to Burn Prevention; one a man who was standing near a BBQ grill, and the other a woman working with welding equipment. Both resulted in second and third degree burns.
“The alcohol and petroleum products listed on the containers are flammable, so the only thing you’re missing in the heat triangle is an ignition source,” Dillard said.
Listing Polar Bears As Threatened Species Challenged In Court
April Flowers for redOrbit.com — Your Universe Online
The State of Alaska and a group of plaintiffs, including hunters and the California Cattlemen's Association, have appealed a federal court ruling from last year that upheld the Interior Department's 2008 designation of polar bears as a threatened species. The bears were listed as threatened because their icy habitat is melting away.
Maury Feldman, an attorney representing the plaintiffs, told the U.S. Court of Appeals for the District of Columbia the Interior Department had failed to show how the polar bears would likely be nearing extinction by the middle of this century, calling the Department’s decision “arbitrary and capricious.” He claimed the decision was based on flawed models without any real connection between population projection and habitat loss.
Katherine Hazard, a lawyer for the department's Fish and Wildlife Service, asserted that the designation relied on decades of research and the long-term trends underpinning it.
“The agency needs to make a determination based on the best available science, which the agency did here,” she said.
A threatened designation means the species is likely to become endangered within the foreseeable future throughout all or a significant portion of its range.
Feldman asserts the government “applied a standard so imprecise that the Service could conceivably use it to list any healthy species whose habitat is projected to be affected by climate change, without making a future 'on-the-brink' determination.”
The listing drew more attention to the bears’ situation and triggered funding for programs to increase patrols that will limit contact with humans and a recovery plan for the bears, according to Bruce Woods, a spokesman for the Fish and Wildlife Service.
The National Oceanic and Atmospheric Administration (NOAA) reported this month that Arctic sea ice shrank to a record low of 1.32 million square miles by mid-September.
“Declines in sea ice extent have major negative impacts on polar bears,” Hazard said in court papers. “Sea ice declines, which lengthen the period in which bears are unable to productively hunt seals, cause nutritional stress and weight loss and, ultimately, affect mortality and reproduction.”
The State of Alaska and large oil companies have argued the Endangered Species Act protection for polar bears diminishes opportunities for further Alaskan energy development. In its appeals court filing, the State argues bears have survived warming trends before and most populations have grown or maintained stability despite the ice shrinkage.
The appeals court is not expected to make a decision for months.
Metamaterial May Redefine Printed Circuit Board Manufacturing, Recycling
Written By: editor
April Flowers for redOrbit.com — Your Universe Online
It’s probably a safe bet that you have never been in a big electronics retailer like Best Buy or CompUSA and seen a sign boasting the percentage of recycled electronic components that a laptop or smart TV uses. This suits the electronics industry, as many of the leading companies make their money selling more and more new components.
A new technology from Oxford University, however, may just change this paradigm forever.
“It is a technology that is going to fundamentally change the way we build computers,” says Dr. Mark Gostock, a technology transfer manager at the University of Oxford’s ISIS Innovation. “And there are going to be a lot of people who make a whole lot of money from the way we do it now who aren’t going to be happy when they hear what we have got.”
“The PCB [printed circuit board] industry in particular has already made a big investment in manufacturing infrastructure and they are not going to want to change,” said Chris Stevens, engineering lecturer and successful academic-entrepreneur.
The team started with the technology behind the Pentagon’s cloaking device and came up with a new technology to replace the solder, pins and wiring from conventional computers with LEGO-like blocks of silicon. The blocks are stuck to a Velcro-like metamaterial board capable of wirelessly transmitting or conducting both data and power. This is science fiction transformed into reality, with wallpaper that can connect the components of your entertainment system and computers designed as wristbands.
“We saw the potential first of this technology because most people have been looking at metamaterials from a physics perspective, in terms of cloaking devices or optics, and other potential applications like this use of radio frequencies were seen to be niche, with little research excitement,” says Stevens in a statement.
Stevens tried to convince Microsoft to use the metamaterial for the new Surface tablet, saying that “you could put your mobile on the screen of the tablet and all the apps on the phone would seamlessly appear on the larger screen.” Microsoft took a pass because the technology is too unproven.
The copper-wire and balsa-wood test beds look more like something created during WWI by British scientists, making it easy to miss the potential of the technology if you are watching the demonstration videos. Watching, it is difficult to imagine this is the future of computing.
However, as you watch the LEDs light up as they are waved over the wire and data from a USB stick is flashed up on a screen with only a simple tap of the stick on the metaboard, the enormous potential becomes apparent.
“Right now we can achieve 3.5 gigabit-per-second data transfer rate and hundreds of watts of power — enough to recharge any number of mobile devices without loss of efficiency — but the circuits have the capacity for increased performance and the limits aren’t really known,” says Stevens.
The team embedded copper coils in a conductive layer of material to form a sealed circuit board.
Stevens says, “You can then produce an individual chip that has no legs, no pins and can in no way be damaged and which is simply stuck — even glued — on to the metaboard.”
The result is that instead of “throwing on the tip PCBs which could last for 25 years if it weren’t for the six-months-to-a-year built-in obsolescence embedded in the product life cycle,” these metamaterial chips can be peeled off and reused several times. For example, it could be moved into a lower-end computer, then again into a smart TV, and perhaps a washing machine at the end of its lifetime.
Stevens admits, however, that although he’s done the theoretical work and is satisfied with his progress, it is going to take some hard work to convince people of the potential of his product until he finishes building a carbon demo model, which depends on finding funding.
“If Samsung funded it, they could do all the hard work of silicon integration (which is what they know about) within a year. If I have to fund it out of academic research grants it could take three to four years.”
Darren Cadman, research coordinator at the Innovative Electronics Manufacturing Centre at Loughborough University, says that Stevens’ work “displays significant potential to alter the current design, manufacture and use of electronic circuits in a wide number of applications.”
“By removing solder it offers a novel solution to the problems of reliability. The removal of the need for cables and wires is obviously a huge benefit with the increasing costs of copper and the multitude of electronic devices found in every home. Additionally the simple and cost-effective manufacture of the circuits means they have an excellent chance of finding widespread adoption and use,” Cadman says.
Cadman agrees with Stevens that further investment, probably corporate in nature, will be necessary to ensure the product is robust enough — in terms of data rates, accuracy of data, range and proximity of devices — for the intended applications.
Warren East, chief executive of Cambridge-based ARM, which designs the architecture used in the chips powering almost every mobile phone in the world, agrees that this technology has enormous potential. East warns, however, “sometimes being truly groundbreaking is just not enough”.
“We have had a number of on-going discussions with Chris about a range of different technologies he has been working on to improve the reliability of packaging materials,” East says, and in particular “the use of such conductive materials”.
“After all, while we can do amazing things with chips now, it doesn’t make much sense making chips smaller and smaller if the connections using wires and pins are actually larger than the chips themselves and also unreliable.”
A lot of ideas in research laboratories appear to be groundbreaking, but the real challenge is to get them from the laboratory to economic production, cautions Cadman. Hundreds of good ideas die at the proof-of-concept stage for every one that makes it to commercial reality. This is because the need to produce chips in quantities of billions reliably and economically is a very high hurdle to jump.
Sometimes, Cadman adds, it is simply inertia that holds an idea back for a time. An example of this is 3D transistors, which have recently launched with a great deal of fanfare although the technology has been around for at least 10 years.
“Similarly, people in the industry should be interested in recycling”, East says, but at the moment “there are the commercial disincentives not to do so”. Silicon companies make their money by supplying chips and “they want to supply more of them. If a quarter were recycled then it would mean less profit.”
The goal of truly flexible electronics, for example where the whole computer is flexible and can be worn like a wristwatch, is only possible if all physical connections can be done away with and everything can be made wireless. East believes that the people who are currently making money will not want to be displaced, which will make it harder to bring this technology to the public.
For Cadman, this technology “is the sort of clever creative technology that Britain is so good at, and serves as an example of the strength of work in electronics design and electronics manufacturing currently going on in the UK.”
Stevens is aware, however, that the success of this technology depends on manufacturers' desire to improve recycling, as well as on the cost of the initial manufacturing.
“The problem is that nobody is making anything in the UK anymore. If we had our own research institute just down the road where I could pop in for tea then I believe the road ahead would be different.”
Gostock sees this as a case of Oxford University versus the rest of the world once again. Some of the biggest names in electronics and chemicals from the USA, Korea, and India have been showing interest in the metamaterial technology, so Oxford just might win again.
Drinking At A Younger Age Leads To Heavier Drinking Later On
Written By: editor
Lee Rannals for redOrbit.com — Your Universe Online
Drinking alcohol at a younger age can lead to drinking more and stronger alcohol later on, as well as a greater risk of developing an addiction, a new Spanish study suggests.
The scientists gathered data from 6,009 young people between the ages of 14 and 25, surveyed from 2007 to 2009 in three Spanish cities.
“The general tendency is to think that university students drink more alcohol than teenagers as they are older and can access it more easily. But this is not true. Males in secondary school and university drink the same amount of alcohol while practicing botellón. The same is the case for females,” Begoña Espejo Tort, lead researcher of the study at the University of Valencia, said in a statement.
They found that males drink more and aim to get drunk, yet they are less likely than females to associate their alcohol intake with the possibility of developing an addiction.
“We have observed that university students progressed to drink more alcohol. When they were adolescents they drank less alcohol and then more when reaching university. Nonetheless, today’s adolescents drink the same amount as university students,” Espejo said in the statement.
If intake levels for high school and university students of the same sex are already similar, then by the time today's secondary school students reach the age of 20, the consequences could be greater than those seen in current university students.
“Nearly all adolescents who consumed alcohol started at around 13 or 14 years of age by drinking distilled alcohol (drinks with high alcohol content) in large quantities. On the other hand, university students started between 14 and 15 with fermented drinks like beer in relatively low quantities,” Espejo said in the statement.
They found the students only take into account short-term consequences such as drunk driving, vomiting, dizziness, and hangovers.
The researchers said youngsters feel drinking alcohol will have no negative consequences unless they increase their consumption.
The study authors warn there is a need to take action amongst these groups to reduce and change alcohol consumption. They said campaigns on self-esteem and interpersonal relationship management should be reinforced.
Radiocarbon Dating Improved With Sediment Measurements
Written By: editor
April Flowers for redOrbit.com – Your Universe Online
A research team from Oxford University's Radiocarbon Accelerator Unit has found a more accurate benchmark for dating materials, especially older objects, in a series of radiocarbon measurements from Japan's Lake Suigetsu.
As far back as 1993, researchers realized sediment cores from Lake Suigetsu would be useful for radiocarbon dating. However, the initial efforts encountered technical problems.
The current team extracted cores of preserved layers of sediment from where they had lain on the bottom of Lake Suigetsu for tens of thousands of years. The cores contained organic material such as tree leaf and twig fossils.
The findings of this study, published in Science, are significant because they provide a more precise way to determine the radiocarbon ages of organic material for the entire 11,000- to 53,000-year time range. Using this process, for example, archaeologists should now be able to pinpoint the timing of the extinction of Neanderthals or the spread of modern humans into Europe with much more accuracy.
“The new results offer an important refinement of the atmospheric radiocarbon record and place the radiocarbon timescale on a firmer foundation,” said Jesse Smith, Senior Editor at Science.
Professor Christopher Ramsey of the Radiocarbon Accelerator Unit, along with his colleagues, worked with scientists from two other radiocarbon laboratories — NERC in Scotland and in Groningen, the Netherlands — on the radiocarbon record from Lake Suigetsu as part of a large, international team studying the cores for clues about past climate and environmental change.
Radiocarbon, or C-14, is produced continuously in the upper atmosphere and is incorporated into all living organisms. When an organism dies, it stops taking in carbon, and the radioactive isotope it contains decays at a known rate. By measuring the radiocarbon levels remaining in samples of ancient organic materials, scientists can work out how old things are. One complication is that the amount of environmental radiocarbon varies from year to year and from location to location.
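The decay calculation itself is straightforward; it is the year-to-year variability in atmospheric radiocarbon that records like Lake Suigetsu's exist to correct. A sketch using the modern (Cambridge) half-life of 5,730 years; note that, by convention, laboratories actually report ages using the older Libby half-life and then calibrate:

```python
import math

HALF_LIFE_C14 = 5730.0  # Cambridge half-life, in years

def radiocarbon_age(fraction_remaining):
    """Age implied by the measured C-14 fraction, assuming a constant
    atmospheric radiocarbon level (the simplification that calibration
    records are needed to correct)."""
    return HALF_LIFE_C14 * math.log(1.0 / fraction_remaining) / math.log(2.0)

print(round(radiocarbon_age(0.5)))    # one half-life:  5730 years
print(round(radiocarbon_age(0.25)))   # two half-lives: 11460 years
```

Because atmospheric C-14 was not actually constant, a raw age like this must be matched against a calibration curve, which is exactly what the Lake Suigetsu layers provide for the period beyond the tree-ring record.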
The radiocarbon in leaf fossils, such as those found in Lake Suigetsu, comes directly from the atmosphere, so it is unaffected by the processes that can slightly alter radiocarbon levels in marine sediments or cave formations. Prior to this study, the most important radiocarbon dating records came from such marine sediments and cave formations, which required corrections. The samples from Lake Suigetsu provide a more complete, direct record of the atmosphere without the need for correction.
The cores display alternating layers of light diatoms and darker sediments, one pair for each year, making them unique and giving scientists a means of counting back the years. These counts are then compared to more than 800 radiocarbon dates from the preserved fossil leaves. Tree rings provide the only other direct record of atmospheric carbon, but they go back only 12,593 years. The new record from Lake Suigetsu extends back 52,800 years, lengthening the direct radiocarbon record by more than 40,000 years.
“In most cases the radiocarbon levels deduced from marine and other records have not been too far wrong. However, having a truly terrestrial record gives us better resolution and confidence in radiocarbon dating,” said Professor Ramsey in a press release. “It also allows us to look at the differences between the atmosphere and oceans, and study the implications for our understanding of the marine environment as part of the global carbon cycle.”
The team measured radiocarbon from terrestrial plant fragments spaced throughout the core to construct a radiocarbon record. To place the radiocarbon measurements in time, they also counted the dark and light layers throughout the glacial period. They used microscopes and a method called X-ray fluorescence that identifies chemical changes along the core because many of the layers were too fine to be distinguished by the naked eye.
Some part of the record must be “anchored” in time by assigning it an absolute age. The team managed this by matching the first 12,200 years of their record against the tree ring data, a well-established record that begins in the present. They also checked other records from the same period and found that they generally aligned.
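The anchoring step can be illustrated with a toy least-squares alignment: slide a floating chronology along an absolutely dated reference until the overlapping values agree best. This is a hypothetical sketch, far simpler than the wiggle-matching actually used; the function name and data are invented for illustration.

```python
def best_anchor_offset(floating, reference, max_offset):
    """Find the offset that best aligns a floating chronology (e.g., a
    varve-counted radiocarbon series) with an absolutely dated reference
    (e.g., tree rings), by least squares over the overlapping span."""
    best, best_err = 0, float("inf")
    for off in range(max_offset + 1):
        # Pair each floating value with the reference value it would
        # overlap at this trial offset.
        pairs = [(floating[i], reference[i + off])
                 for i in range(len(floating)) if i + off < len(reference)]
        if len(pairs) < 2:
            continue
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = off, err
    return best

# Toy example: the floating record matches the reference shifted by 3 steps.
ref = [10, 12, 15, 20, 26, 33, 41, 50]
flo = [20, 26, 33, 41]
print(best_anchor_offset(flo, ref, 4))  # -> 3
```

Once the offset is fixed against the absolutely dated overlap, every layer count beyond the overlap inherits a calendar age, which is how the Suigetsu record extends the timescale past the reach of tree rings.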
“Because of the unique combination of a complete radiocarbon record and terrestrial paleo-climate data, Suigetsu can be a benchmark against which other records can be compared,” said Professor Takeshi Nakagawa of Newcastle University.
“From a palaeoclimate perspective, this radiocarbon dataset will also allow very high precision direct correlation between Suigetsu and other terrestrial climate records,” said Nakagawa in a press release. “This allows us to see how changes in climate in different parts of the world relate to one another — and particularly where there are leads and lags. Information like this is very useful for studying climate mechanisms.”
“This record will not result in major revisions of dates. But, for example in prehistoric archaeology, there will be small shifts in chronology in the order of hundreds of years,” said Professor Ramsey. “Such changes can be very significant when you are trying to examine human responses to climate that are often dated by other methods, such as through layer counting from the Greenland ice cores. For the first time we have a more accurate calibrated time-scale, which will allow us to answer questions in archaeology that we have not had the resolution to address before.”
To determine the age of objects based on their radiocarbon measurements, scientists generally use a composite record called IntCal. IntCal uses marine records, stalagmites and stalactites, tree rings and multiple other records. The team expects the Suigetsu data will be incorporated into the latest version of IntCal, due to be released in the next few months.
Scientists Show Earth’s Surface Acts As A Giant Loudspeaker
Written By: editor
Brett Smith for redOrbit.com — Your Universe Online
In the midst of an earthquake, buildings sway back and forth, terrain pops and locks, and the ground seems to shuffle underfoot.
Acoustic scientists have shown in a new study that the Earth's surface and atmosphere act as a giant loudspeaker. This “loudspeaker” soundtracks these geologic raves both in the audible range of hearing and in infrasound, frequencies below the range human hearing can detect.
According to the computer modeling, sound recordings and seismic data used in the study, an earthquake “pumps” the surface and the atmosphere above it, sending sound waves radiating from the epicenter.
“It’s basically like a loudspeaker,” said Stephen Arrowsmith, a researcher at Los Alamos National Laboratory in Los Alamos, N.M., who will present his team’s findings at the 164th meeting of the Acoustical Society of America (ASA) next week. “In much the same way that a subwoofer vibrates air to create deep and thunderous bass notes, earthquakes pump and vibrate the atmosphere, producing sounds below the threshold of human hearing.”
The infrasound made by an earthquake can provide detailed information about the event, according to the researchers. In particular, it can reveal the amount of shaking that is occurring directly above the source of the quake. Accurate analysis of these sound waves could provide information that is typically gathered using an array of seismometers. This could make infrasound detection a key tool for assessing the damage and studying the mechanism behind a seismic event.
In creating their computer models, the acoustics team assumed the surface and atmosphere would pump like a piston and, during a seismic event, act in the same way as a loudspeaker or subwoofer. Subwoofer technology was first introduced in the 1960s as a way to convey audio signals in the 20 to 200 Hz range, and it uses a small-volume box to generate the desired sound and pressure levels.
To test this model, the research team collected acoustic and seismic data during a 4.6-magnitude earthquake that occurred on January 3, 2011 near Circleville, Utah. The data were recorded by the University of Utah, which maintains seismograph stations equipped with infrasound recording devices.
After analyzing the data, they found it closely matched the results produced by their loudspeaker-based computer models.
“This was very exciting because it is the first such clear agreement in infrasound predictions from an earthquake,” said Arrowsmith in a statement. “Predicting infrasound is complex because winds can distort the signal and our results also suggest we are getting better at correcting for wind effects.”
The researchers were able to demonstrate the significance of “pumping” the ground and how it generates infrasound during a seismic event.
Infrasound is typically defined as sound with a frequency below 20 Hz, the lower limit of human hearing. Other natural events can produce infrasound, including avalanches, lightning, and waterfalls. According to the National Oceanic and Atmospheric Administration (NOAA), infrasonic arrays can be used to detect deadly avalanches in the Rocky Mountains as they happen, and tornadoes as they form, before they touch down.
Your Personality Depends On How You Sleep At Night
Written By: editor
Lawrence LeBlond for redOrbit.com – Your Universe Online
How do you sleep at night? Do you sleep on your tummy with limbs out-stretched? Do you lay curled up in a fetal position? Or do you sleep like a log? It is probably safe to say that most of you do not think much about the way you get your Zzzs at night. But would you if you knew that the way you sleep could tell a lot about the type of personality you have?
According to body language expert Robert Phipps, the way people sleep at night actually determines a lot about the type of personality they have. In a new study on the topic, Phipps has identified four sleeping positions that affect personality.
“Our sleeping position can determine how we feel when we wake,” said Phipps.
The study was carried out as part of a survey for Premier Inn, one of the largest hotel chains in the UK. Hotel spokeswoman Claire Haigh said: “We were shocked the research revealed just how stressed we are. It is important we try to wind down after a long day and get a good night’s rest so we wake up refreshed.”
Phipps found that worriers, those who stress the most, tend to sleep in the fetal position. He found that this is by far the most common bedtime position, with nearly 58 percent of snoozers sleeping on their side with knees up and head down. The more we curl up, the more comfort we are seeking, according to Phipps.
The second most common position is the log. Sleeping with a straight body and arms at the side, as if standing guard at Buckingham Palace, indicates stubbornness, and these sleepers (the 28 percent who favor this position) often wake up stiffer than when they went to sleep.
“The longer you sleep like this, the more rigid your thinking and you can become inflexible, which means you make things harder for yourself,” according to Phipps.
Yearner sleepers are next on the list. About 25 percent of people sleep in this style–on their backs with arms stretched out in front, looking as if they are either chasing a dream or perhaps being chased themselves. Yearners are typically their own worst critics, always expecting great results, explained Phipps. These people often wake up refreshed and eager to face the challenges of the day ahead.
However, he warned to “make sure what you yearn for is what you really want or you’ll spend a lot of wasted time and energy.”
Perhaps the most peculiar of sleep styles is the freefaller position. These sleepers, who make up 17 percent of the population, lie face down with arms stretched out. Such people, according to Phipps, feel like they have little control over their life. Not only is this the oddest of sleep styles, it is also the least comfortable, and freefallers may wake up feeling tired and drained of energy.
In conclusion, Phipps has only one more thing to add: “A good night’s sleep sets you up for the following day and our sleeping positions can determine how we feel when we wake.”
2002 Meningitis Outbreak Offers Lessons In Treating New Cases
As the death toll from fungal meningitis continues to rise, a physician who spearheaded the response to the 2002 meningitis outbreak urges health experts to look back at the lessons learned before venturing into unknown waters.
A new perspective paper published online in the Annals of Internal Medicine details the lessons learned from treating patients affected by the 2002 outbreak of Exophiala dermatitidis meningitis, caused by contaminated injectable corticosteroids prepared by a compounding pharmacy.
Dr. John R. Perfect, one of the health experts at the forefront of the 2002 outbreak, said in the article that the lessons he learned during the 2002 outbreak are applicable to the current outbreak, even though the 2012 infections are mainly from Exserohilum rostratum.
In 2002, the US Centers for Disease Control and Prevention (CDC) detailed 5 cases of E. (Wangiella) dermatitidis meningitis. Perfect said he was involved in the recognition and management of some of these patients. He said he and other experts learned, or so they thought, several important lessons from the outbreak.
Perhaps the most important lesson was that the compounding of preservative-free corticosteroids requires meticulous sterility to prevent contamination. Without that level of sterility, fungi can grow aggressively and very rapidly. Once injected, the fungus can spread through human tissue fairly quickly, leading to invasive mycosis. He noted, however, that the incubation period between exposure and the appearance of disease can be up to 6 months. In the 2002 outbreak, many of those exposed were successfully treated, and because of the low attack rate there was only one fatality from fungal meningitis.
The 2002 outbreak proved fearsome predominantly because of patient worry and suffering. Combined with increased medical expenses, detailed public health surveillance, and eroded trust in medications for fear of microbial contamination, it was a medical health crisis in the making.
While the 2012 cases are mainly tied to E. rostratum, the infections are occurring through the same process: injectable steroids produced in a compounding pharmacy. These injections were primarily given to older adults with low back pain and were probably administered as intra-articular injections.
As of October 19, there have been 20 reported deaths in the 2012 meningitis outbreak. What makes this year's outbreak exponentially more worrisome is the fact that some 14,000 people have been exposed to the contamination. The New England Compounding Center (NECC), responsible for the contaminated injectable, had shipped some 17,000 of the contaminated steroid injections to 76 medical facilities in 23 states as early as last May.
NECC, based in Framingham, Massachusetts, has now been shut down, and all products from the pharmacy have been secured and retained by the CDC and the FDA, which are currently in the midst of an aggressive investigation into the crisis.
The CDC has confirmed 4 new cases since Wednesday, with Virginia and Florida both reporting new deaths; overall cases in those states now stand at 37 and 13, respectively. Michigan has so far seen 49 cases of fungal meningitis this year, the most of any state. The CDC said the number of cases has jumped from 214 to 245 in just a few days.
While the FDA doesn't regulate compounding facilities as it does regular pharmacies, it had flagged violations at NECC as recently as 2006. The US House of Representatives' investigative panel has given the FDA until October 31 to turn over all documents related to NECC, including communications with state regulators and the agency's commissioner, dating back to 2004.
The FDA reportedly told the House last week that it had been assured by NECC of the pharmacy's compliance in 2007. However, FDA investigators could not confirm whether the agency then took steps to ensure corrective measures had been taken.
An FDA spokesman said the agency had received the letter and would respond directly to the House panel.
“If the investigation finds any criminal misdoings, the Department of Justice must act decisively, file charges and prosecute the company or individuals responsible,” said US Representative Rosa DeLauro, a Connecticut Democrat who has proposed legislation to give FDA more authority to regulate compounding pharmacies.
This year's meningitis outbreak will likely get worse before it gets better. Because the incubation period can be up to 6 months, most of the roughly 14,000 patients who received the injectable have probably yet to develop symptoms.
Cases of fungal meningitis first began appearing last month, when people arrived at the ER of St. Thomas Hospital in Nashville, Tennessee with headaches, neck stiffness, nausea and other symptoms after receiving epidural injections for back pain at a separate clinic at the hospital.
Because the product was shipped as early as last May, it could be November before an influx of cases begin pouring in. That is, of course, unless health officials can contact these patients who have received the injections, get them seen by medical professionals, and possibly get them on voriconazole treatments to stave off any infection.
Perfect said, based on the 2002 outbreak, voriconazole is the best possible antifungal drug for initial treatment. “Due to the aggressive and deadly nature of the disease, it is important for physicians to act decisively and early,” he said.
He warned that these outbreaks will continue to occur if pharmacy societies, the FDA, and the pharmaceutical industry do not work together to regulate pharmacy compounding.
First Curiosity Martian Soil Sample Delivered
Written By: editor
Lee Rannals for redOrbit.com – Your Universe Online
Curiosity has taken its first Martian soil sample into its onboard laboratory in its search for signs of life on Mars.
The sample is being analyzed inside the rover’s Chemistry and Mineralogy (CheMin) instrument to determine what minerals the soil sample contains.
“We are crossing a significant threshold for this mission by using CheMin on its first sample,” according to Curiosity project scientist John Grotzinger of the California Institute of Technology in Pasadena. “This instrument gives us a more definitive mineral-identifying method than ever before used on Mars: X-ray diffraction. Confidently identifying minerals is important because minerals record the environmental conditions under which they form.”
The sample is a small portion of the third scoop taken by Curiosity at the site named “Rocknest.” NASA said the arm delivered the sample to Curiosity’s CheMin on Wednesday.
According to the space agency, material had been scooped into the sample-processing chambers on Tuesday to scrub their internal surfaces of any residue carried from Earth.
NASA also released images taken after Curiosity collected its scoops, one of which included a “bright object” that halted the rover’s efforts at its first soil analysis.
The bright object was at first believed to be part of the Curiosity rover itself, while a later analysis determined the smaller bright particles in the soil to be native Martian material.
“We plan to learn more both about the spacecraft material and about the smaller, bright particles,” said Curiosity Project Manager Richard Cook of NASA’s Jet Propulsion Laboratory, Pasadena. “We will finish determining whether the spacecraft material warrants concern during future operations. The native Mars particles become fodder for the mission’s scientific studies.”
Researchers Use Underwater Robot To Track Tagged Sharks
Researchers from University of Delaware are in the midst of a multiyear study with Delaware State University researchers to better understand the behavior and migration patterns of sand tiger sharks. In the latest phase of the study, an underwater robot is being unleashed to hunt down and follow these seemingly placid predators.
The Oceanographic Telemetry Identification Sensor (OTIS) is a remote-controlled underwater device that looks very much like a yellow torpedo. Normally used for testing water conditions, the researchers have outfitted OTIS with acoustic receivers that can recognize signals given off by transmitters. OTIS will track sharks that have been previously tagged with these transmitters as they travel through their coastal habitat.
Matthew Oliver, assistant professor of oceanography in UD's College of Earth, Ocean, and Environment, said OTIS has, “in the past week … detected multiple sand tiger sharks off the coast of Maryland that were tagged over the past several years.”
“This is the first time that a glider has found tagged sharks and reported their location in real time,” he said in a news release, reported by Teresa Messmore, a Communications Specialist for UD's College of Earth, Ocean, and Environment.
The technology implemented allows the course of OTIS to be changed, enabling it to follow the sharks and test the water around them. Using OTIS to track these sharks will allow scientists to follow where the sharks are going more efficiently than using conventional tracking methods.
OTIS will be tracking sharks with three different types of tags.
One is an acoustic transmitter that “pings” receivers while passing by a set of 70 devices situated around Delaware Bay. DSU's Dewayne Fox maintains these receivers, and has tagged more than 500 sharks since 2006.
Another tracking mechanism implemented is the pop-off satellite archival tag. The team is using 34 of these tags, which store data on the sharks' journeys for up to a year, then automatically release from the animal to dispatch a location signal for retrieval.
A third type of tag, also the newest, is called a VEMCO mobile transceiver (VMT). This tag is larger than the others, but also transmits and receives information to communicate its location and listen for the pings of other marine animals outfitted with acoustic tags.
The VMT tag “will tell us not only where it is, but who it's with,” said Oliver. “It's like a social network for sharks.”
Oliver, Fox and students from both universities spent the summer catching sand tiger sharks, carefully pulling them into stretchers alongside their boat, and inserting transmitters through quick surgeries.
Sand tiger sharks are the largest commonly occurring shark in Delaware's bay and coastal region. Although these marine animals are generally slow-moving and seem relatively placid, they are apex predators in their habitat and play a key role in the ecological balance of the region.
“Sand tigers have suffered from a number of threats that ultimately led to population declines,” said DSU's Fox. “In 1997 sand tigers were listed as a ‘species of concern’ by the National Marine Fisheries Service, although very little is known of their migrations and habitat requirements.”
Scientists have suspected that these sharks migrate widely along the Eastern Seaboard, and using newly collected information, the university teams plan to map these habitats, cross-referencing shark data with satellite and remotely sensed environmental conditions to create a comprehensive picture of the animals' habitats.
Oliver told Messmore the teams are integrating two areas of biotelemetry to better understand the behavior and migration patterns of these apex predators.
Fox is part of the Atlantic Cooperative Telemetry Network (ACT), which tracks thousands of animals up and down the coast; Oliver participates in the Mid-Atlantic Regional Association Coastal Ocean Observing System (MARACOOS), which uses satellites, underwater robots and models to study the coastal ocean.
By combining the two unique data sets, Oliver and Fox hope they will assist natural resource managers in predicting where sand tiger sharks live and how best to handle conservation efforts.
OTIS will be a big part of this, helping researchers find out which water conditions sharks prefer to swim in during their migrations. OTIS can travel much farther out than the static receivers' range, and can also collect information on a wide array of conditions, including water temperature, quality, clarity and oxygen levels.
The team hopes the data will give scientists a little more understanding as to why these sharks head to certain places, explained Oliver.
Oliver said the team took an educated guess as to where the sharks were hanging out when it launched OTIS for the first time last week, sending it off the coast of Delaware's Indian River Inlet and heading south. After five days, they began receiving transmissions from sharks about 4 to 9 miles off the coast of Assateague Island, Maryland.
Their next goal is to direct the glider to stay near the sharks, unless they move south of the lower Delmarva Peninsula. OTIS can last up to four weeks without recharging.
“We have at least another two weeks of battery,” Oliver said. “We´ll see how it develops.”
Russ George Releases 100 Tons Of Iron Into Pacific
Written By: editor
Brett Smith for redOrbit.com — Your Universe Online
In a plot that could have been yanked from the script of the upcoming James Bond film, American entrepreneur Russ George has released over 100 tons of iron sulphate into the Pacific Ocean in an attempt to foster a massive plankton bloom that would capture carbon dioxide and sink to the bottom of the ocean, thereby mitigating climate change, U.K. news organization The Guardian has reported.
Besides having a potential impact on global warming, the supposed ‘experiment’ could net George valuable carbon credits that he could then sell on the open market for a hefty sum.
Environmentalist groups are calling the dump a “blatant” violation of two international moratoria and warn that a massive plankton bloom could have unforeseen and irreversible effects.
George remained defiant and even optimistic about the project, asserting that an unnamed team of scientists is closely monitoring his geoengineering experiment with equipment loaned from NASA and the National Oceanic and Atmospheric Administration.
“We’ve gathered data targeting all the possible fears that have been raised (about ocean fertilization),” he said. “And the news is good news, all around, for the planet.”
George is the former chief executive of Planktos Inc, a geoengineering firm dedicated to “removing CO2 from our oceans and atmosphere by healing the seas, growing new climate forests, and erasing carbon footprints,” according to a statement online.
The defiant businessman had previously attempted similar large-scale commercial dumps near the Galapagos and Canary Islands, but those efforts led to his vessels being banned from ports by the Spanish and Ecuadorean governments.
Scientists are currently debating the value of the types of ‘experiments’ George is conducting, but have warned that the long-term effects of uncontrolled dumping could produce toxic tides and acidify the ocean.
“It is difficult if not impossible to detect and describe important effects that we know might occur months or years later,” John Cullen, an oceanographer at Dalhousie University told The Guardian. “Some possible effects, such as deep-water oxygen depletion and alteration of distant food webs, should rule out ocean manipulation. History is full of examples of ecological manipulations that backfired.”
The iron sulphate dump allegedly took place near the islands of Haida Gwaii off the coast of British Columbia. George convinced the local council to allow him to perform his experiment after telling them that it would benefit the ocean. The council also agreed to spend $1 million of its own funds on the project.
The potential abuse of both the environment and a small local government has many observers sounding the alarm on what could be a disastrous recipe for future geoengineers.
Back in 2009, an article in Foreign Affairs magazine predicted just such a scenario, warning that smaller developing nations could act unilaterally.
“A single country could deploy geoengineering systems from its own territory without consulting the rest of the planet,” the authors wrote.
The article also warns against using geoengineering as a quick fix for what could be increasing effects of global warming.
“At some point in the near future, it is conceivable that a nation that has not done enough to confront climate change will conclude that global warming has become so harmful to its interests that it should unilaterally engage in geoengineering,” says the Foreign Affairs article.
Website Makes Evolutionary Tree Of Life Digital And Interactive
Written By: admin
Brett Smith for redOrbit.com – Your Universe Online
Evolutionary biologists have long envisioned creating a diagram, or tree of life, that would detail how different species have evolved from a common ancestry, but the task has been a daunting one for taxonomists who would need multiple reams of paper or computer screens to clearly show the evolutionary descent of each species.
A research associate at the Imperial College London, however, has risen to the challenge and created OneZoom, an interactive website that allows users to navigate different branches of evolution by clicking and zooming in on different aspects of a virtual tree of life.
James Rosindell, from the Imperial College's Department of Life Sciences, and his partner Luke Harmon, a biologist at the University of Idaho, created the fractal-based tree in order to escape what they call the “paper paradigm,” a way of displaying data that is optimized for the printed page.
Inspired in part by Google Earth, the interface lets users zoom in on any point along the tree and see incrementally smaller groups of species. As users attempt to single out one particular species, say humans, they pass through the various classes, orders and genera that the species belongs to at different points along the way.
“OneZoom gives you a natural way to explore large amounts of complex information like the tree of life,” explained Rosindell. “It’s intuitive because it’s similar to the way we explore the real world by moving towards interesting objects to see them in more detail.”
Traditionally, the tree of life is depicted as starting with a thick trunk that represents the first life on earth. The trunk then diverges into large branches for the many different categories of life, such as plants and animals. These large branches then split into smaller branches to represent groups such as reptiles, birds and mammals.
OneZoom replicates this traditional illustration, with each branch leading to smaller branches and eventually to individual leaves that denote one particular species.
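The self-similar branching OneZoom exploits can be sketched with a simple recursive layout, in which each branch spawns two scaled-down children so unbounded detail fits in a bounded canvas. This is an illustrative toy, not OneZoom's actual code; all function names and parameters here are assumptions.

```python
import math

def fractal_tree(depth, x=0.0, y=0.0, angle=90.0, length=1.0,
                 spread=30.0, scale=0.7, segments=None):
    """Recursively generate line segments for a self-similar binary tree,
    in the spirit of a fractal tree-of-life layout: each branch spawns two
    child branches, shrunk by a fixed factor at every level."""
    if segments is None:
        segments = []
    if depth == 0:
        return segments
    rad = math.radians(angle)
    x2 = x + length * math.cos(rad)
    y2 = y + length * math.sin(rad)
    segments.append(((x, y), (x2, y2)))  # draw this branch
    # Two children: fan out left and right, scaled down.
    fractal_tree(depth - 1, x2, y2, angle - spread, length * scale,
                 spread, scale, segments)
    fractal_tree(depth - 1, x2, y2, angle + spread, length * scale,
                 spread, scale, segments)
    return segments

# A depth-d tree contains 2^d - 1 branches:
print(len(fractal_tree(5)))  # -> 31
```

Because branch lengths shrink geometrically, the whole tree occupies finite screen space no matter how deep it goes; zooming in simply re-renders a subtree at larger scale, which is the trick that lets millions of species fit on one page.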
Rosindell has also decided to include other functionality within the OneZoom website. Color and animations reveal the different dates and timescales associated with each branch. Search functionality provides users the ability to quickly find the branch or species they are looking for and metadata allows additional information to be displayed at certain points on the tree.
Currently, OneZoom depicts only the tree of mammals; however, Rosindell says the project will be expanded in scale over the next few years, which should coincide with the work currently being performed by other researchers.
“After decades of study, scientists are probably only a year away from having a first draft of the complete tree of life. It would be a great shame if having built it we had no way to visualize it,” Rosindell added.
According to an article written by Rosindell and Harmon that appeared in the journal PLoS ONE, they also plan to increase the amount of detail that will be included in the OneZoom tree.
“We envisage putting ‘microdots’ on the branches of the tree that, when zoomed into, show fossil images and other evidence backing up the hypothesized evolutionary path of that branch,” they wrote.
“A richly annotated IFIG may help make the evidence, logic, and beauty of evolution easy to explore and understand in a way that is compelling and fun.”
Researchers Say General Health Checkups Do Not Save Lives
Written By: editor
Lawrence LeBlond for redOrbit.com – Your Universe Online
Most people visit their doctors for regularly-scheduled general checkups with the notion that doing so provides them with the security of knowing they will live a long, happy life. But those people could be accepting false hope, according to Danish researchers who carried out a recent study.
The researchers found that patients who had general health checkups died of cardiovascular disease and cancer at virtually the same rate as those who skipped them. They noted that their findings show general health checkups not only fail to offer added security but may also cause undue stress, which may or may not have played a part in the results.
Analysis of 16 clinical trials involving 183,000 patients yielded mortality risk ratios of 1.01 and 1.03 for people who had general checkups versus those who did not, according to Lasse T. Krogsboll, of the Nordic Cochrane Centre in Copenhagen, and colleagues.
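A risk ratio like the 1.01 and 1.03 figures above is simply the event rate in one group divided by the rate in the other; a value near 1.0 means the two groups fared the same. The sketch below uses invented counts chosen to produce a similar null result; it is not data from the review.

```python
def risk_ratio(events_a, total_a, events_b, total_b):
    """Relative risk: the event rate in the screened group divided by
    the rate in the unscreened group. A ratio near 1.0 means screening
    made no detectable difference to mortality."""
    return (events_a / total_a) / (events_b / total_b)

# Hypothetical counts illustrating a null result like the review's:
# 1,010 deaths per 100,000 screened vs 1,000 per 100,000 unscreened.
rr = risk_ratio(1010, 100_000, 1000, 100_000)
print(round(rr, 2))  # -> 1.01
```

With ratios this close to 1.0, any apparent difference is well within what chance alone would produce, which is why the authors concluded the checkups showed no mortality benefit.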
Based on their findings, the researchers, who carried out the review for The Cochrane Library, are warning against offering general health checkups as part of a public health program.
In England, people between the ages 40 and 74 are offered free health checkups. The initiative was started in 2009 and was designed to spot conditions such as heart disease, stroke and diabetes by looking for silent risk factors such as high blood pressure and cholesterol. Ministers said they believed such measures would save at least 650 lives every year.
But the latest findings suggest these routine checkups are a waste of time.
These general health checks, intended to reduce deaths and ill health by enabling early detection and treatment of disease, could be leading to potentially negative implications, for example diagnosis and treatment of conditions that might have never led to symptoms or shortened life.
While a few of the trials analyzed in the study showed an increase in diagnoses after general health checks, the researchers noted that many of the trials were of poor quality. In one of the trials, it was also noted that patients who were offered free general health checks were already more likely to be diagnosed with high blood pressure or high cholesterol. In three other trials, large numbers of abnormalities were identified in the screened groups.
However, based on nine of the trials with a total of 11,940 deaths, the researchers found no difference between the number of deaths in the two groups in the long term, either overall or specifically due to cancer or heart disease. While other outcomes were poorly studied, the researchers suggested, based on the evidence garnered, offering general health checks had no impact on hospital admissions, disability, worry, referrals, additional checkups or time off work.
“From the evidence we’ve seen, inviting patients to general health checks is unlikely to be beneficial,” said Krogsboll. “One reason for this might be that doctors identify additional problems and take action when they see patients for other reasons.”
“What we’re not saying is that doctors should stop carrying out tests or offering treatment when they suspect there may be a problem. But we do think that public healthcare initiatives that are systematically offering general health checks should be resisted.”
He said any screening program should be able to prove the benefits outweigh any potential harm, something he said has not been proven in these trials.
But not everyone agrees, and a number of health experts still believe general health checkups save lives.
“By spotting people who are at risk of heart attacks, diabetes, stroke and kidney disease we can help prevent them,” a Department of Health representative told BBC News. “The NHS Health Check program is based on expert guidance. Everyone having a health check is offered tailored advice and support to manage or reduce their risk of developing serious health conditions.”
The researchers said that, despite their findings, more studies are needed. Future research should focus on the individual components of health checks and on better targeting of conditions such as kidney disease and diabetes, and should be designed to further explore the harmful effects of general health checks, which are often ignored and can produce misleading conclusions about the balance of benefits and harm.
Another problem is that those who attend health checks when invited may differ from those who do not; people at high risk of serious illness may be less likely to attend. Also, most of the trials in the study were old, "which makes the results less applicable to today's settings because the treatments used for conditions and risk factors have changed," concluded Krogsboll.
The researchers reported their findings online in the Cochrane Database of Systematic Reviews.
Anti-HIV Vaginal Ring Being Developed
Written By: editor
Connie K. Ho for redOrbit.com — Your Universe Online
Researchers have been delving into new human immunodeficiency virus (HIV) prevention technology. A joint effort by the University of Utah and CONRAD has resulted in the development of an intravaginal ring that women can use to stop the transmission of HIV through sex. Scientists believe it is the first product that allows for the long-lasting vaginal delivery of tenofovir.
Scientists at the University of Utah collaborated with CONRAD, a division of the Department of Obstetrics and Gynecology at Eastern Virginia Medical School in Virginia that focuses on reproductive health research and contraceptive development. In the study, the ring was used with sheep to test whether the release of the antiretroviral drug tenofovir was effective and safe during a 90-day period. Tenofovir is considered the only topical prophylactic that can lower the sexual transmission of HIV. Besides being presented at the 2012 American Association of Pharmaceutical Scientists (AAPS) Annual Meeting and Exposition in Chicago, the results of the study will be published in the 12th issue of Antimicrobial Agents and Chemotherapy.
“We have developed a new intravaginal ring technology based on rubbery hydrogel plastics that are loaded with antiretroviral drugs. We can engineer the plastic so it can release a small quantity of drug per day, or a much larger quantity, depending on the drug being delivered,” explained the study's lead investigator Patrick Kiser, a researcher at the University of Utah, in a prepared statement.
Vaginal rings have previously been used to help decrease HIV infection, and recent advances in materials and technologies have led to progress in the development of this prevention option.
“Most vaginal rings release a limited quantity of drug each day, but this ring can release quantities 1,000 times larger due to the selection of specific hydrophilic polymers with high permeability,” commented one of the study's authors, David Friend, Ph.D., the director of Product Development at CONRAD, in the statement. “This study showed that the ring releases at least 10 mg of tenofovir a day over 90 days, which makes it very possible that it can be effective in preventing HIV infection in women.”
The researchers believe the ring must be worn for 90 days to be effective. The intravaginal ring is made of rubbery, water-swellable plastic tubing filled with tenofovir and glycerol; the tube is then sealed and formed into a ring. The glycerol at the center of the ring helps draw fluid from the vagina and speeds delivery of the drug.
“We directly compared the ring to 1 percent tenofovir gel, and the ring resulted in similar, if not higher, levels of drug in the vaginal tissue,” continued Friend in the statement. “If the results in sheep hold up in humans, we would expect this ring to be highly protective against HIV.”
The team of investigators noted that the ring could also be adapted to deliver an anti-HIV agent along with a contraceptive. As such, it is considered a multi-purpose prevention technology. The group plans to push the product into the first clinical trial next year.
“We anticipate that this next-generation ring will be able to release a spectrum of drugs that currently cannot be delivered due to limitations of standard technology,” concluded Kiser in the statement. “This ring is a breakthrough design because it is highly adaptable to almost any drug; the amount of drug delivered each day is the same and the release rate can be modified easily if needed.”
Prescription Drug Abuse At All-time High For Teens
Written By: editor
Connie K. Ho for redOrbit.com — Your Universe Online
Researchers from the University of Colorado at Denver recently discovered that young people in the U.S. are abusing prescription pain medications at an alarming rate — 40 percent more than past generations to be exact.
Many of the adolescents have been abusing prescription drugs such as OxyContin, Valium and Vicodin. After marijuana, prescription drug abuse is the second leading form of illegal drug use in the country. The findings of the study were recently published in the Journal of Adolescent Health.
“Prescription drug use is the next big epidemic,” explained the study's lead author Robert Miech, a professor of sociology at CU Denver, in a prepared statement. “Everyone in this field has recognized that there is a big increase in the abuse of nonmedical analgesics but our study shows that it is accelerating among today's generation of adolescents.”
The team of investigators looked at data from the National Survey on Drug Use and Health, cross-sectional surveys done annually throughout the country to assess national drug use. They analyzed data specifically between 1985 and 2009. Based on the findings, the abuse of prescription medication is “higher than any generation ever measured” and is found throughout subgroups of females, males, non-Hispanic whites, non-Hispanic blacks, and Hispanics.
The epidemic of prescription drug abuse may be due to a variety of factors.
“The increasing availability of analgesics in the general population is well documented, as the total number of hydrocodone and oxycodone products prescribed legally in the U.S. increased more than fourfold from about 40 million in 1991 to nearly 180 million in 2007,” wrote the researchers in the study. “Higher prevalence of analgesics … among contemporary youth easier than in the past because more homes have prescription analgesics in their medicine cabinets.”
Another factor may be parents modeling drug use for their children.
“Youth who observe their parents taking analgesics as prescribed may come to the conclusion that any use of these drugs is OK and safe,” continued Miech in the statement.
As a result of the high use of prescription drugs, accidental deaths from overdoses of prescription medications now exceed overdose deaths from cocaine and heroin combined. Between 2004 and 2009, emergency room visits due to prescription drug use rose 129 percent, and between 1997 and 2007, the number of people in the U.S. seeking treatment for dependence on prescription drugs rose 500 percent.
Many students note that they receive the medications from their family members or friends.
“While most people recognize the dangers of leaving a loaded gun lying around the house,” noted Miech in the statement, “what few people realize is that far more people die as a result of unsecured prescription medications.”
Even with these negative consequences, the social cost of prescription drug abuse has received comparatively little attention.
“These results suggest that current policies and interventions are not yet effective enough to counter the factors that have increased nonmedical analgesic use among U.S. youth and the general population,” concluded Miech in the statement. “But it is critical that we devise a strategy to deal with an epidemic that shows little sign of ebbing.”
Federal Regulators Expand Sunland Peanut Butter Recall To Include All Nuts
Written By: editor
Lawrence LeBlond for redOrbit.com – Your Universe Online
Retail outlets across the country had removed Sunland nutty spreads from their store shelves following a recall of the company's peanut butter and other nut butters last month. The recall came after 35 people were sickened in 19 states from salmonella-contaminated peanut butter manufactured by the New Mexico-based peanut processor.
The recall, first announced late last month for peanut butter sold at Trader Joe's retail outlets, has now been expanded to include all peanut butters and raw and roasted peanuts marketed by Sunland Inc., the US Food and Drug Administration (FDA) said in a statement on its site. The agency added that Sunland has halted all production at its butter and peanut processing facilities and has issued voluntary recalls of its products.
The move came after FDA investigators looking into the contaminated peanut butter inspected Sunland's production line and found salmonella that DNA tests showed to be identical to the strain that caused last month's outbreak.
The list of affected products is continuing to grow, with more than 70 products added to Sunland's list of recalled items, including all nuts and peanut butters, as well as nutty spreads the company manufactures for brands such as Trader Joe's, Archer Farms, Treasured Harvest, and Natural Value.
The FDA has also released a list of recalled products that includes more than 400 product recalls from various retailers, as well as Sunland. The FDA list also shows recalls for products such as cookies, ice cream, and chocolate. Also, a statement on the FDA's recall site states that the recalled packages “are within their current shelf life or had no expiration date,” according to Sunland.
The Sunland statement read: “The raw and roasted peanuts available to retail customers were distributed primarily under the Company's own label … primarily to produce houses and nationally to numerous large retail chains … The products also were available for purchase on the internet. The roasted and roasted/salted peanuts being recalled were distributed during the six month period prior to the recall date (April 12, 2012 to October 12, 2012), and will have best by/expiration dates on the packaging from October 12, 2012 through April 12, 2013. The raw peanuts being recalled (shelled and in-shell) were distributed during the twelve months prior to the recall date (October 12, 2011 to October 12, 2012), and will have either best by dates from October 12, 2012 through October 12, 2013 on the packaging, or a ‘Crop Year’ marking on the package of 2011 or 2012, up to and including October 12, 2012.”
The US Centers for Disease Control and Prevention (CDC) said in its own statement that of the 35 cases of salmonella poisoning, eight have been hospitalized, but to date, there have been no deaths associated with this outbreak.
Salmonella usually causes diarrhea, fever and abdominal pain. It can be fatal for the elderly, young children and people with weakened immune systems.
Health experts have also urged California schools to destroy any and all peanut butter recalled by Sunland, to avoid possible sickness and death. As of last Friday, the California Department of Education said there were no reported cases of salmonella among its schoolchildren.
The recall has also depleted peanut butter supplies at food banks, which have received more than 23,000 cases of the affected product.
Customers with recalled products should throw them away or return them for a full refund. Those seeking further information can call Sunland 24 hours a day at 1-866-837-1018.
Meanwhile, leading brands such as Skippy, Jif and Peter Pan are not included in the recall and remain safe to eat and feed to children.