Earth’s Water Has Been Here Far Longer Than Previously Believed, Claims New Study

Chuck Bednar for redOrbit.com – Your Universe Online
The water that covers over 70 percent of Earth's surface formed just 14 million years after the birth of the solar system – much earlier than previously believed, according to a new study led by Woods Hole Oceanographic Institution (WHOI) scientists and published online Friday in the journal Science.
“The answer to one of the basic questions is that our oceans were always here. We didn’t get them from a late process, as was previously thought,” lead author Adam Sarafian, an MIT/WHOI Joint Program student in the Geology and Geophysics Department, said in a statement.
One commonly held belief, according to the researchers, was that Earth and other worlds were completely dry when they formed due to the fact that planetary formation is a high-energy and high-impact process. Under this hypothesis, water would have arrived later from comets or “wet” asteroids composed of ice and gases.
“With giant asteroids and meteors colliding, there’s a lot of destruction,” noted co-author Horst Marschall, a geologist at WHOI. “Some people have argued that any water molecules that were present as the planets were forming would have evaporated or been blown off into space, and that surface water, as it exists on our planet today, must have come much, much later – hundreds of millions of years later.”

Image Above: In this illustration of the early solar system, the dashed white line represents the snow line – the transition from the hotter inner solar system, where water ice is not stable (brown), to the outer solar system, where water ice is stable (blue). Two possible ways that the inner solar system received water are: water molecules sticking to dust grains inside the “snow line” (as shown in the inset) and carbonaceous chondrite material flung into the inner solar system by the gravity of proto-Jupiter. In either scenario, water must accrete to the inner planets within the first ca. 10 million years of solar system formation. (Illustration by Jack Cook, Woods Hole Oceanographic Institution)
Sarafian, Marschall and their colleagues opted instead to examine another potential source of Earth’s water – ancient, unaltered meteorites known as carbonaceous chondrites, which formed in the same cloud of dust, ice and gas that gave rise to the sun roughly 4.6 billion years ago, long before the planets formed. Carbonaceous chondrites resemble the bulk composition of the solar system, contain a lot of water and have previously been considered as candidates for the origin of Earth’s water, the study authors noted.
“In order to determine the source of water in planetary bodies, scientists measure the ratio between the two stable isotopes of hydrogen: deuterium and hydrogen,” WHOI explained. “Different regions of the solar system are characterized by highly variable ratios of these isotopes. The study’s authors knew the ratio for carbonaceous chondrites and reasoned that if they could compare that to an object that was known to crystallize while Earth was actively accreting then they could gauge when water appeared on Earth.”
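To make that comparison concrete: isotope geochemists conventionally report deuterium-to-hydrogen measurements in "delta" notation relative to the VSMOW ocean-water standard. The short sketch below shows the arithmetic; the sample values are illustrative only, not the study's actual data.

```python
# Illustrative sketch of standard "delta" notation for hydrogen
# isotopes; the sample values below are made up for demonstration.

VSMOW_D_H = 1.5576e-4  # D/H ratio of Vienna Standard Mean Ocean Water

def delta_d_permil(sample_d_h):
    """deltaD (per mil) of a sample relative to the VSMOW standard."""
    return (sample_d_h / VSMOW_D_H - 1.0) * 1000.0

print(delta_d_permil(1.55e-4))  # near 0: ocean-like, chondrite-like water
print(delta_d_permil(3.1e-4))   # ~ +990 per mil: deuterium-rich, comet-like
```

A sample whose deltaD sits near that of carbonaceous chondrites, as the eucrites described below do, points toward a common water reservoir.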
As part of their research, the team analyzed meteorite samples from the planetoid Vesta, explained Irene Klotz of Discovery News. The samples from Vesta, which were provided to the team by NASA and are known as eucrites, showed hydrogen isotope ratios matching those found in carbonaceous chondrites, and previous research revealed that those carbonaceous chondrites match the chemical fingerprints of Earth’s hydrogen.
The chemical signatures of the eucrites belong to one of the oldest hydrogen reservoirs in the solar system, and according to the study authors, their age (14 million years after the formation of the solar system) makes them ideal tracers of the water present in the inner solar system while the Earth was forming. The research team analyzed five different samples, and by combining those results with nitrogen isotope data, they concluded that the carbonaceous chondrites were indeed the most likely common source of water for both Earth and Vesta.
Based on that conclusion, Sarafian and his colleagues argue that the origin of water actually dates back to roughly 4.6 billion years ago, during a time when the planets of the inner solar system were still forming, according to National Geographic reporter Andrew Fazekas. While they are not eliminating the possibility that some of Earth’s water may have arrived later and from a different source, their findings indicate that there would have been enough H2O available on the planet for life to have begun earlier than previously believed, Fazekas added.
“An implication of that is that life on our planet could have started to begin very early,” explained co-author Sune Nielsen, an assistant scientist in the WHOI geology and geophysics program. “Knowing that water came early to the inner solar system also means that the other inner planets could have been wet early and evolved life before they became the harsh environments they are today.”
Related Reading:
> Meteorites Source Of Earth’s Water, New Study Suggests
> Meteorite – Universe Reference Library

Does Milk Really Give You Stronger Bones? New Research Questions Long-Held Beliefs

April Flowers for redOrbit.com – Your Universe Online

How many times were you told as a child that drinking your milk would give you strong bones? Turns out that might not be true.

A new study led by Uppsala University and published in The British Medical Journal reveals that drinking more milk not only doesn’t lower your risk of fractures, it might raise your risk of death.

The research team, which included members from Karolinska Institute and Uppsala University, believes their findings might be explained by the high levels of lactose and galactose (types of sugar) in milk. These sugars have been shown to increase oxidative stress and chronic inflammation in animal studies.

They caution, however, that their results can only show an association, not prove cause and effect. They recommend further studies before any firm conclusions or dietary recommendations can be made.

“A diet rich in milk products is promoted to reduce the likelihood of osteoporotic fractures,” writes Emma Dickinson for the British Medical Journal. “But previous research looking at the importance of milk for the prevention of fractures and the influence on mortality rates show conflicting results.”

The new study, led by Uppsala University’s Professor of Surgical Sciences Karl Michaëlsson, investigated whether oxidative stress, which affects the risk of mortality and fracture, would increase due to a high milk intake.

Food frequency questionnaires for 96 common foods — including milk, yogurt and cheese — were completed by two large groups of participants in Sweden. The first group included 61,433 women aged 39 to 74 in 1987-1990. The second group was made up of 45,339 men aged 45 to 79 in 1997.

The researchers also collected lifestyle data, weight, height and factors such as education level and marital status. Fracture and mortality rates were tracked through national registers.

The female group was tracked for an average of 20 years. During this period, 15,541 died and 17,252 suffered a fracture — 4,259 of those fractures were hip fractures. The researchers found no reduction in fracture risk for the women with higher milk consumption. In fact, women who drank more than three glasses of milk a day (average of 23 oz) had a higher risk of death than women who drank less than one glass of milk a day (average of 2 oz).

The male participants were tracked for an average of 11 years. During this time, 10,112 men died and 5,066 suffered fractures. Of those fractures, 1,166 were hip fractures. The risk of death was also higher in men who consumed large quantities of milk, but not as pronounced a difference as with women.
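Those raw counts can be turned into rough rates. Assuming, crudely, that every participant contributed the full average follow-up (real person-time varies per person, so this is ballpark only), the figures quoted above work out to roughly 13 deaths per 1,000 person-years among the women and about 20 among the men:

```python
# Rough, illustrative arithmetic from the figures quoted above; real
# person-time differs per participant, so these are only ballpark rates.

cohorts = {
    "women": {"n": 61_433, "years": 20, "deaths": 15_541, "fractures": 17_252},
    "men":   {"n": 45_339, "years": 11, "deaths": 10_112, "fractures": 5_066},
}

for name, c in cohorts.items():
    person_years = c["n"] * c["years"]  # crude upper bound on person-time
    print(f"{name}: ~{1000 * c['deaths'] / person_years:.1f} deaths "
          f"and ~{1000 * c['fractures'] / person_years:.1f} fractures "
          f"per 1,000 person-years")
```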

The research team identified a positive association between milk intake and biomarkers of oxidative stress and inflammation. Strangely, they also found that a high intake of fermented milk products with low lactose content, such as yogurt and cheese, was associated with reduced rates of mortality and fracture, especially in women.

“Our results may question the validity of recommendations to consume high amounts of milk to prevent fragility fractures,” they write. “The results should, however, be interpreted cautiously given the observational design of our study. The findings merit independent replication before they can be used for dietary recommendations.”

Professor Mary Schooling at City University of New York said in an accompanying editorial that the new study raises a fascinating possibility about the potential harms of milk. She stresses, however, that assessing diet precisely is difficult.

“As milk consumption may rise globally with economic development and increasing consumption of animal source foods, the role of milk and mortality needs to be established definitively now,” she concluded.

Related Reading:

> Ancient Europeans Were Lactose Intolerant For Thousands Of Years
> Study Re-examines The Evolutionary Origins Of Lactose Tolerance


Scientists Solve The Mystery Of Why Scratching Makes You Feel Even Itchier

Chuck Bednar for redOrbit.com – Your Universe Online
Scratching an itch should make you feel better, but often it only intensifies the feeling – and now scientists from Washington University School of Medicine in St. Louis have discovered what is responsible for this paradox.
Writing in the latest edition of the journal Neuron, senior investigator Zhou-Feng Chen, director of the university’s Center for the Study of Itch, and his colleagues found that the act of scratching causes the brain to release serotonin, which intensifies the itch sensation.
According to BBC News, Chen’s team’s research in mice found that these so-called scratch cycles become harder to break as more serotonin is released into the system. While their work has yet to be tested in humans, their findings indicate that blocking specific serotonin receptors in the spine could reduce chronic itching, and dermatologists believe it could lead to effective itch control.
Scientists have long known that scratching creates a mild amount of pain in the skin, the researchers said, and that pain can halt itching temporarily by causing nerve cells in the spinal cord to carry pain signals to the brain instead of itch signals. While serotonin’s role in pain control has long been known, this study marks the first time that its release from the brain has been associated with the sensation of itch.
“The problem is that when the brain gets those pain signals, it responds by producing the neurotransmitter serotonin to help control that pain,” Chen said in a statement Thursday. “But as serotonin spreads from the brain into the spinal cord, we found the chemical can ‘jump the tracks,’ moving from pain-sensing neurons to nerve cells that influence itch intensity.”
The research team, which also included scientists from the University of Toledo, Wuhan University, the Chinese Academy of Sciences, Guangzhou Medical University, the University of California, and the Xi’an Jiaotong University School of Medicine, genetically engineered mice that lacked the genes required to make serotonin.
When those rodents were injected with a substance that normally causes the skin to become itchy, the mice did not scratch as much as their unmodified littermates. However, when the genetically altered mice were injected with serotonin, they scratched as much as regular mice do in response to compounds designed to induce itching.
“So this fits very well with the idea that itch and pain signals are transmitted through different but related pathways,” explained Chen, a professor of anesthesiology, psychiatry and developmental biology at the university. “Scratching can relieve itch by creating minor pain. But when the body responds to pain signals, that response actually can make itching worse.”
However, Chen told BBC News that it is not feasible to completely block serotonin release in humans, since the chemical plays a key role in growth, aging, bone metabolism and mood. Even so, the research does indicate that disrupting the communication between serotonin and the cells responsible for transmitting itch signals to the brain could well be one of the most promising ways of controlling chronic itching, the British news organization added.
“We always have wondered why this vicious itch-pain cycle occurs,” the professor said. “Our findings suggest that the events happen in this order. First, you scratch, and that causes a sensation of pain. Then you make more serotonin to control the pain. But serotonin does more than only inhibit pain. Our new finding shows that it also makes itch worse by activating GRPR neurons through 5HT1A receptors.”
Related Reading:
> The Molecule That Makes You Itch
> Knockout Mouse – Genetically Modified Organisms Reference Library

Ghost Light From Distant Dead Galaxies Detected By Hubble Telescope

Chuck Bednar for redOrbit.com – Your Universe Online

Just in time for Halloween, the Hubble Space Telescope has detected the faint ghost light of stars that were ejected billions of years ago from an immense collection of ancient, now-dead galaxies known as Pandora’s Cluster, NASA and Space Telescope Science Institute officials announced on Thursday.

Pandora’s Cluster, which is also known as Abell 2744, is an immense grouping of nearly 500 galaxies, and the ghostly glow detected by Hubble was emitted by scattered stars that had been expelled from galaxies – galaxies which themselves had been gravitationally torn apart several billion years ago, according to the US space agency.
The orphaned stars, which are located four billion light-years from Earth, are no longer bound to a single galaxy and drift freely from one to another in the cluster. By observing their light, Hubble astronomers have managed to gather forensic evidence suggesting that up to six galaxies were torn to pieces in the cluster over a period of six billion years. Their findings have been published in The Astrophysical Journal.
Computer modeling of the gravitational dynamics among galaxies in a cluster suggests that the stars originated from galaxies approximately the same size as the Milky Way. Those galaxies, the study authors explained, would have been pulled apart if they traveled through the center of the galaxy cluster, where the strongest gravitational tidal forces are found.
“The Hubble data revealing the ghost light are important steps forward in understanding the evolution of galaxy clusters,” research team member Ignacio Trujillo of the Instituto de Astrofísica de Canarias (IAC), La Laguna, Tenerife, Spain said in a statement. “It is also amazingly beautiful in that we found the telltale glow by utilizing Hubble’s unique capabilities.”
While astronomers have long hypothesized that they should be able to detect the light from scattered stars left behind after galaxies become disassembled, it was difficult to detect this anticipated “intracluster” glow of stars because of how faint it was. The researchers estimate that the combined light of approximately 200 billion outcast stars contributes nearly one-tenth of the cluster’s brightness.
“Because these extremely faint stars are brightest at near-infrared wavelengths of light, the team emphasized that this type of observation could only be accomplished with Hubble’s infrared sensitivity to extraordinarily dim light,” NASA explained. “Hubble measurements determined that the phantom stars are rich in heavier elements like oxygen, carbon, and nitrogen. This means the scattered stars must be second or third-generation stars enriched with the elements forged in the hearts of the universe’s first-generation stars.”
“Spiral galaxies – like the ones believed to be torn apart – can sustain ongoing star formation that creates chemically-enriched stars,” the US space agency added. “Weighing more than 4 trillion solar masses, Abell 2744 is a target in the Frontier Fields program. This ambitious three-year effort teams Hubble and NASA’s other Great Observatories to look at select massive galaxy clusters to help astronomers probe the remote universe.”
Galaxy clusters are so massive that their gravity deflects light that passes through them. As a result, that light becomes brighter, magnified and distorted due to what is known as gravitational lensing. This phenomenon is exploited by astronomers, who use these clusters like a zoom lens to get a better look at distant galaxies that would otherwise be too faint for even telescopes as powerful as Hubble to detect.
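For readers who want a feel for the numbers, the strength of a lens is often summarized by its Einstein radius, θ_E = √(4GM/c² · D_ls/(D_l·D_s)). The sketch below plugs in the mass quoted above and round, assumed distances (the cluster sits about four billion light-years, roughly 1.2 gigaparsecs, away), ignoring the cosmological subtleties of how distances combine, so the answer is order-of-magnitude only:

```python
import math

# Back-of-the-envelope Einstein radius for a gravitational lens,
# theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s)). Distances are rough
# illustrative values (cosmological distances do not simply add, so
# treat the result as order-of-magnitude only).

G, c = 6.674e-11, 2.998e8            # SI units
M_SUN = 1.989e30                     # kg
GPC = 3.086e25                       # one gigaparsec in meters

M = 4e12 * M_SUN                     # ~4 trillion solar masses, as quoted
D_l, D_s = 1.2 * GPC, 2.4 * GPC      # lens and source distances (assumed)
D_ls = D_s - D_l                     # lens-to-source distance (approximation)

theta_e = math.sqrt(4 * G * M / c**2 * D_ls / (D_l * D_s))
print(f"Einstein radius ~ {math.degrees(theta_e) * 3600:.1f} arcseconds")
```

Even this crude estimate lands at a few arcseconds, which is why background galaxies behind a massive cluster appear visibly stretched and magnified.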
Related Reading:
> Astronomy – Fields of Science Reference Library
> Milky Way Galaxy – Stellar Bodies Reference Library
> Gravitational Lens – Universe Reference Library

Mediterranean Diet May Be Linked With Decreased Risk Of Developing Chronic Kidney Disease

Provided by Tracy Hampton, American Society of Nephrology

Adhering to a Mediterranean-style diet may significantly reduce the risk of developing chronic kidney disease, according to a study appearing in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN).

Chronic kidney disease is a growing epidemic, and while there has been significant progress in protecting against kidney disease and its progression through aggressive treatment of risk factors such as hypertension and diabetes, many people still experience declining kidney function as they age. Minesh Khatri, MD (Columbia University Medical Center) and his colleagues wondered whether an improved diet might provide additional benefits.

“Many studies have found a favorable association between the Mediterranean diet and a variety of health outcomes, including those related to cardiovascular disease, Alzheimer’s disease, diabetes, and cancer, among others,” said Dr. Khatri. “There is increasing evidence that poor diet is associated with kidney disease, but it is unknown whether the benefits of a Mediterranean diet could extend to kidney health as well.” The Mediterranean diet includes higher consumption of fruits, vegetables, fish, legumes, and heart-healthy fats, while minimizing red meats, processed foods, and sweets.

The researchers examined the associations of varying degrees of adherence to the Mediterranean diet with long-term kidney function in an observational, community-based, prospective study. In their analysis of 900 participants who were followed for nearly 7 years, every one-point increase in Mediterranean diet score, indicating better adherence to the diet, was associated with a 17% lower likelihood of developing chronic kidney disease. Dietary patterns that closely resembled the Mediterranean diet (with a score of ≥5) were linked with a 50% lower risk of developing chronic kidney disease and a 42% lower risk of experiencing rapid kidney function decline.
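To see how a per-point association scales, note that a 17% lower likelihood per point corresponds to a relative risk of about 0.83 per point. Naively compounding that across several points is a simplification (the study's 50% figure comes from its own model and reference group, so it need not match), but it shows the arithmetic:

```python
# Illustrative arithmetic only: how a per-point relative risk compounds.
# The study's 50% figure for scores >= 5 comes from its own model and
# reference group, so it need not equal the naive compounding below.

per_point_rr = 0.83          # 17% lower likelihood per one-point increase

for points in range(1, 6):
    rr = per_point_rr ** points
    print(f"+{points} points: ~{(1 - rr) * 100:.0f}% lower risk "
          f"(naive compounding)")
```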

In an accompanying editorial, Julie Lin, MD, MPH, FASN (Brigham and Women’s Hospital) noted that a Mediterranean-style diet is only one component of an overall healthy lifestyle, which also needs to incorporate regular physical activity. “Although a seemingly simple goal, achieving this is challenging. We need to begin by embracing the reality that there is no magic pill or miracle food, only vigilance and discipline with diet and regular exercise, and the rare indulgence in cake for very special occasions,” she wrote.


Ghrelin Stimulates Appetite For Drinking Alcohol

Provided by Rhiannon Bugno, Elsevier Editorial Office

Ghrelin, a hormone released by the stomach, stimulates appetite and food intake. Alcohol is commonly viewed as a psychoactive substance that primarily affects brain function, but it is also a highly caloric food.

This knowledge, combined with findings from animal studies, led researchers to the hypothesis that ghrelin has the potential to stimulate alcohol craving.

Dr. Lorenzo Leggio and his colleagues tested this in humans and found that, as they had anticipated, alcohol craving was increased in heavy drinkers following administration of ghrelin. Their work is published in the current issue of Biological Psychiatry.

“This study provides a direct translation on the role of ghrelin in alcohol-seeking behaviors in humans from previous research conducted in rodents,” said Dr. Leggio, Clinical Investigator in the National Institute on Alcohol Abuse and Alcoholism (NIAAA) and the National Institute on Drug Abuse at the National Institutes of Health. Dr. Leggio is also Chief of the Section on Clinical Psychoneuroendocrinology and Neuropsychopharmacology, in NIAAA’s Laboratory of Clinical and Translational Studies.

The study was conducted in the laboratory, where 45 men and women, all of whom were alcohol-dependent, heavy-drinking individuals not seeking treatment, were randomized to receive one of three different doses of ghrelin. One of those doses, at 0 mcg/kg, served as a placebo.

Following intravenous administration of the drug, the volunteers then completed a cue-reactivity task, during which they were exposed to both neutral and alcohol cues. Throughout the laboratory session, their craving (e.g., urge to drink) for alcohol or juice was repeatedly assessed.

Compared to placebo, ghrelin significantly increased alcohol craving, but had no effect on urge to drink juice. There were no differences in reported side effects between those who received placebo versus those who received ghrelin.

Dr. John Krystal, Editor of Biological Psychiatry, commented, “This study sheds new light on a role for ghrelin in alcohol craving, raising the possibility that ghrelin signaling might be targeted by future treatments for alcohol use disorders.”

Leggio added, “There is a crucial need to identify neurobiological pathways linked to alcohol craving that may help in the development of novel effective medications aimed to reduce excessive alcohol use. In this context, future studies may explore the potential of blocking ghrelin signaling as a new promising treatment for alcoholism.”

The article is “Intravenous Ghrelin Administration Increases Alcohol Craving in Alcohol-Dependent Heavy Drinkers: A Preliminary Investigation” by Lorenzo Leggio, William H. Zywiak, Samuel R. Fricchione, Steven M. Edwards, Suzanne M. de la Monte, Robert M. Swift, and George A. Kenna (doi: 10.1016/j.biopsych.2014.03.019). The article appears in Biological Psychiatry, Volume 76, Issue 9 (November 1, 2014), published by Elsevier.


Can Social Media Help Stop The Spread Of HIV?

Provided by Mary Beth O’Leary, Cell Press
In addition to providing other potential benefits to public health, all of those tweets and Facebook posts could help curb the spread of HIV.
Although public health researchers have focused early applications of social media on reliably monitoring the spread of diseases such as the flu, Sean Young of the Center for Digital Behavior at the University of California, Los Angeles, writes in an October 29th article in the Cell Press journal Trends in Microbiology of a future in which social media might predict and even change biomedical outcomes.
“We know that mining social media will have huge potential benefits for many areas of medicine in the future, but we’re still in the early stages of testing how powerful these technologies will be,” Young said.
With the right tools in place, he says, social media offers a rich source of psychological and health-related data generated in an environment in which people are often willing to share freely.
His recent work on Behavioral Insights on Big Data (BIBD) for HIV offers the tantalizing possibility that insights gleaned from social media could be used to help governments, public health departments, hospitals, and caretakers monitor people’s health behaviors “to know where, when, and how we might be able to prevent HIV transmission.”
Young details a social-media-based intervention in which African American and Latino men who have sex with men shared a tremendous amount of personal information through social media, including when or whether they had ‘come out,’ as well as experiences of homelessness and stigmatization. What’s more, the researchers found that people who discussed HIV prevention topics on social media were more than twice as likely to later request an HIV test.
In the context of HIV prevention, tweets have also been shown to identify people who are currently or soon to engage in sexual- or drug-related risk behaviors. Those tweets can be mapped to particular locations and related to actual HIV trends.
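The underlying mechanics are simple in principle: filter a tweet stream for risk-related language, then aggregate the matches by location. Here is a deliberately naive, hypothetical sketch (not the researchers' code, and with an invented keyword list) that also hints at why real systems need more sophisticated language processing:

```python
from collections import Counter

# Hypothetical sketch of the kind of pipeline described above: flag
# tweets containing risk-related keywords and aggregate them by
# location. This is not the researchers' actual code or keyword list.

RISK_TERMS = {"party and play", "slamming", "unprotected"}  # illustrative

def flag_risky(tweets):
    """Yield (location, text) for tweets matching any risk term."""
    for t in tweets:
        text = t["text"].lower()
        if any(term in text for term in RISK_TERMS):
            yield t["geo"], t["text"]

tweets = [
    {"text": "Great unprotected bike ride today", "geo": "Los Angeles"},
    {"text": "Lunch downtown", "geo": "Chicago"},
]

counts = Counter(geo for geo, _ in flag_risky(tweets))
print(counts)  # the false positive shows why naive matching isn't enough
```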
What’s needed now is the updated infrastructure and sophisticated toolkits to handle all of those data, Young said, noting that there are about 500 million communications sent every day on Twitter alone. He and a team of University of California computer scientists are working to meet that challenge now.
Although privacy concerns about such uses of social media shouldn’t be ignored, Young says there is evidence that people have already begun to accept such uses of social media, even by corporations looking to boost profits.
“Since people are already getting used to the fact that corporations are doing this, we should at least support public health researchers in using these same methods to try and improve our health and well being,” he said. “We’re already seeing increased support from patients and public health departments.”

Microsoft Officially Unveils Its First-Ever Wearable Device: The $199 Microsoft Band

Chuck Bednar for redOrbit.com – Your Universe Online
Microsoft has officially unveiled a device that will allow users to monitor their heart rate, calories, sun exposure and other fitness-related information, marking the company’s first entry into the wearable technology market.
Reports of the wrist-worn device, which can function for up to two full days on a single charge, first surfaced less than two weeks ago. At the time, it was unknown what the name and cost of the device would be, but according to Reuters, the Redmond, Washington tech giant has since revealed that the new Microsoft Band will cost $199.
Furthermore, limited quantities of the gadgets will go on sale Thursday at both Microsoft’s physical and online stores, said Ina Fried of Re/Code.
Microsoft Band, which Fried said somewhat resembles the Samsung Gear Fit, will feature 10 tracking sensors that will measure heart rate, sun exposure and stress levels. She added that it will connect to a Microsoft Health fitness tracking service, and can be used on iOS and Android devices through a companion app.
Mike Beasley of 9to5Mac reported Wednesday that Microsoft Band would feature a 310 x 102 resolution display, 132 different backgrounds, and the ability to get phone notifications and create reminders using Microsoft’s Cortana virtual assistant. Furthermore, he revealed that the Microsoft Health app would use the device’s hardware to monitor health-related data such as the number of steps taken, heart rate and even sleep quality.
Screenshots for the device were leaked on the Mac App Store prior to the announcement, according to CNET’s Steven Musil, and the company had revealed via social media that it would be hosting fitness-related activities and prizes at all US Microsoft stores on Thursday starting at 10am local time. Since then, the official announcement has been made.
“Wearable devices such as smartwatches and smart glasses have commanded a great deal of consumers’ attention and manufacturers’ imagination in recent months,” Musil said. “But Microsoft seems to be focused on one of the key selling points that other players in the crowded smartwatch arena have already seized upon: health.”
While Microsoft rivals Apple, Google and Samsung have all entered the fitness-related market with their own devices, BBC News noted that Microsoft Band is somewhat distinctive because it will function with all major mobile operating systems and will connect with Facebook and Twitter.
“Consumers now have an overwhelming choice of health-related cloud platforms to choose from,” CCS Insight analyst Ben Wood told BBC News. “It’s going to be a tough decision to choose whether to place their loyalty with Apple, Google or Microsoft given the immaturity of all three platforms. Furthermore, once they choose a platform they risk locking themselves into a long term commitment if they want to keep a lifetime of health-related data in one place.”
This is not Microsoft’s first foray into the realm of digital health tracking, Fried said. In 2007, the company launched HealthVault. However, HealthVault is more focused on medical records than personal fitness data, she added, and the company reports that the new Microsoft Band service can actually connect with HealthVault as well.
“Microsoft hopes the features will grow over time. It’s working with a bunch of partners, including MapMyFitness, RunKeeper, Jawbone and Starbucks, with the latter allowing users to pay for their coffee with a gift card barcode on the watch,” Fried added, noting that weather and stocks are also available through the new device. “A broader software development kit is planned for January.”

Deep DNA Sequencing Study Almost Quadruples Number Of Genes Linked To Autism

Chuck Bednar for redOrbit.com – Your Universe Online
By using deep DNA sequencing, an international team of researchers led by the Autism Sequencing Consortium (ASC) has increased the number of genes definitively linked to autism spectrum disorder (ASD) from nine to 33.
The research, published Wednesday in the journal Nature, analyzed data on several types of rare genetic differences in over 14,000 DNA samples from parents, autistic children and unrelated individuals, according to the study authors. In addition to quadrupling the number of definitive autism genes discovered to date, the researchers also found more than 70 other likely ASD genes.
Analyzing the data at three different universities across the country, researchers from over three dozen institutions found seven genes with mutations in three or more children with autism, implicating those genes in the disorder with near-certainty. Another 20 genes with mutations were found in two children each, giving each of them a better than 90 percent chance of being a true autism gene, the authors reported in their paper.
The majority of the newly identified mutations are de novo mutations, meaning that they are not present in unaffected parents’ genomes, but appear spontaneously in a single sperm or egg cell just prior to conception of a child, the research team explained. The implicated genes fall into three broad classes: those involved in the formation and function of synapses, those that regulate transcription, and those involved in the cellular packaging of DNA.
The first group helps with the creation and operation of sites of nerve-cell communication in the brain, while the second regulates how the instructions in other genes are relayed to the protein-making machinery in cells, and the third impacts how genetic information is wound up and packed into cells in a structure known as chromatin. Mutations that alter chromatin and affect transcription are believed to affect the activity of many genes, the researchers noted.
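Conceptually, spotting a de novo candidate in sequencing data is a set comparison across a parent-child trio: does the child carry an allele seen in neither parent? A minimal, hypothetical sketch of that core check (real pipelines work from aligned reads with extensive quality filtering, and the sites below are invented) might look like:

```python
# Minimal sketch of calling candidate de novo variants from trio
# genotypes (child, mother, father). Real pipelines work from aligned
# reads and apply quality filters; this only shows the core logic.

def de_novo_candidates(trio_genotypes):
    """trio_genotypes: {site: (child_gt, mom_gt, dad_gt)}, each genotype
    a set of alleles, e.g. {"A", "G"}. Returns sites where the child
    carries an allele seen in neither parent."""
    hits = []
    for site, (child, mom, dad) in trio_genotypes.items():
        novel = child - (mom | dad)
        if novel:
            hits.append((site, novel))
    return hits

trio = {  # hypothetical example sites
    "chr2:166,234": ({"A", "G"}, {"A"}, {"A"}),   # G absent in parents
    "chr7:5,012":   ({"C", "T"}, {"C", "T"}, {"C"}),
}
print(de_novo_candidates(trio))  # [('chr2:166,234', {'G'})]
```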
“We have a set of genes for which now, if people see a likely gene-disrupting mutation when sequencing a young child, there’s a high risk of the child developing autism, and that, to my mind, is pretty powerful stuff,” the University of Washington’s Evan Eichler, who leads one of the laboratories involved in the study, said in a statement. “Recognizing this early on may allow for earlier interventions, such as behavioral therapies, improving outcomes in children.”
“Our findings lend new weight to the hypothesis that there are specific functional categories of genes – likely conserved by evolution in development of the human neurological system and brain – that strongly contribute to autism’s causation, such as genes expressed during embryonic development and genes that encode proteins that remodel chromatin, the bundles in which our DNA is stored,” added Michael Ronemus of Cold Spring Harbor Laboratory.
This marks the first time that the scientists were able to assess the impact of both inherited genetic differences and the spontaneous de novo ones. While slight, rare genetic differences in over 100 top genes were found to increase a person’s risk by a relatively sizable amount, the research team said that looking at the prevalence of those variations allowed them to predict slight differences in up to 1,000 others that could eventually be found to increase ASD risk.

Most Experts Believe Cyber Attacks Will Increase Over The Next Decade

Chuck Bednar for redOrbit.com – Your Universe Online
The majority of computer experts and Internet builders predict that a major cyber attack causing widespread damage will occur by the year 2025, according to a new report from the Pew Research Center and Elon University’s Imagining the Internet Center.
As part of their research, the authors of the study asked 1,642 experts whether a major cyber attack would take place within the next decade – one harming national security, causing significant loss of life, or resulting in theft or property damage in the range of tens of billions of dollars. Sixty-one percent of the respondents believed such an attack would occur.
Among those individuals, Pew said that there were four key themes: the Internet is critical infrastructure for energy, banking, transportation and national defense activities, making it an attractive target to terrorists; security is typically not the primary concern in the design of Internet applications; there is a history of cyber attacks such as the Stuxnet worm; and sectors such as finance and power grids are vulnerable.
“The Internet was not built for security, yet we have made it the backbone of virtually all private-sector and government operations, as well as communications,” Washington-area lawyer Joel Brenner, a fellow at the Center for International Studies at MIT, wrote in a column published by the Washington Post last Friday. “Pervasive connectivity has brought dramatic gains in productivity and pleasure but has created equally dramatic vulnerabilities.”
Pew said that there was “considerable agreement” among the experts they polled that the accounts and identities of individual Internet users would be more vulnerable to future cyber attacks, and that businesses would be “persistently” under siege from such attacks. Many respondents said that essential utilities such as the energy grid would be among the most vulnerable targets, while many expected theft to increase from current levels and considered it likely that the economy could be disrupted as a result.
However, the experts had varying opinions on the likely extent of damage and disruption at both the state and national levels, Pew explained. Many believed that cyber attacks between countries had already taken place, citing the spread of the Stuxnet worm as a possible example, and while most believed that cyber attacks could deter the use of weapons of mass destruction, they also anticipate that the so-called cyber arms race will expand as both governments and other organizations work to overcome online security measures.
“Cyber attacks will become a pillar of warfare and terrorism between now and 2025. So much of a country’s infrastructure – commerce, finance, energy, education, health care – will be online, and gaining control of or disrupting a country’s online systems will become a critical goal in future conflicts,” said Joe Kochan, chief operating officer for US Ignite, a company currently working on gigabit-ready applications.
“Current threats include economic transactions, power grid, and air traffic control,” added Mark Nall, a program manager for NASA. “This will expand to include others such as self-driving cars, unmanned aerial vehicles, and building infrastructure. In addition to current methods for thwarting opponents, growing use of strong artificial intelligence to monitor and diagnose itself, and other systems will help as well.”
Among the 39 percent that said they did not expect major cyber attacks to occur by 2025, there were three general themes: upgraded security infrastructure would overcome the Internet’s vulnerabilities and help prevent the worst possible attacks; the threat of retaliation would keep cyber attackers at bay; and the notion that the threat of such attacks is being exaggerated by those who would most benefit from creating an atmosphere of panic.
“Nations and others who hold necessarily secure information are getting better and better about protecting their essential assets,” said University of North Carolina professor Paul Jones. “Yes, a bunch of credit card numbers and some personal information will leak. Yes, you may not be able to place an order for a few hours. But it’s less and less likely that say all pacemakers in a major city will stop at once or that cyber attacks will cause travel fatalities.”
“Cyber attacks will always be a threat, but it is unlikely that a future cyber attack causing widespread harm will occur, any more than today,” noted business professional Todd Cotts. “The challenge will be in whether or not the government is capable of staying ahead of the cyber terrorists. As long as the government leans on a competitive marketplace of non-government companies specializing in technological advances in cyber security, the advances should keep the United States at par, at minimum, with advances by cyber terrorists.”
Also on Wednesday, researchers from the Georgia Tech Information Security Center (GTISC) released their 2015 Emerging Cyber Threats Report. In the report, the institution cautions about the loss of privacy, the abuse of trust between users and their machines, attacks against the mobile ecosystem, and the growing involvement of cyberspace in conflicts between nations and states.
“We must continue to invest in research and develop technology, processes and policies that help society deal with these developments,” GTISC Director Wenke Lee said in a statement. “Researchers from academia, the private sector, and government must continue to work together and share information on emerging threats, make improvements to policy, and educate users.”

MIT Researchers Are Now Able To Watch Robots Think

Eric Hopton for redOrbit.com – Your Universe Online

We may not have to wait as long as we thought to see those much-vaunted delivery drones Googling their way down our streets – all because we can now see them thinking. The deployment of autonomous vehicles, flying cars, and even fire-fighting drones in real-life situations may happen much sooner than previously expected thanks to breakthrough work by researchers at MIT. The scientists have developed a way to view and understand just what happens as a robot tries to make decisions.

Understandably, official bodies like the Federal Aviation Administration (FAA) have restricted real-world testing of robots such as autonomous vehicles, quadrotors and drones. A bad decision, even by a robot, can end in a bad accident. After struggling to explain to visitors and observers at MIT how multiple robots interact and make decisions, the researchers realized they had to take an entirely new approach if they were ever to convince the FAA and others that the robots were safe.

If they couldn’t take the robots outside to test, why not bring the world inside, thought the MIT team. In the dimly lit, hangar-like Building 41 at MIT, Roomba-like robots are being put through their paces to test the new system known as MVR, or “measurable virtual reality.” MVR is a spin on conventional virtual reality that is designed to visualize a robot’s “perceptions and understanding of the world,” and it is the brainchild of Ali-akbar Agha-mohammadi, a postdoc in MIT’s Aerospace Controls Lab, and Shayegan Omidshafiei, a graduate student. The MIT pair and their colleagues, including Jonathan How, professor of aeronautics and astronautics, will present details of the visualization system at the American Institute of Aeronautics and Astronautics’ SciTech conference in January. MIT’s work is supported by Boeing.

In one Building 41 simulation, a robot demonstrates how MVR works. Its task is to get to the other side of the room, but to do that it has to avoid an obstacle in the shape of a human pedestrian moving around in its path. Thanks to MVR, the robot’s decision making process can be visualized as its “thoughts” are projected on the ground. As the pedestrian moves, it is tracked by a large pink dot on the ground. The dot represents the robot’s perception of the pedestrian’s spatial position. Meanwhile, several different colored lines radiate across the room, each signifying one possible route for the robot. A green line represents the robot’s idea of what it sees as the optimal route, avoiding collision with the pedestrian.
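The route choice being projected can be thought of as scoring candidate paths against the robot's estimate of the pedestrian's position. The toy sketch below is an illustration of that idea only, not MIT's MVR code: it trades off path length against clearance from the tracked obstacle.

```python
import math

# Toy sketch of the decision the MVR demo visualizes: score a few
# candidate routes through a sideways waypoint, trading off extra path
# length against clearance from the tracked pedestrian (the "pink
# dot"). Illustrative only; not MIT's MVR code.

START, GOAL = (0.0, 0.0), (10.0, 0.0)
PEDESTRIAN = (5.0, 1.0)   # robot's current estimate of the pedestrian

def score(offset_y):
    """Lower is better: path length plus a penalty for passing
    close to the pedestrian."""
    waypoint = (5.0, offset_y)
    length = math.dist(START, waypoint) + math.dist(waypoint, GOAL)
    clearance = math.dist(waypoint, PEDESTRIAN)
    return length + 4.0 / max(clearance, 0.1)   # weight is arbitrary

candidates = [-2.0, 0.0, 2.0, 4.0]       # sideways offsets to try
best = min(candidates, key=score)        # the "green line" in the demo
print(f"chosen offset: {best}, score: {score(best):.2f}")
```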

This new visualization system combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot’s intentions in real time. We really can see the robot thinking – in color too!

According to Agha-mohammadi, seeing a robot’s decision making process will help fix faulty algorithms much faster. “For example,” he says, “if we fly a quadrotor, and see something go wrong in its mind, we can terminate the code before it hits the wall, or breaks.”

His colleague Shayegan Omidshafiei adds, “Traditionally, physical and simulation systems were disjointed. You would have to go to the lowest level of your code, break it down, and try to figure out where the issues were coming from. Now we have the capability to show low-level information in a physical manner, so you don’t have to go deep into your code, or restructure your vision of how your algorithm works. You could see applications where you might cut down a whole month of work into a few days.”

There are many potential applications for this technology, and in one study the scientists have been testing drones they hope can be used in fighting forest fires. In order to work in real life, the drones will need to observe and understand a fire’s effect on a range of different vegetation. They will then need to pick out the areas of fire most likely to spread and which to put out first.

For this test, the researchers projected landscapes to simulate an outdoor environment on the floor of the hangar. They then flew “physical quadrotors over projections of forests, shown from an aerial perspective to simulate a drone’s view, as if it were flying over treetops.” Images of fire were projected on various parts of the landscape and the quadrotors were instructed to take images of the terrain which may eventually be used to “teach” the robots how to recognize signs of particularly dangerous fires.

The scientists now plan to use many more simulated environments, including using the MVR system to test drone performance in package-delivery scenarios by simulating urban environments and creating street-view projections of cities.

This kind of faster prototyping and more realistic testing in simulated environments should speed up development and regulatory approval.

Related Reading:

> The History Of Robotics – Robotics Reference Library
> MIT-Developed Submersible Robot Could Help Foil Smugglers
> MIT’s Cheetah “Bound For Robotic Glory”


Probe Into Tuesday’s Antares Explosion Begins As Questions Arise About The Rocket’s Engines

Chuck Bednar for redOrbit.com – Your Universe Online
As day one of the investigation into the failure and explosion of an Orbital Sciences Corp. Antares rocket during an attempted resupply mission wrapped up Wednesday evening, reports surfaced that the company already had plans to retire the decades-old engines used to power the launch vehicle.
In a statement, NASA said that members of the Wallops Incident Response Team had completed an initial assessment of the Virginia-based launch site, but noted that it would take several more weeks to fully analyze and understand the effects of the incident. They did report that a sounding rocket launcher adjacent to the Pad 0A of the Mid-Atlantic Regional Spaceport and buildings in the immediate area suffered the most severe damage.
The US space agency also said that a number of support buildings had suffered broken windows and imploded doors following the explosion of the Antares rocket and its Cygnus cargo craft shortly after 6:22pm EDT on Tuesday. Furthermore, the initial assessment revealed damage to the transporter erector launcher and lightning suppression rods, as well as debris around the pad. Environmental assessments were also being conducted at the site.
In a separate report, Orbital officials said that the preliminary assessment indicated “the major elements of the launch complex infrastructure, such as the pad and fuel tanks, avoided serious damage. However, until the facility is inspected in greater detail in the coming days, the full extent of necessary repairs or how long they will take to accomplish will not be known.”
“I want to praise the launch team, range safety, all of our emergency responders and those who provided mutual aid and support on a highly-professional response that ensured the safety of our most important resource – our people,” said Wallops director Bill Wrobel. “In the coming days and weeks ahead, we’ll continue to assess the damage on the island and begin the process of moving forward to restore our space launch capabilities.”
According to Charisse Jones of USA Today, Orbital CEO David Thompson told investors during a conference call that the probe into the incident, which also resulted in the loss of 1,600 pounds of research (including many experiments designed by students) and the Planetary Resources ARKYD 3 spacecraft, “may or may not” reach the conclusion that its two Soviet-era AJ-26 engines were the cause of the explosion.
However, he also said that the company had already been planning to replace them. The Antares rocket was powered by a pair of AJ-26 main engines, which were originally built in the Soviet Union in the 1970s and were later refurbished in the US by Aerojet Rocketdyne, Jones said. She added that the engines have had issues during tests in the past, with one catching fire three years ago and another being lost completely on a test stand earlier this year.
“Under the original plan, we were as of now about two years away from conducting the first launch of an Antares with a second-generation propulsion system. We are currently looking at the prospects for accelerating the introduction of that system,” Thompson reportedly said during the conference call. “I would anticipate that there will be some delay in the next scheduled Antares launch,” he added, saying that the delay could be as little as three months, or that it could be “considerably longer than that depending on what we find in the review.”
Frank Culbertson, the Orbital Sciences executive in charge of the NASA program, defended the engines during a Tuesday night press conference, according to Bloomberg’s Justin Bachman. He said that the AJ-26s had been “refurbished and Americanized,” calling them “robust and rugged” and noting that they had a successful track record. However, Bachman noted that Elon Musk, founder of Orbital’s rival SpaceX, had previously mocked the use of the engines.
“One of our competitors, Orbital Sciences, has a contract to resupply the International Space Station, and their rocket honestly sounds like the punch line to a joke,” Musk told Wired two years ago, according to Bloomberg. “It uses Russian rocket engines that were made in the ’60s. I don’t mean their design is from the ’60s – I mean they start with engines that were literally made in the ’60s and, like, packed away in Siberia somewhere.”
Thompson said that the Antares failure will not impact the company’s 2014 financial results. Orbital, which has a $1.9 billion contract to complete eight ISS resupply missions for NASA, will not suffer a “major financial hit” from the incident, as most of the revenue from its contract had already been paid and insurance should cover any difference, according to Bachman. However, its reputation among other clients may be another story, he added.
The incident could also have a broad impact on the space exploration industry, as Los Angeles Times reporter Melody Petersen said that analysts believe it could lead critics to question NASA’s decision to hire private-sector companies to ferry astronauts to the International Space Station (ISS) starting in 2017. However, Eric Stallmer, the president of the Commercial Spaceflight Federation, said that while he was certain that “questions will be raised,” he doubted that the explosion would in any way alter NASA’s plans of having Boeing and SpaceX develop vehicles for use on future manned ISS missions.

Google Is Working On A Pill That Could Detect Early Signs Of Cancer Inside The Body

John Hopton for redOrbit.com – Your Universe Online

Google is in the initial phase of a project that may help us to detect cancer earlier, and we could depend on wearable technology to flag up problems rather than taking regular trips to the doctor.

The system involves swallowing a pill containing tiny magnetic particles that move around the bloodstream searching for and attaching themselves to abnormal cells. The findings would then be relayed to a sensor on a wearable device. Data from these nanoparticles, which are small enough that as many as 2,000 could fit inside a single red blood cell, could also be uploaded and sent to doctors via the cloud, giving a comprehensive picture of a patient’s health. The information available to doctors would go beyond the scope of blood tests.

The project from the secretive yet much-discussed Google X lab is part of Google’s broader efforts in healthcare, in which it claims to be focusing on the early prevention side of medicine rather than cures. The latest development follows work on contact lenses that can monitor glucose in tears for diabetics, and the acquisition in September of Lift Labs, which makes ‘shake-canceling’ eating utensils for people with Parkinson’s disease.

As well as detecting cancers during their early stages, including some, such as pancreatic cancer, that are currently often detected too late to cure, the nanotechnology could also be useful in the early detection of other life-threatening conditions. Potential heart attacks and strokes could be prevented by identifying fatty plaques about to break free from the lining of blood vessels, which can stop blood flow, while porous nanoparticles that change color as potassium passes through could warn of the high potassium levels associated with kidney disease.

Google’s head of life sciences Dr. Andrew Conrad, a molecular biologist who developed a cheap and now widely employed HIV test, told BBC News reporters Leo Kelion and James Gallagher that “Nanoparticles… give you the ability to explore the body at a molecular and cellular level,” explaining that “What we are trying to do is change medicine from reactive and transactional to proactive and preventative.”

TechCrunch’s Sarah Buhr also quoted Dr. Conrad from a speech at the Wall Street Journal Digital conference in which he compared the nanoparticle pill to Google’s self-driving car project: “Think of it as sort of like a mini self-driving car. We can make it park where we want it to.” He made the analogy that current healthcare systems are guilty of trying to change the oil only after the car has broken down.

Concerns about Google’s venture relate to the security of the huge amounts of data that would be collected, along with the possibility that being able to constantly monitor ourselves would lead to obsessive worry about the readings. The BBC’s health editor James Gallagher suggests that, “Screening the body for disease is littered with dangers, and if it is not done carefully, it could make hypochondriacs out of all of us.”

On the subject of privacy, Dr. Conrad said that a partner rather than Google would be responsible for individual data. “It’d be like saying GE (Healthcare) is in control of your x-ray. We are the creators of the tech and they are the disseminators,” he clarified.


BlackBerry Chief Pens Open Letter To Promote Upcoming Classic Smartphone

Chuck Bednar for redOrbit.com – Your Universe Online
John Chen is going old school in order to promote his company’s upcoming BlackBerry Classic smartphone, sending an open letter to current and former customers on Wednesday in order to generate buzz for the soon-to-be-released throwback device, various media outlets are reporting.
According to Reuters, Chen, the chief executive of BlackBerry, admitted that the company had made some mistakes over the past few years, writing: “It’s tempting in a rapidly changing, rapidly growing mobile market to change for the sake of change – to mimic what’s trendy and match the industry-standard, kitchen-sink approach of trying to be all things to all people.”
“But there’s also something to be said for the classic adage, if it ain’t broke don’t fix it,” he added, according to Entrepreneur.com. With that in mind, he said that the Classic would come with a top row of navigation keys, a trackpad, and a larger and higher-resolution screen – and it will apparently run on the BlackBerry 10 operating system.

Reuters said that the open letter was posted just two days after reality television star Kim Kardashian confessed during the Code/Mobile conference that she loved BlackBerry and has owned several Bold devices. The Classic is supposed to have a larger screen and more extensive app catalog than its predecessor, the news agency noted.
Re/code’s Ina Fried compared BlackBerry’s situation to that faced by Coca-Cola several years ago, when the company changed its soda’s recipe, only to be forced by public demand to re-release the old formula under the Coca-Cola Classic moniker, eventually doing away quietly with what came to be known as New Coke. She called it a “costly flop” for the beverage company.
While BlackBerry is attempting to capture some of that Coca-Cola Classic magic with its new Classic smartphone, it faces a tremendous challenge from Apple’s iPhone line and Android devices. The iPhone 6 and 6 Plus sold more than 10 million units during their first weekend after release last month, making it the biggest launch weekend ever for an Apple smartphone, while Android devices currently hold a 73.9 percent share of the European smartphone market, according to TechCrunch.
“Innovation is a word that gets used too often and carelessly. Innovation is not about blowing up what works to make something new – it’s about taking what works and making it better,” Chen wrote in the letter. “You don’t reinvent yourself every day; you take what you learned yesterday and sharpen it today. You drive change – often on your terms, but sometimes not. That you keep going regardless is what distinguishes you as a grown-up.”
“We are committed to earning your business – or earning it back, if that’s the case,” he added, promising that more details about the Classic would be released within the next few weeks, according to Reuters. People who are interested in learning more about the forthcoming device can pop on over to BlackBerry’s website and register to receive e-mail updates.
—–
FOR THE KINDLE: The History of Mobile Phones – redOrbit Press
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

TIGHAR Investigators Identify Fragment Of Amelia Earhart’s Missing Plane

Chuck Bednar for redOrbit.com – Your Universe Online
Researchers believe that they have identified a piece of the airplane flown by Amelia Earhart when she vanished over the Pacific Ocean in 1937 during a failed attempt to fly around the world, various media outlets are reporting.
According to Rossella Lorenzi of Discovery News, which broke the story Tuesday night, an investigation by the International Group for Historic Aircraft Recovery (TIGHAR) determined “to a high degree of certainty” that a small piece of aluminum sheeting found on the Pacific atoll Nikumaroro in 1991 belonged to Earhart’s craft.
Earhart vanished on July 2, 1937 while flying from New Guinea to Howland Island as part of her attempt to become the first woman to circumnavigate the globe, and TIGHAR has been working for years to solve the mystery of her disappearance. Now, they report that the aluminum debris was likely part of her twin-engine Lockheed Electra.

TIGHAR researchers said that the aluminum sheet was a patch of metal that had been installed on the plane during Earhart’s eight-day stay in Miami – the fourth stop on her attempt to fly around the world, Lorenzi explained. The patch replaced a navigational window before the aviator departed for San Juan, Puerto Rico, as evidenced by a photograph published in the June 1, 1937 edition of the Miami Herald.
“The Miami Patch was an expedient field repair. Its complex fingerprint of dimensions, proportions, materials and rivet patterns was as unique to Earhart’s Electra as a fingerprint is to an individual,” TIGHAR executive director Ric Gillespie told Discovery News. “This is the first time an artifact found on Nikumaroro has been shown to have a direct link to Amelia Earhart.”
TIGHAR investigators compared the dimensions of the 19-inch-wide by 23-inch-long metal sheet, also known as Artifact 2-2-V-1, to the structural components of a Lockheed Electra being restored at Wichita Air Services in Newton, Kansas, Lorenzi said. The rivet pattern and other features of the artifact matched those of both the patch and the aircraft being restored, the organization said in a detailed report posted on its website.
Nikumaroro had long been suspected as a place where Earhart and her navigator, Fred Noonan, may have made a forced landing, according to NBC News. In fact, Gillespie said that TIGHAR had previously discovered archival records describing the partial skeleton and campsite of an unknown female on the atoll, and analysis of a 1937 photo taken months after the disappearance suggests it depicts what many believe was the landing gear of her plane.
“The breakthrough would prove that, contrary to what was generally believed, Earhart and her navigator, Fred Noonan, did not crash in the Pacific Ocean, running out of fuel somewhere near their target destination of Howland Island,” Lorenzi said. “Instead, they made a forced landing on Nikumaroro’s smooth, flat coral reef. The two became castaways and eventually died on the atoll, which is some 350 miles southeast of Howland Island.”
Gillespie and his team have made a total of 10 archaeological expeditions to the atoll, having discovered a number of artifacts the Discovery News reporter said “provide strong circumstantial evidence for a castaway presence.” The TIGHAR executive director said that he believed that the aviatrix sent radio distress signals for at least five nights before rising tides and surf would have washed her plane into the ocean.
TIGHAR researchers will return to Nikumaroro in June 2015 to investigate an object resting at a depth of 600 feet at the base of an offshore cliff where they believe Earhart’s Electra was carried into the ocean, noted Lorenzi. They believe that the results of the new Artifact 2-2-V-1 analysis increase the likelihood that this object is the rest of her plane, and plan to use remotely operated vehicle (ROV) technology to search for wreckage during that 24-day expedition.
—–
Join Amazon Student – FREE Two-Day Shipping for College Students
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Orbital Launch Fails Following Explosion – Science Experiments And ARKYD Lost

Chuck Bednar for redOrbit.com – Your Universe Online
UPDATE: Wednesday, October 29, 2014 (9:00 a.m.)
Reports have surfaced that 1,600 pounds of science and research payloads, including experiments designed by students in the US and Canada and the Planetary Resources ARKYD 3 spacecraft, were among the cargo lost during Tuesday evening’s Antares rocket failure and subsequent explosion.
According to Lindsey Bever of The Washington Post, among the 18 student-designed experiments lost were one examining how crystals would change in a zero-gravity environment, and others probing whether or not plants would grow in space and how quickly milk would spoil outside of the Earth’s atmosphere.
“The payload was supposed to be returning early in December and we were going to analyze the data and see if we could draw any conclusions,” Greg Adragna, a science teacher at Houston’s Cristo Rey Jesuit College Preparatory – one of the schools which had a project on board, told the Houston Chronicle. “This was about a year-and-a-half of work. This is not the way we wanted to have the evening end.”
Likewise, the ARKYD 3, a small test vehicle for future asteroid missions that was to be carried into low Earth orbit on board the Orbital Sciences vehicle, was lost in Tuesday’s explosion. It was to have been the first of a series of orbital test flights scheduled for Planetary Resources, according to their colleagues at Deep Space Industries (DSI).
“We stand with our brothers and sisters at Planetary Resources on the loss of their first ARKYD spacecraft in today’s accident at Wallops Island,” Deep Space CEO Daniel Faber said in a statement. “We know how hard they have been working, and the high expectations we all had for their first mission. Even as we design and prepare our own spacecraft, we recognize that a loss for one of us is a loss for all.”
“We are not competitors as much as we are compatriots, working towards the same goal: the opening of space for humanity. Both companies are attempting to prove that our goals are not just possible, but will become profitable,” added DSI Chair Rick Tumlinson. “Space is hard, and accidents like this remind us that any number of things can go wrong on our way to achieving our dreams. We are sure they will continue to move ahead, and look forward to continuing our friendly race to harvest the resources of space.”
—–
ORIGINAL: Wednesday, October 29, 2014 (5:30 a.m.)
An unmanned commercial rocket built by Orbital Sciences, on a mission to carry supplies to the International Space Station (ISS), exploded seconds after lifting off Tuesday evening at NASA’s Wallops Flight Facility launch site in Virginia.
No injuries were reported as a result of the incident, but according to USA Today reporters Doyle Rice and William M. Welch, both the Antares rocket and its Cygnus cargo spacecraft were destroyed. NASA and Orbital Sciences are still in the process of gathering data to determine what caused the vehicle to fail.
“It is far too early to know the details of what happened,” Frank Culbertson, Orbital’s Executive Vice President and General Manager of its Advanced Programs Group, said in a statement following the incident. “As we begin to gather information, our primary concern lies with the ongoing safety and security of those involved in our response and recovery operations.”
“We will conduct a thorough investigation immediately to determine the cause of this failure and what steps can be taken to avoid a repeat of this incident,” he added. “As soon as we understand the cause we will begin the necessary work to return to flight to support our customers and the nation’s space program.”
Orbital Sciences explained that the Antares rocket had suffered a catastrophic failure shortly after lifting off from Mid-Atlantic Regional Spaceport Pad 0A at 6:22 p.m. (EDT). NASA’s emergency operations officials confirmed that there were no injuries or casualties, and said that property damage was limited to the southern end of Wallops Island. The company also said it had formed an investigation team that would be working alongside government agencies in order to determine the cause of the explosion.
“While NASA is disappointed that Orbital Sciences’ third contracted resupply mission to the International Space Station was not successful today, we will continue to move forward toward the next attempt once we fully understand today’s mishap,” William Gerstenmaier, Associate Administrator of NASA’s Human Exploration and Operations Directorate, said in a separate statement from the US space agency.
“Orbital has demonstrated extraordinary capabilities in its first two missions to the station earlier this year, and we know they can replicate that success,” he added. “Launching rockets is an incredibly difficult undertaking, and we learn from each success and each setback. Today’s launch attempt will not deter us from our work to expand our already successful capability to launch cargo from American shores to the International Space Station.”

During a briefing on Tuesday night, NASA and Orbital officials said that none of the cargo lost was “absolutely critical” to the space station, and that the ISS crew was in no danger. There were science projects on board, as well as hardware that Orbital called important and of high value both to the company and to its customers.
Space.com reports that the mission was carrying 5,000 pounds of cargo, and was the third of eight resupply missions under Orbital’s $1.9 billion contract with NASA. Shares of the Dulles, Virginia-based aerospace firm dropped as much as 17 percent to $24.51 following the incident, according to Bloomberg’s Julie Johnsson and Chris Cooper.
NASA confirmed via Twitter earlier this morning that a Russian Progress vehicle carrying approximately 5,700 pounds worth of supplies successfully launched from Kazakhstan at 3:09am ET. The ship, which is carrying 1,940 pounds of propellant, 48 pounds of oxygen, 57 pounds of air, 926 pounds of water and 2,822 pounds of other supplies, is expected to dock with the orbiting laboratory at 9:09am ET.
—–
FOR THE KINDLE – The History of Space Exploration: redOrbit Press
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Tea And Citrus Dietary Flavonoids Could Lower Ovarian Cancer Risk

April Flowers for redOrbit.com – Your Universe Online

According to Chinese legend, tea was discovered by Emperor Shennong around 2,700 BC. Originally, tea was used in religious rituals and for medicinal purposes. For at least the last 300 years, tea has been a drink enjoyed by the masses around the world. The history of citrus fruits is nearly as long; the fruits migrated from the Far East and India to Europe during the Middle Ages. The consumption of both might reduce a woman’s risk of epithelial ovarian cancer, a new study from the University of East Anglia finds.

The results, published in the American Journal of Clinical Nutrition, reveal that women who consume foods containing flavonols and flavanones can significantly decrease their risk of this type of cancer, which is currently the fifth-leading cause of cancer death among women. Flavonols (found in tea, red wine, apples and grapes) and flavanones (found in citrus fruits and juices) are both subclasses of dietary flavonoids.

The research team used data on the dietary habits of 171,940 women between the ages of 25 and 55, collected over a thirty-year span during the Nurses’ Health Study at Harvard University. They found that women who consumed these foods were less likely to develop the disease.

In the UK alone, ovarian cancer affects more than 6,500 women a year. In the US, the number is closer to 20,000. According to the National Ovarian Cancer Coalition, epithelial ovarian cancer develops from the cells that cover the ovaries, not in the ovaries themselves.

Prof Aedin Cassidy, from the Department of Nutrition at UEA’s Norwich Medical School, said: “This is the first large-scale study looking into whether habitual intake of different flavonoids can reduce the risk of epithelial ovarian cancer. We found that women who consume foods high in two sub-groups of powerful substances called flavonoids – flavonols and flavanones – had a significantly lower risk of developing epithelial ovarian cancer.”

“The main sources of these compounds include tea and citrus fruits and juices, which are readily incorporated into the diet, suggesting that simple changes in food intake could have an impact on reducing ovarian cancer risk. In particular, just a couple of cups of black tea every day was associated with a 31 percent reduction in risk.”

Cassidy collaborated with Professor Shelley Tworoger of Brigham and Women’s Hospital and Harvard Medical School. Their study was the first to perform a comprehensive examination of the six major subclasses of flavonoids found in a normal diet, as well as the first to investigate the impact of polymers and anthocyanins on ovarian cancer risk.

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

How To Combat Fibromyalgia Symptoms Like A Pro

Fibromyalgia Overview

While fibromyalgia may appear mysterious to some, this intensely painful condition embeds itself in the framework of one’s life: sleep, work, and emotional state.

This condition inflicts an incessant flurry of painful symptoms on its sufferers, and currently, there is no known cure.

Fibromyalgia is a life-altering condition that requires concerted efforts at management and treatment. The cause of this condition is unknown.

This condition afflicts women more frequently than men, for a few distinguishable reasons. Biological sexes are characterized by biochemical, hormonal and structural brain differences which shape how the disease progresses in each individual.

One proposed theory notes that women possess far more serotonin neurotransmitter receptors than men, which means that the process of serotonin reuptake, or the recycling and reuse of this neurotransmitter, is inefficient.

Some believe that severe deficiency in serotonin is what causes the painful symptoms associated with this condition.

The most prominent symptom of this condition is bodily pain, especially in the muscles, which manifests as sore points or, in some cases, cramps.

This pain can give rise to a host of other problematic symptoms, disturbing one’s sleep cycle and inducing depression, anxiety, stress, memory lapses, and metabolic problems.

Because this condition subjects its sufferers to chronic pain, it has, in the past, been incorrectly classified as a condition loosely related to arthritis. But while arthritis is an inflammatory condition, fibromyalgia pain is not triggered by inflammatory factors.

How to Treat Fibromyalgia Symptoms

Selecting the proper course and combination of treatments is crucial for fibromyalgia sufferers. But although medications, healthy food, acupressure, massage therapy and exercise can alleviate many symptoms of fibromyalgia, they cannot cure the disease.

A cure has yet to emerge that can effectively address this complex disorder. While ongoing studies have targeted the mechanisms residing within the pain pathways of the nervous system, more research is still required.

The effectiveness of a treatment will vary considerably among patients with different profiles. The genetic, gender-related, psychological and physical makeup of an individual determines the general effectiveness of the treatments. The treatments deemed most helpful in curbing these symptoms are antidepressants, exercise, and localized pain injections and treatments.

The following list was compiled in an effort to harness the collective power of Western medicine, Eastern medicine, fitness and health in the context of addressing the symptoms of fibromyalgia:

One of the most frequently underused but effective means of contending with fibromyalgia is cardiovascular (aerobic) exercise. While the relative impact of resistance training on this disease has not been established, aerobic exercise has an undeniable impact on the progression and intensity of this illness. Aerobics functions powerfully for a number of reasons.

For example, aerobic exercise has been known to release endorphins and augment levels of serotonin, a neurotransmitter that is fundamentally deficient in fibromyalgia sufferers. Aerobic exercise also has the potential to improve muscular fluidity and movement. And of course, the increased blood flow during cardiovascular exercise can stimulate the muscles and promote recovery and healing.

Another commonly pursued avenue in the world of fibromyalgia is physical therapy. Physical therapy is designed to facilitate the motion, fluidity, flexibility and recovery of the body’s musculoskeletal system. Many physical therapists who specialize in fibromyalgia can suggest exercises to aid in the reduction of muscular tension and stiffness, and they provide meaningful instruction and guidance in proper muscle movement and stretching as well. Routine visits to a physical therapist can drastically improve one’s symptoms.

Antidepressants are an indispensable source of relief for fibromyalgia sufferers everywhere: they can lower the severity of pain, improve emotional regulation and positive mood states, and normalize the course of one’s sleep cycles.

Anticonvulsants also play a role in fibromyalgia treatment. While they are not a systematic replacement for antidepressant therapies, they can induce a notable decrease in painful symptoms for some sufferers.

Sleeping pills are not advised during the course of treatment, as they do not increase the depth or frequency of one’s sleep. Instead, medications that specifically cater to the proper regulation of sleep cycles should be introduced into one’s lifestyle in a healthy fashion. In some cases, an antidepressant can be utilized for this purpose.

Medical professionals have strongly advised against the use of anti-inflammatory drugs as a treatment method for this disease. As noted previously, fibromyalgia is not an inflammatory condition, and therefore would not benefit from the introduction of an anti-inflammatory medication such as an NSAID.

By contrast, many medications that are designed to relax the muscles do counteract the painful symptoms of this condition: muscle relaxants have been known to ease aches and stiffness.

A localized treatment for muscle pain entails the injection of steroids.

Cognitive behavioral therapy is a psychological method in which one restructures one’s thought processes to reduce stress, curb anxiety and think positively. This has been shown to induce marked improvements in the mental and emotional states of fibromyalgia sufferers.

Both acupuncture and chiropractic treatments have been shown to reduce pain in distinct ways. While acupuncture modifies the brain’s pain pathways, chiropractic methods increase muscle movement and decrease pain.

The consumption of a healthy, balanced diet is essential to managing this condition. Foods that make one susceptible to mood disturbances and depression (e.g., fat and sugar) should be minimized over the course of one’s treatment. Some dietary supplements are highly effective as well.

Many fibromyalgia patients enjoy the intensity and cathartic nature of deep tissue massage, which exerts a tremendous amount of pressure on the muscles. This practice has been known to uproot toxins in the muscles, decrease tension points, and increase healing.

Biofeedback is a process that forges a stronger relationship between body and mind and ameliorates pain.

Women More Likely To Delay Seeking Medical Attention When Experiencing Cardiac Symptoms

April Flowers for redOrbit.com – Your Universe Online

In the US, someone dies from cardiovascular disease every 33 seconds and, according to The Heart Foundation, more than 920,000 Americans will have a heart attack this year alone.

Men and women suffering a heart attack go through very similar stages of pain, but their reactions are very different. According to a study presented at the Canadian Cardiovascular Congress (CCC), women will delay seeking care, putting their health at risk.

“The main danger is that when someone comes to the hospital with a more severe or advanced stage of heart disease, there are simply fewer treatment options available,” said Dr. Catherine Kreatsoulas, a Fulbright Scholar and Heart and Stroke Foundation Research Fellow at the Harvard School of Public Health.

Not enough is known about how people perceive heart symptoms, Kreatsoulas said, nor about the stage at which they decide to seek medical care. The new study, which was conducted in two parts, enrolled patients with suspected coronary artery disease just before they underwent their first coronary angiogram test.

In the first part of the study, Kreatsoulas’ team interviewed cardiac patients regarding their experience with angina. The patients were also questioned about their decision to seek medical care. The second phase of the study included a new set of patients. The research team used their answers to quantify reasons for seeking care by gender.

Angina is a warning signal that you are at risk of heart attack, cardiac arrest or sudden cardiac death. Angina — which can present as pressure, tightness or burning — occurs when the heart doesn’t receive enough blood and oxygen because of a blockage of one or more of the heart’s arteries.

The team developed a theory of the “symptomatic tipping point,” which is defined as that transitional period between experiencing cardiac symptoms and seeking medical care. Six transitional stages, common to both men and women, were identified.

The stages below are in chronological order:

1. uncertainty, during which the patient attributes symptoms to another health issue

2. denial or dismissal

3. asking for assistance or opinion of a friend or family member

4. a feeling of defeat when symptoms are recognized for what they are

5. seeking medical attention

6. acceptance of diagnosis

The researchers found that women stayed in stage 2 – denial – longer than men, waiting “for others to tell them they looked horrible,” said Dr. Kreatsoulas. “Women displayed more of an optimistic bias, feeling that the symptoms would pass and get better on their own.”

The results of the second phase of the study upheld this finding. The researchers found that women were one and a half times more likely than men to wait for symptoms to become more severe and more frequent before seeking medical care.

Kreatsoulas attributes this to women prioritizing differently than men – placing caregiving responsibilities first, for example – as well as to risk aversion. She cites prior studies which found that when women are ill, “they are often more concerned with how long they may be out of commission and not necessarily as concerned about the best treatment options.” Women are also more likely to dismiss symptoms when they feel even a small improvement.

—–

GET FIT WITH FITBIT – Fitbit Flex Wireless Activity + Sleep Wristband, Black

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

Paid Subscriptions, Streaming Music Service May Be Coming Soon To YouTube

Chuck Bednar for redOrbit.com – Your Universe Online
Google’s online video service YouTube is reportedly set to roll out a paid subscription service that would allow viewers to skip advertisements, senior vice president Susan Wojcicki revealed during the Code Mobile technology conference in Half Moon Bay, California on Monday.
Wojcicki, who took over as the head of YouTube in February, said the service would be introduced in the “near term,” but according to PCMag.com’s Stephanie Mlot, the company is not quite certain at this point what form its premium, ad-free subscription service will take.
The new business model would be “a big change from the advertising-only approach” that helped turn YouTube into “into the world’s largest online video website,” said Alistair Barr and Rolfe Winkler of the Wall Street Journal. Offering both a free, ad-based service and a subscription model “would give users more choice and work well in a world where viewers are increasingly watching videos through apps on mobile devices,” they added, paraphrasing Wojcicki’s comments.
At the conference, Wojcicki acknowledged there will be instances in which YouTube users don’t want to sit through the ads to get to the content they want to watch. She noted there are several mobile apps and streaming content providers that allow users to either save money by being exposed to ads, or opt to pay to remove those ads. Wojcicki called that model interesting because it gives users a choice.
According to Reuters, YouTube launched a pilot program back in May 2013 that allowed individual content creators to charge consumers a subscription fee for access to a specific channel of videos. The service described by Wojcicki on Monday would expand that to allow consumers to pay a fee to watch YouTube’s entire content library ad-free.
“We rolled out the ability for an individual channel to do a subscription,” the YouTube head explained, according to the Wall Street Journal. “We’ve also been thinking about other ways that it might make sense for us. If you look at media over time most of them have both ads and subscriptions.”
Also on Monday, Wojcicki discussed YouTube’s anticipated streaming music subscription service, which CNET’s Shara Tibken, Richard Nieva, and Joan E. Solsman said is expected to be similar to Spotify with both ad-supported free models and commercial-free paid subscriptions. The reporters said the service is expected to launch before the end of the year, but Wojcicki would not pin down a release date, stating only that it would be released “soon.”
Tibken, Nieva and Solsman said that Google has reportedly been close to launching it on multiple occasions, but that the service has suffered setbacks related to the departure of several key executives and issues with independent music labels. In discussing the service, Wojcicki would state only that YouTube was “working on it.”
“Google’s search and advertising business is still the most dominant in the industry – it generates $50 billion a year in revenue – but some financial analysts fear the business is slowing… So as Google looks to the future, it’s trying to find other revenue sources to make sure it keeps its lead,” said Tibken, Nieva and Solsman. “In the United States, video ad revenues from YouTube will hit $1.13 billion by the end of 2014, according to eMarketer.”
“Google’s YouTube video site is an Internet juggernaut, notching more than a billion unique visitors every month and streaming about three months’ worth of video to viewers every minute,” they added. “All that watching adds up – it’s the No. 1 source of traffic on mobile Internet networks in North America, according to researcher Sandvine, and second only to Netflix on fixed-access networks, like your home broadband service.”
—–
Shop Amazon.com – Fire TV – Say it. Watch it.
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

FTC Files Lawsuit Over AT&T’s Throttling Of Unlimited Data Customers

Chuck Bednar for redOrbit.com – Your Universe Online
The US Federal Trade Commission (FTC) announced on Tuesday that it was filing a federal court complaint against AT&T Mobility accusing the company of throttling unlimited data plan customers by up to 90 percent.
The FTC’s complaint accuses the second-ranked wireless carrier in the US of misleading consumers by failing to adequately disclose that their data speeds on unlimited plans are reduced once customers reach a certain amount of data use in a single billing cycle, making many common mobile applications extremely difficult to use.
“AT&T promised its customers ‘unlimited’ data, and in many instances, it has failed to deliver on that promise,” FTC Chairwoman Edith Ramirez said in a statement. “The issue here is simple: ‘unlimited’ means unlimited.”
According to USA Today’s Mike Snider, the FTC said that AT&T first began throttling data speeds for unlimited customers in 2011 after those individuals used as little as 2 GB worth of data during a billing period. Overall, the agency said that at least 3.5 million unique customers have been throttled more than 25 million times, he added.
The FTC’s complaint said that such practices contradicted AT&T’s own marketing materials, which emphasized the “unlimited” amount of data available to customers that signed up for these plans. The agency also said that the company continued to fail to inform those customers of the throttling practices even when they renewed their contracts, and those who canceled their contracts over throttling were hit with early termination fees.
Furthermore, the FTC said that customers participating in AT&T focus groups “strongly objected” to the idea of a throttling program, and that company documents indicated that the wireless service provider had received “thousands of complaints” about slow data speeds on unlimited plans, with some calling the program a “bait and switch.”
AT&T has called the allegations “baseless,” telling Diane Bartz of Reuters that it was necessary to reduce data transfer speeds in order to manage network resources. Wayne Watts, AT&T’s general counsel, said that the company had been “completely transparent” about the practice “since the very beginning,” that only about three percent of customers are affected, and that those individuals are notified by text messages in advance.
“It’s baffling as to why the FTC would choose to take this action against a company that, like all major wireless providers, manages its network resources to provide the best possible service to all customers, and does it in a way that is fully transparent and consistent with the law and our contracts,” Watts added, according to Snider.
“AT&T says on its support website that people who have certain plans can experience data slowdowns once they exceed certain limits,” Bartz said. The website said that 3G smartphone customers will experience slowdowns after using 3 GB of data in a month, and 4G customers can use up to 5 GB before experiencing potential issues. Those unhappy with the slower Internet “are told that they can use Wi-Fi or switch to a different AT&T plan,” she added.
The FTC said that it had been working with the US Federal Communications Commission (FCC), which issued warnings to wireless carriers about throttling practices, on the matter. As CNET’s Roger Cheng noted, the FCC recently got into a dispute with Verizon over the carrier’s plans to slow down the connection speeds of some high-use LTE customers. Verizon relented and abandoned plans to do so earlier this month.
—–
Shop Amazon – Contract Cell Phones & Service Plans
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Nearly 1 In 5 Adults Have Persistent Pain: Study

Provided by Eric Sorensen, Washington State University

Americans are in a world of hurt. Nearly one in five US adults is in pain almost every day for spells of three months or longer, according to an analysis by Jae Kennedy, professor of health policy and administration at Washington State University Spokane. The estimated 39 million adults in persistent pain outnumber the residents of California.

Previous studies have estimated that such pain costs hundreds of billions of dollars a year in lost productivity and health care. And that doesn’t take into account pain’s psychic toll.

“A sizeable portion of American adults are dealing with persistent pain and that’s affecting their lives profoundly,” said Kennedy. “Access to good pain management for this population is limited, and there’s a real risk that taking short-term pain medications for a long period of time will lead to dependency or addiction.”

His study, published this month in the Journal of Pain, drew from the first national survey to measure persistent pain, defined as daily or nearly daily pain lasting three months, which is usually ample time for an injury to heal. The National Center for Health Statistics survey questioned 35,000 households.

Determining pain’s economic, social costs

Kennedy was inspired to look at the data after seeing the 2011 Institute of Medicine report, which found nearly half of Americans suffer what it called chronic pain. The report’s chronic pain definition is more inclusive and can include arthritis, joint pain, moderate or severe pain in the past four weeks and any work or housework disability.

“I don’t think that half of the population is dealing with chronic pain in the sense that we would describe chronic pain as a risk factor for deteriorating mental health and substance abuse,” said Kennedy. “So we wanted to come up with a subset of chronic pain that focused on something that we could look at across different chronic conditions rather than saying, ‘OK, if you’ve got arthritis, then you’ve got chronic pain.’”

By focusing on persistent pain, he said, health policy makers and providers can get a clearer sense of pain’s economic and social costs.

“Persistent pain is going to have the biggest impact on people’s daily lives,” he said. “If you’re dealing with pain constantly for a long period of time, that’s going to affect your work life, your family life, your social life. It also puts you at higher risk for things like mental illness and addiction.”

Pain triggers psychological distress

The study presents a rough demography of the problem. Naturally, older adults are more likely to report persistent pain, particularly between the ages of 60 and 69. Women are at a higher risk than men, as are those without high school degrees. Latino and African American adults are less likely to report pain than whites.

Two-thirds of those with persistent pain said it is “constantly present.” Half said it is sometimes “unbearable and excruciating.”

People with persistent pain were also more likely to report daily feelings of anxiety, depression and fatigue. This makes sense, said Kennedy.

“Being in pain is depressing,” he said. “Being in pain all the time is tiring. Being in pain all the time is anxiety-provoking. So it’s plausible that pain is triggering other kinds of more psychological distress.”

Policies, practices that ease pain

He said he would like to see questions about persistent pain asked in future national health surveys to get a more consistent measure of it across different groups of people. And while pain is in some ways inherent in the human condition, he would like to see policies and practices that ease it.

The rate of pain could be lowered, he said, “with responsive health systems that look at the entire person and the range of therapeutic services that they may need. It may be more expensive in the short term but in the long term – if we can get those people back to work, paying taxes, supporting their families, engaged in the community – there will be all kinds of economic as well as social benefits.”

Kennedy’s co-authors are John Roll, principal investigator and WSU Spokane College of Nursing professor; Taylor Schraudner, graduate student in health policy and administration when the paper was researched; Sean Murphy, assistant professor in health policy and administration; and Sterling McPherson, assistant professor in the College of Nursing.

Support came from the Washington Life Sciences Discovery Fund.

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

Parents Perceive Using Agave Nectar, Placebo Are Better Than Doing Nothing For Cough In Kids

Provided by The JAMA Network Journals

Parents perceived both pasteurized agave nectar and placebo to be better than doing nothing at all for treating nighttime cough and the resulting sleep difficulty in infants and toddlers, according to a study published in JAMA Pediatrics.

Cough is a frequent symptom in children and one of the main reasons they visit a health care professional. Little evidence supports the use of over-the-counter medicine for acute cough. An alternative treatment for cough is honey, but children younger than 1 year are precluded from consuming it because of concerns over infant botulism. Agave nectar has properties similar to honey but has not been associated with botulism.

Researchers Ian M. Paul, M.D., M.Sc., of the Penn State College of Medicine, Hershey, Penn., and colleagues compared treatment with agave nectar, placebo or no treatment at all on nighttime cough and the accompanying sleep disturbance in a group of 119 children who were randomized to one of the three treatment groups. The one-night study included children 2 to 47 months old who had nonspecific acute cough for seven days or less.

Study results indicate that agave nectar and placebo both resulted in perceived symptom improvement by parents compared with no treatment, but agave nectar did not outperform placebo in a head-to-head comparison.

“Both physicians and parents want symptomatic relief for children with these common and annoying illnesses. The significant placebo effect found warrants consideration as health care providers and parents determine how best to manage the disruptive symptoms that occur in the setting of upper respiratory tract infections among young children. Placebo could offer some perceived benefit, although at a financial cost, while reducing inappropriate antibiotic prescribing,” the study concludes.

In a related editorial, James A. Taylor, M.D., and Douglas J. Opel, M.D., M.P.H., of the University of Washington, Seattle, write: “Although the study intervention provided no more relief from cough symptoms than placebo, both treatments were statistically superior to no treatment. The investigators contend that these findings are indicative of a placebo effect.”

“Rather, what the investigators are observing in this study is a placebo effect in the parents who assessed outcomes in study children using a cough symptom questionnaire,” they continue.

“As investigators such as Paul and colleagues continue to evaluate pharmacologic treatments, perhaps we should also conduct research designed to identify other components of care (e.g., communication techniques and nonspecific treatments) that improve outcomes after visits to clinicians by children with cold symptoms, even if the improvement is simply caused by a placebo effect, as broadly characterized,” the authors conclude.

Follow redOrbit on Twitter, Facebook and Pinterest.

When Hearing Aid Users Listen To Music, Less Is More, Says CU-Boulder Study

Provided by Laura Snider, University of Colorado at Boulder

The type of sound processing that modern hearing aids provide to make speech more understandable for wearers may also make music enjoyment more difficult, according to a new study by the University of Colorado Boulder.

The findings, published in the journal Ear and Hearing, suggest that less sophisticated hearing aids might actually be more compatible with listening to music, especially recorded music that has itself been processed to change the way it sounds.

“Hearing aids have gotten very advanced at processing sounds to make speech more understandable,” said Naomi Croghan, who led the study as a doctoral student at CU-Boulder and who now works at Cochlear Americas in Centennial, Colorado. “But music is a different animal and hasn’t always been part of the hearing aid design process.”

A frequent complaint among people who use hearing aids is that music can sound distorted, said Croghan, and it’s common for people to remove their hearing aids to listen to music.

Modern hearing aids use processing called “wide dynamic range compression,” which leaves loud sounds untouched but amplifies softer sounds. This kind of processing is useful for helping people with hearing loss follow a conversation, but it can distort music, which often covers a wider range of volumes than speech.
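
To make the idea concrete, here is a minimal sketch of how such a compressor maps input level to output level. It is not drawn from the study or from any particular hearing aid; the threshold, compression ratio and maximum gain figures are illustrative assumptions.

```python
def wdrc_gain_db(level_db, threshold_db=50.0, ratio=3.0, max_gain_db=30.0):
    """Wide dynamic range compression, sketched: quiet sounds receive the
    full gain, and the gain tapers off as input level rises, so loud sounds
    pass through nearly untouched. All parameter values are illustrative."""
    if level_db <= threshold_db:
        return max_gain_db  # soft sounds: maximum amplification
    # Above the threshold, output rises only 1 dB per `ratio` dB of input.
    return max(0.0, max_gain_db - (level_db - threshold_db) * (1.0 - 1.0 / ratio))

for level in (40, 60, 80, 100):
    print(f"{level} dB in -> {level + wdrc_gain_db(level):.1f} dB out")
# A 60 dB input range (40-100) is squeezed into roughly 30 dB of output
# (70.0, 83.3, 90.0, 100.0): useful for speech, distorting for music.
```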

Adding to the distortion is the fact that recorded music commonly undergoes its own processing, known as “compression limiting,” which squeezes louder and softer sounds together into a narrower range, increasing the perceived volume. Too much compression limiting can affect the quality of music even for people with normal hearing, Croghan said, but it compounds the problem for hearing aid users.

“The recorded music is processed through multiple layers by the time the person with hearing loss actually hears it,” Croghan said.

The research team—which also included Professor Kathryn Arehart and Scholar in Residence James Kates, both in CU-Boulder’s Department of Speech, Language and Hearing Sciences—asked 18 experienced hearing aid users to listen to classical and rock music samples that ranged from being unprocessed to highly processed. The participants also used simulated hearing aids set at a variety of processing levels.

Regardless of which music sample the participants listened to, they generally preferred using the hearing aids with the simplest additional processing—essentially devices that just boost the volume. The participants also tended to prefer less processed music to more processed music. However, for listener enjoyment, the level of processing of the music itself wasn’t as important as the type of hearing aid used.

“What’s interesting about this is that more is not necessarily better,” Arehart said. “If I am in a noisy restaurant and I want to hear the people at my table, then more processing may be better in order to suppress the background noise. But when listening to music, more processing may actually do more harm than good.”

Despite general agreement among study participants that less processing in the hearing aid was better for listening to music, individual preferences varied from person to person.

“When it comes to hearing, like a lot of things, the average result does not fit everyone,” Croghan said.

The study was funded by a grant from hearing aid manufacturer GN ReSound.

Related Links:

> Original University of Colorado at Boulder Statement
> Hearing Aids Worn By Only One Fifth Of People With Hearing Problems – University of Manchester
> Parasitic Fly Inspires New Type Of Hearing Aid

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

New Nanodevice To Improve Cancer Treatment Monitoring

Provided by William Raillant-Clark, University of Montreal
In less than a minute, a miniature device developed at the University of Montreal can measure a patient’s blood for methotrexate, a commonly used but potentially toxic cancer drug. Just as accurate and ten times less expensive than equipment currently used in hospitals, this nanoscale device has an optical system that can rapidly gauge the optimal dose of methotrexate a patient needs, while minimizing the drug’s adverse effects. The research was led by Jean-François Masson and Joelle Pelletier of the university’s Department of Chemistry.
Methotrexate has been used for many years to treat certain cancers, among other diseases, because of its ability to block the enzyme dihydrofolate reductase (DHFR). This enzyme is active in the synthesis of DNA precursors and thus promotes the proliferation of cancer cells.
“While effective, methotrexate is also highly toxic and can damage the healthy cells of patients, hence the importance of closely monitoring the drug’s concentration in the serum of treated individuals to adjust the dosage,” Masson explained.
Until now, monitoring has been done in hospitals with a device using fluorescent bioassays to measure light polarization produced by a drug sample. “The operation of the current device is based on a cumbersome, expensive platform that requires experienced personnel because of the many samples that need to be manipulated,” Masson said.
Six years ago, Joelle Pelletier, a specialist of the DHFR enzyme, and Jean-François Masson, an expert in biomedical instrument design, investigated how to simplify the measurement of methotrexate concentration in patients.

Image Above: As precise yet 10 times less expensive than current hospital equipment, this little device contains an optical system that enables it to rapidly identify the dose of methotrexate that a cancer patient requires, minimizing the drug’s undesirable side effects. Credit: University of Montreal
In the course of their research, they developed and manufactured a miniaturized device that works by surface plasmon resonance. Roughly, it measures the concentration of serum (or blood) methotrexate through gold nanoparticles on the surface of a receptacle. In “competing” with methotrexate to block the enzyme, the gold nanoparticles change the color of the light detected by the instrument, and the detected color reflects the exact concentration of the drug in the blood sample.
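In practice, a readout like this comes down to comparing an optical signal against a calibration curve. The sketch below illustrates only that final step, and it is hypothetical: the calibration points, and the idea of reading the signal as a resonance shift in nanometers, are invented for the example rather than taken from the Montreal team’s device.

```python
# Hypothetical calibration: (SPR resonance shift in nm, methotrexate in µM).
# A real instrument would be calibrated against known drug standards.
CALIBRATION = [(0.0, 0.0), (1.5, 0.1), (3.2, 0.5), (5.0, 1.0), (7.8, 5.0)]

def concentration_from_shift(shift_nm):
    """Linearly interpolate a measured optical shift onto the curve."""
    pts = sorted(CALIBRATION)
    if shift_nm <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if shift_nm <= x1:
            return y0 + (y1 - y0) * (shift_nm - x0) / (x1 - x0)
    return pts[-1][1]  # beyond the highest standard: clamp

print(concentration_from_shift(4.1))  # ~0.75 µM on this invented curve
```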
The accuracy of the measurements taken by the new device was compared with that of measurements produced by equipment used at the Maisonneuve-Rosemont Hospital in Montreal. “Testing was conclusive: not only were the measurements as accurate, but our device took less than 60 seconds to produce results, compared to 30 minutes for current devices,” Masson said. Moreover, the comparative tests were performed by laboratory technicians who were not experienced with surface plasmon resonance and did not encounter major difficulties in operating the new equipment or obtaining the same conclusive results as Masson and his research team.
In addition to producing results in real time, the device designed by Masson is small and portable and requires little manipulation of samples. “In the near future, we can foresee the device in doctors’ offices or even at the bedside, where patients would receive individualized and optimal doses while minimizing the risk of complications,” Masson said. Another benefit, and a considerable one: “While traditional equipment requires an investment of around $100,000, the new mobile device would likely cost ten times less, around $10,000.”
About this study:
This research received funding from the Natural Sciences and Engineering Research Council (NSERC) of Canada, the Centre for Self-Assembled Chemical Structures (CSACS), Fonds québécois de recherche – Nature et technologies (FRQ-NT) and Institut Mérieux.
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

The Symbiotic Relationship Between NASA And Stephen Hawking

Chuck Bednar for redOrbit.com – Your Universe Online
He’s a world-renowned cosmologist, a former professor of mathematics, a best-selling author, a fellow of the Royal Society, a member of the US National Academy of Sciences, and one of the latest and most popular new members of Facebook – and now that Stephen Hawking’s life is coming to the silver screen in the form of a biopic, NASA has paused to reflect on what the man widely regarded as one of the greatest thinkers who ever lived has meant to space exploration.
[ Watch the Video: Professor Stephen Hawking On Space Exploration ]
On Monday, NASA released a new PSA featuring Eddie Redmayne, the actor who plays Professor Hawking in the new film “The Theory of Everything.” In the video, Redmayne said, “With NASA’s cutting-edge technology, we’re discovering the inner workings of our universe. Scientists are testing many of Hawking’s theories using NASA’s space telescopes like Hubble and Chandra.” Indeed, officials at the American space agency noted that Hawking’s theories “have unlocked a universe of possibilities that NASA and the world are exploring hand in hand.”
Watch The Theory of Everything trailer…

Hawking’s credentials are well known by most. He is the former Lucasian Professor of Mathematics at the University of Cambridge, the author of the international bestseller A Brief History of Time and several other books, and is currently serving as Director of Research at the Department of Applied Mathematics and Theoretical Physics and Founder of the Centre for Theoretical Cosmology at Cambridge. He holds over a dozen honorary degrees, was awarded the CBE in 1982, and has won the Eddington Medal, the Albert Einstein Award, the Maxwell Medal and Prize, the Presidential Medal of Freedom, the Fundamental Physics Prize and countless other honors.
Often overlooked, however, are his contributions to NASA and the space exploration industry as a whole, both as a scientist and a champion of their work. Consider another video, also posted on NASA’s website Monday, in which the narrator says simply, “when Professor Hawking speaks, NASA listens.” Later in that same video, Hawking said that he fears for the future of the Earth, due largely to the ever-expanding human population and the limited resources found on the planet, and that he believed that our species’ survival depends upon space exploration.
“If our species is to survive the next hundred years, let alone a thousand, it is imperative we voyage out into the blackness of space to colonize new worlds across the cosmos,” he said, adding that the International Space Station (ISS) is “pioneering space exploration.” Without the knowledge gained through research being conducted on the internationally-operated orbiting laboratory, Hawking explained, “travel into deep space is impossible… I think the work on the ISS will take a new generation of human space explorers out into our solar system and beyond.”
Hawking has also confidently predicted that there will be human settlements on the moon within the next 50 years, and said that he was hopeful people would be living on Mars before the end of the century. Those are bold goals, to be certain, and with a recent MIT-led evaluation of the Mars One project – an attempt to create such a colony on the Red Planet – determining that such an endeavor might not be entirely feasible, the goal might seem downright unattainable.
Professor Hawking has also made it clear that he is an atheist and does not believe in a Higher Power, even going so far as to say that scientific fact and religious miracles are incompatible. Based on his comments regarding NASA, the ISS and space exploration in general, however, I think it’s safe to say that he is a man of faith – faith that these agencies and projects will ultimately lead the human race to its salvation amongst the stars. Monday’s gesture shows NASA clearly appreciates his support, but only time will tell if they will be able to reward his faith.
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Catastrophic Population Reduction Won’t Save The Planet, New Study Claims

Chuck Bednar for redOrbit.com – Your Universe Online
Global population levels have reached the point where not even strict fertility restriction or a catastrophic mass mortality event would bring about enough of a change to solve global sustainability issues, according to new research from ecologists at the University of Adelaide’s Environment Institute.
In research published Monday in the Proceedings of the National Academy of Sciences, professors Corey Bradshaw and Barry Brook conducted multi-scenario human population models and determined that population growth was essentially “locked-in.” For more immediate sustainability gains, they write that the focus should be on new policies and technologies that reduce the increasing consumption of natural resources and enhance recycling.
“Global population has risen so fast over the past century that roughly 14 percent of all the human beings that have ever existed are still alive today – that’s a sobering statistic,” Professor Bradshaw, Director of Ecological Modelling in the Environment Institute and School of Earth and Environmental Sciences, said in a statement. “This is considered unsustainable for a range of reasons, not least being able to feed everyone as well as the impact on the climate and environment.”
“We examined various scenarios for global human population change to the year 2100 by adjusting fertility and mortality rates to determine the plausible range of population sizes at the end of this century,” he added. “Even a worldwide one-child policy like China’s, implemented over the coming century, or catastrophic mortality events like global conflict or a disease pandemic, would still likely result in 5-10 billion people by 2100.”
According to Steve Connor of The Independent, there are approximately 7.1 billion people currently living on Earth, and researchers believe that this number could increase to about nine billion by 2050 and 25 billion by 2100. However, Connor noted that those estimates are based on current fertility rates, which are expected to decrease in the decades ahead.
Professor Bradshaw told Connor that the purpose of the study was to examine population numbers through the eyes of an ecologist evaluating the natural impact on animals in order to determine whether factors such as global pandemics or world wars could have a significant impact on population projections. What they found, the professor said, was that the size of the human population was so large that it “has its own momentum. It’s like a speeding car traveling at 150mph. You can slam on the brakes but it still takes time to stop.”
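To make that momentum concrete, here is a minimal cohort-projection sketch in Python. The age structure, fertility and survival numbers are invented for illustration and are not the study’s WHO or Census Bureau inputs; the point is simply that a population with large young cohorts keeps growing for decades even after fertility drops to replacement level.

```python
def project(cohorts, fertility, survival, steps):
    """Project a simple female-only cohort model forward in 5-year steps."""
    pop = list(cohorts)
    totals = [sum(pop)]
    for _ in range(steps):
        births = sum(f * n for f, n in zip(fertility, pop))
        # age every cohort by one step, applying survival rates
        pop = [births] + [s * n for s, n in zip(survival, pop[:-1])]
        totals.append(sum(pop))
    return totals

# Hypothetical youthful population (millions per 5-year age group).
pop0 = [110, 105, 100, 90, 80, 70, 60, 50, 40, 30]
# Replacement-level fertility: about one daughter per woman per lifetime.
fert = [0, 0, 0, 0.35, 0.35, 0.30, 0, 0, 0, 0]
surv = [0.99] * 9

print([round(t) for t in project(pop0, fert, surv, 10)])
# Totals keep climbing for several 5-year steps before leveling off:
# growth is "locked in" by the age structure, not by high fertility.
```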
With such a large percentage of the historic global population currently living and using the planet’s resources, the impact on the environment is larger than ever before, according to BBC News environmental correspondent Matt McGrath. Consumption rates are increasing, and deforestation for agricultural purposes, urbanization, the pressure on species, pollution and climate change are all expected to increase in the years ahead, even as fertility decreases.
The University of Adelaide researchers constructed nine different scenarios of population change through 2100 using World Health Organization (WHO) and US Census Bureau data, McGrath said. They also devised catastrophic scenarios to examine the impact of climate disruption, wars, pandemics and even population control limits, and found that there was no “short-term fix” to curb the population and its impact on the planet, he added.
“We were surprised that a five-year WWIII scenario mimicking the same proportion of people killed in the First and Second World Wars combined, barely registered a blip on the human population trajectory this century,” said Brook, who was Chair of Climate Change at the Environment Institute during the study and is currently Professor of Environmental Sustainability at the University of Tasmania.
“Often when I give public lectures about policies to address global change, someone will claim that we are ignoring the ‘elephant in the room’ of human population size,” he continued. “Yet, as our models show clearly, while there needs to be more policy discussion on this issue, the current inexorable momentum of the global human population precludes any demographic ‘quick fixes’ to our sustainability problems.”
“Our work reveals that effective family planning and reproduction education worldwide have great potential to constrain the size of the human population and alleviate pressure on resource availability over the longer term. Our great-great-great-great grandchildren might ultimately benefit from such planning, but people alive today will not,” the professor concluded.
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Chandra Data Helps Solve The Puzzle Of Why Galaxy Clusters Contain So Few Stars

Chuck Bednar for redOrbit.com – Your Universe Online
New observations of the Perseus and Virgo galaxy clusters made using NASA’s Chandra X-ray Observatory suggest that turbulence may be the reason that hot gas there has been unable to cool, providing a possible answer to a long-standing question as to why these galaxy clusters never seem to form large numbers of stars.
Clusters like Perseus and Virgo are the largest objects in the universe, containing hundreds or thousands of individual galaxies that are held together by gravity and immersed in gas reaching temperatures of several million degrees, according to the US space agency. Except for dark matter, this hot gas is the heftiest component of galaxy clusters and glows brightly in X-ray light detected by Chandra, a space telescope which launched in 1999.
Over time, the gas in the centers of these clusters should cool enough to lead to a tremendous amount of star formation, but astronomers have found that this is not the case in many galaxy clusters. They knew that the gas was being prevented from cooling and forming stars for some reason, but they did not understand exactly why.
Now, however, a team of researchers led by Irina Zhuravleva of Stanford University’s Kavli Institute for Particle Astrophysics and Cosmology in California reports finding evidence that heat is channeled to the gas by turbulent motions, identified through signatures detected in the X-ray images. The findings appear in the latest online edition of the journal Nature.
Previous research showed that supermassive black holes located in the center of large galaxies in the middle of these clusters emit tremendous amounts of energy around them in powerful jets of highly-energetic particles that cause cavities to develop in the hot gas, according to NASA.
The new study shows how energy can be transferred from those cavities to the surrounding gas, generating turbulence that keeps the gas hot for billions of years. The evidence was found in Chandra data obtained from extended observations of Perseus and Virgo. During those observations, the team was able to measure fluctuations in the density of the gas and estimate the amount of turbulence present.
“Any gas motions from the turbulence will eventually decay, releasing their energy to the gas,” explained co-author Eugene Churazov of the Max Planck Institute for Astrophysics in Munich, Germany. “But the gas won’t cool if turbulence is strong enough and generated often enough.”
“Our work gives us an estimate of how much turbulence is generated in these clusters,” added Alexander Schekochihin of the University of Oxford’s Rudolf Peierls Centre for Theoretical Physics in the UK. “From what we’ve determined so far, there’s enough turbulence to balance the cooling of the gas.”
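To give a sense of what “balancing the cooling” means quantitatively, the back-of-the-envelope sketch below compares the rate at which turbulence dissipates energy into the gas (which scales as the gas density times the cube of the turbulent velocity, divided by the size of the largest eddies) against the rate at which the gas radiates energy away. The numbers are generic textbook values for a cool-core cluster, not the measurements reported in the Nature paper.

```python
# Turbulent heating vs. radiative cooling in cluster gas (CGS units).
# All input values below are assumed, order-of-magnitude figures.
M_P = 1.67e-24          # proton mass, g
N_E = 0.05              # electron density, cm^-3 (typical cool core)
V_TURB = 150e5          # turbulent velocity, cm/s (~150 km/s, assumed)
L_INJ = 10 * 3.086e21   # eddy injection scale, cm (~10 kpc, assumed)
LAMBDA = 1e-23          # cooling function at a few keV, erg cm^3/s (rough)

rho = 1.2 * M_P * N_E                 # gas mass density (rough He correction)
heating = rho * V_TURB**3 / L_INJ     # turbulent dissipation, erg/cm^3/s
cooling = N_E * (0.9 * N_E) * LAMBDA  # radiative losses, erg/cm^3/s

print(f"heating ~ {heating:.1e} erg/cm^3/s")
print(f"cooling ~ {cooling:.1e} erg/cm^3/s")
# With these assumed inputs the two rates land within a factor of a few
# of each other, which is the sense in which turbulence can offset cooling.
```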
The team’s findings support the so-called feedback model involving supermassive black holes in the centers of galaxy clusters, NASA said. While mergers between pairs of galaxy clusters also produce turbulence, the authors believe that outbursts from supermassive black holes are the primary source of this phenomenon in most clusters.
Related Links:
> Chandra X-ray Observatory – Observatories Reference Library
> What Is Dark Matter? – Podcast Interview With Dr. Matthew Walker
> Read the original NASA statement here
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

US Regulators Move To List African Lions As A Threatened Species

Chuck Bednar for redOrbit.com – Your Universe Online
In the wake of research suggesting that African lions faced the danger of extinction due to increased conflicts with humans and the loss of both habitat and prey, the US Fish and Wildlife Service (FWS) has proposed listing the iconic big cats as a threatened species.
According to Darryl Fears of the Washington Post, the proposal would make the African Lion the last of the big cats to receive federal protection under the Endangered Species Act, and comes as the International Union for Conservation of Nature (IUCN) reports that their population has decreased by 30 percent over the past two decades.
“The African lion – a symbol of majesty, courage and strength – faces serious threats to its long-term survival,” FWS director Dan Ashe said in a statement Monday. “Listing it as a threatened species will bring the full protections of U.S. law to lion conservation, allowing us to strengthen enforcement and monitoring of imports and international trade.”
The agency explained that, in recent years, human settlements and agricultural activities had expanded into lion habitats and protected areas. Grazing activities have also spread into these areas, it said, putting more livestock in closer proximity to lions. With humans hunting the creatures’ prey base down to unsustainable levels, lions have been forced to kill more livestock, which in turn leads humans to kill them in retaliation and defense of their cattle.
In 1980, there were approximately 75,000 lions in Africa, Fears said, but those numbers have decreased to no more than 33,000, most of them concentrated in 10 areas in eastern and southern Africa. The FWS decision comes after two years of analysis, which led the agency to conclude that while the lions are not at immediate risk of extinction, they are likely to “disappear in the foreseeable future” without proper legal protections.
“It is up to all of us, not just the people of Africa, to ensure that healthy, wild populations continue to roam the savannah for generations to come,” Ashe explained. “By providing incentives through the permitting process to countries and individuals who are actively contributing to lion conservation, the Service will be able to leverage a greater level of conservation than may otherwise be available.”
The Endangered Species Act prohibits the import, export and sale of listed species in interstate or foreign commerce, according to Reuters. Hunting a creature listed as threatened is legal if permitted by the host country, and the FWS said it is also looking to establish a rule that would allow permits for sport-hunted lion trophies to be imported into the US. The agency is seeking comments from the public over the next 90 days.
“The decision to list the big cats as threatened – one level below endangered – would allow the U.S. government to provide some level of training and assistance for on-the-ground conservation efforts and restrict the sale of lion parts or hunting trophies into the country or across state lines,” said Scientific American’s John R. Platt, adding that the FWS is calling the move “an opportunity for awareness about the challenges that wildlife faces worldwide.”
In 2012, several conservation groups, including the International Fund for Animal Welfare, the Humane Society of the United States and Defenders of Wildlife, petitioned the government to list African lions as endangered, Fears said. He added that Monday’s FWS announcement was “praised by both supporters and opponents of the petition,” as it offered protection to the creatures while still allowing for the hunting of the big cats.
Related Links:
> Threat Of Extinction Looms For West Africa’s Lion Population
> Conservationists Reflect On Four Decades Of Endangered Species Act
> U.S. Government Lists African Lions as Threatened Under Endangered Species Act – International Fund for Animal Welfare Statement
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Amazon Taking On Google Chromecast, Roku Streaming Stick With New Fire TV Device

Chuck Bednar for redOrbit.com – Your Universe Online
Amazon on Monday revealed the Fire TV Stick, a new $39 streaming media gadget scheduled to ship on November 19 that is designed to go head-to-head with Google’s $35 Chromecast and Roku’s $35 Streaming Stick.
According to PCMag.com software analyst Jill Duffy, the Fire TV Stick resembles the Chromecast in appearance but works more like Roku’s devices, tapping directly into Netflix, Hulu, Amazon Prime and other apps and streaming content services and delivering them directly to your television.
Like its competitors, the Fire TV Stick is a Wi-Fi enabled dongle that plugs directly into the HDMI port of a TV set and is powered by a separate USB wire, said Wilson Rothman of the Wall Street Journal. In addition, it has a processor and memory that should help it run more smoothly than similar devices, and also comes with a remote, he added.
Specifically, the device has 8GB of flash memory, which “on paper… soundly surpasses Chromecast’s 2GB, while its full gig of RAM doubles both Roku and Google’s options,” said Ars Technica’s Sam Machkovech.
“Meanwhile, the Fire Stick’s listed dual-core processor may very well outpace its rivals’ single-core offerings, but that’s not saying much considering it’s using the same Broadcom 28155 chip that debuted in a Samsung Galaxy S2 Plus refresh that launched early last year,” he added.

The device can be used to project whatever is on the screen of a Fire phone, Fire tablet or Miracast-enabled device onto a television set, and personal photos and videos can be viewed once they are uploaded to an Amazon Cloud Drive account, Rothman said. It will also feature the new Advanced Streaming and Prediction (ASAP) system, which will predict what viewers will watch next and queue up the show or movie so that it streams more quickly.
Despite the many benefits, however, Rothman pointed out that there are “compromises” with the new unit. “Most significantly,” he said, “the Fire TV Stick’s remote is lacking built-in voice search that was so popular on the Fire TV box. The Fire TV Stick will take voice commands through the Android app, and through a coming iOS app, but that’s less convenient.”
The base price of the Fire TV Stick is $39; however, Machkovech noted that Amazon Prime members can obtain one for just $19 if they place an order by Wednesday morning. He added that it “remains to be seen” whether or not this new smaller, streamlined device’s performance “will be enough to render its $99 sibling obsolete.”
That device, originally unveiled by Amazon in April, was approximately the same size as an external hard drive and was touted as more than just a streaming video player, promising that users could access music services such as Pandora and iHeartRadio as well as gaming content including Minecraft, The Walking Dead and more.
However, while the $99 device “advertised itself as a legit gaming device, complete with an Xbox-styled controller,” Machkovech said that there had been little “legitimate gaming content from Amazon” released over the last six months, “and since the TV Stick will support low-powered games by default as well – this Stick may very well stab the original Fire TV’s heart.”
—–
Follow redOrbit on Twitter, Facebook and Pinterest.

Low Dose Naltrexone for Fibromyalgia Treatment

Living with pain every single day of one’s life can be extremely difficult. Unfortunately, chronic pain is unforgiving, and in too many cases it cannot be completely cured either.

It can take many forms and affect almost any part of the body, and it can last for hours, days, months or years. Often, the only thing patients diagnosed with a chronic pain-related illness can do is manage their condition as well as they can.

Fibromyalgia is frequently misunderstood and underestimated. Even today, in the 21st century, when so much has been said about this syndrome, there are still many doctors who refuse to accept that it exists. For the more than 5 million Americans diagnosed with fibromyalgia, however, it does exist, and it is painful, hard to control and confusing all at the same time.

Fibromyalgia: The Things We Know

The truth is that we don’t know much about fibromyalgia. As a general definition, fibromyalgia is a syndrome whose most prominent symptom is widespread pain, but beyond that, things can get incredibly confusing even for the world’s leading researchers.

For a long time, fibromyalgia was not even officially recognized, and people showing its symptoms were diagnosed with other medical conditions (depression, for example). Even today, as mentioned before, some doctors still do not accept that fibromyalgia exists. And even when they do, misdiagnoses are frequent, because fibromyalgia can closely resemble a number of other medical conditions: myofascial pain syndrome, depression, chronic fatigue syndrome, arthritis and even lupus.

The truth is that fibromyalgia produces very diverse symptoms that can easily cause confusion. From headaches to joint pain, patients can experience a wide range of problems, and each patient can show a completely different set of symptoms.

Fibromyalgia patients can experience fatigue, insomnia, restless leg syndrome, irritable bowel syndrome, muscle pain, joint pain, arthritis, stiffness upon waking, tension headaches and migraines, hypersensitivity to odors, lights and other stimuli, depression, anxiety, jaw tenderness, irritable bladder, swelling and other symptoms as well.

We also know that we don’t know the exact cause of fibromyalgia. Research in this field has produced some theories that could explain why some people develop the syndrome. One of them is connected to genetics, and to polymorphic genes that are responsible for how the body perceives pain.

Yet the research based on this theory is inconclusive, because the same genes are linked to other medical conditions too (chronic fatigue syndrome, for example). The fact that fibromyalgia so often runs in families does suggest some genetic connection, but beyond that, many questions remain to be answered.

Furthermore, some scientists support the idea that fibromyalgia is not actual pain but a faulty perception of pain, arising because the neurotransmitters involved in pain signaling do not function normally.

Depression and lack of sleep can also be tightly connected to fibromyalgia, and some scientists have theorized that these are not actually symptoms of the syndrome but its very causes. Lack of sleep can indeed cause hypersensitivity, and it may underlie many fibromyalgia-related symptoms, but other factors likely influence whether or not one develops fibromyalgia.

How Is Fibromyalgia Diagnosed?

Although diagnosing fibromyalgia is difficult, it can be done if your doctor carefully reviews all the symptoms you experience. First and foremost, he or she will make a list of them and see whether they fit the fibromyalgia profile (or at least as much as such a wide range of symptoms can be called a profile).

Next, the doctor will examine the body’s 18 tender points. It was once believed that fibromyalgia could be diagnosed only when at least 11 of these points were sensitive, but this rule is no longer applied, because even fewer tender points can indicate the presence of fibromyalgia.

The doctor will also need to rule out co-morbid conditions and confirm that you are not suffering from something other than fibromyalgia, so he or she may have to run some tests too. Also bear in mind that the FM/a test was created precisely for diagnosing fibromyalgia, but it remains inaccessible to most patients, as it costs around $750 and is not yet covered by the vast majority of health insurance providers.

Naltrexone and Fibromyalgia

Since the cause of the syndrome is not known, no cure has been developed either, so the only treatment patients can receive is symptomatic. A handful of drugs have been FDA-approved to treat fibromyalgia, and they are similar in nature to antidepressants.

Recent studies suggest that low-dose naltrexone may be an effective drug for treating fibromyalgia as well. Usually administered to people recovering from alcohol or narcotic addiction, naltrexone is a drug that works on the brain’s endorphin system. In most cases it improves the patient’s sense of well-being, which is extremely helpful for those dealing with fibromyalgia (and similar conditions such as chronic fatigue syndrome).

Moreover, naltrexone is believed to affect certain immune cells in the central nervous system (the microglial cells, to be more precise). These cells are responsible for the sick, run-down feeling patients experience when fatigued, and naltrexone may help relieve those symptoms too.

Naltrexone’s effectiveness in treating fibromyalgia is still being studied, but patients have shown decreased levels of pain, stress and fatigue, which is an encouraging sign. Do keep in mind, however, that the drug can have adverse effects as well, including sleeping issues (at first), prolonged erections and weight loss.

Age-Related Memory Decline Could Be Reversed By A Dietary Compound Found In Cocoa

Chuck Bednar for redOrbit.com – Your Universe Online
Antioxidant-rich naturally occurring dietary compounds found in cocoa could help reverse age-related memory decline in otherwise healthy older adults, claims research led by Columbia University Medical Center (CUMC) and published Sunday in the journal Nature Neuroscience.
In research supported by Mars Inc., a global food manufacturer known largely for its chocolate and candy products, senior author Dr. Scott A. Small and colleagues from several other US universities recruited 37 healthy volunteers between the ages of 50 and 69. Each study participant randomly received a diet that was either high in dietary cocoa flavanols (900mg per day) or a low-flavanol diet (10mg) for three months.
Brain imaging and memory tests were administered to each subject both before and after the study, the researchers said. The brain imaging measured blood volume in a part of the hippocampal formation known as the dentate gyrus, a measure of metabolism, and the memory test involved a 20-minute session of pattern-recognition exercises designed to evaluate the type of memory controlled by this particular brain structure.
[ Watch the Video: Dietary Flavanols Reverse Age-Related Memory Decline ]
According to Rosa Silverman of The Telegraph, “those who had consumed a high dose performed much faster than those who received a low dose.” The difference, she explained, is believed to be the result of an increase in blood volume in the dentate gyrus directly attributable to the higher intake of cocoa flavanols. Previous research had linked a decline in function in that region of the brain during old age to gradual memory loss.
“When we imaged our research subjects’ brains, we found noticeable improvements in the function of the dentate gyrus in those who consumed the high-cocoa-flavanol drink,” Dr. Adam M. Brickman, lead author of the study and an associate professor of neuropsychology at Columbia University’s Taub Institute for Research on Alzheimer’s Disease and the Aging Brain, said in a statement.
Specifically, Dr. Small told New York Times reporter Pam Belluck that the members of the high-flavanol group performed an average of 25 percent better on the memory test than their low-flavanol counterparts. That means their results were about as good as people who were two to three decades younger, the CUMC researcher noted.
“If a participant had the memory of a typical 60-year-old at the beginning of the study, after three months that person on average had the memory of a typical 30- or 40-year-old,” Dr. Small said. However, he also cautioned that their study was small, and that the findings needed to be replicated in a forthcoming larger study planned by his team.
“This well-designed but small study suggests the antioxidants found in cocoa can improve cognitive performance by improving blood flow to a certain region of the brain,” said Dr. Clare Walton, research manager at the Alzheimer’s Society, according to the Daily Mail. “The brain region is known to be affected in ageing, but as yet we don’t know whether these brain changes are involved in dementia.”
Likewise, Dr. Simon Ridley, head of research at Alzheimer’s Research UK, said that people should not look at the study findings as a sign to stockpile candy bars, because the supplement used in the study was specially formulated from cocoa beans. In fact, the researchers pointed out that most methods of processing cocoa actually remove many of the flavanols found in the cocoa plant itself – they must be specially extracted using a proprietary process.
“The precise formulation used in the CUMC study has also been shown to improve cardiovascular health. Brigham and Women’s Hospital in Boston recently announced an NIH-funded study of 18,000 men and women to see whether flavanols can help prevent heart attacks and strokes,” the university explained. “The researchers point out that the product used in the study is not the same as chocolate, and they caution against an increase in chocolate consumption in an attempt to gain this effect.”
Follow redOrbit on Twitter, Facebook and Pinterest.

Drugstore Chains CVS, Rite Aid Disable Apple Pay As In-Store Payment Method

Chuck Bednar for redOrbit.com – Your Universe Online
Apple’s plans to change the way that people pay for items at brick-and-mortar locations appear to have hit a bump in the road, as two prominent US drugstore chains have reversed policy and are no longer accepting Apple Pay.
The two companies in question, Pennsylvania-based Rite Aid and CVS Pharmacy of Rhode Island, are among the over 200,000 merchants that already have equipment capable of reading the wireless signals which allow customers to use iOS devices to make wireless payments, Bloomberg reporters Tim Higgins and Zeke Faux said on Monday.
Neither company was among those specifically listed as retailers accepting Apple Pay when the system was first announced last month, but both were reportedly accepting it as of last week. Rite Aid stopped accepting payments through the near field communication (NFC) service on Thursday, according to Paul Ausick of 24/7 Wall St., and CVS Health followed suit on Saturday. Neither firm responded to Bloomberg’s request for comment.
Reuters reports that the reason both chains disabled Apple Pay was not immediately known, but an anonymous source told Higgins and Faux that both CVS and Rite Aid are part of a consortium developing a competing payment system. Reuters, the New York Times and other media outlets have reported similar information.
“The issue appears to be a conflict between Apple Pay and a mobile payment system called CurrentC that is being developed by a retailer-owned mobile technology outfit called Merchant Customer Exchange (MCX),” said Ausick. “Unlike Apple Pay, CurrentC does not use an NFC chip, but instead generates a QR code that is displayed on the merchant’s checkout terminal. Customers who have already linked their bank accounts to the CurrentC system scan the QR code from the terminal and the transaction is completed.”
“Clearly Rite Aid and CVS are making a business decision over a customer satisfaction decision,” Patrick Moorhead, president of Moor Insights & Strategy, told Mike Isaac of the New York Times. Moorhead added that the move could upset customers who viewed Apple Pay as an easier-to-use, safer alternative to traditional credit or debit cards.
Apple has declined to comment on the issue, Isaac said, but it is worth noting that CVS and Rite Aid are far from the only merchants resisting Apple’s new digital wallet and wireless payment service. As Ausick noted, when it was first announced in September, both Wal-Mart and Best Buy said that they had no plans to adopt the system. Both companies are partners in MCX, as are Target, Sears Holdings (which includes Kmart) and Darden Restaurants.
MCX has been developing its mobile payment solution since 2011, he added, and the goal is to find a way for merchants to avoid paying the two- to three-percent fees charged by Visa and MasterCard for credit card transactions. Ausick said that CurrentC will likely be used exclusively by these merchants during the short term, since they have a direct investment in the service, but suggested that customer demand could ultimately force their hand.
“If, as most observers expect, customer demand for NFC-based systems like Apple Pay grows rapidly, these retailers are not going to adopt a ‘my way or the highway’ attitude with their customers,” he explained. “They have learned that when it comes to technology, it’s a consumer-driven world and they just live in it. And one other thing retailers have – or should have learned – is not to underestimate the power of Apple in the consumer world.”
“MCX naysayers say that CurrentC is more difficult to use than Apple Pay, which does not require customers to unlock their phones or open an app. And they say it is far more complicated than paying with cash or a credit card,” Isaac added. “MCX has time, however, to alter its strategy before releasing its product. And since MCX merchants are all working together, CurrentC could offer consumers enticing deals and loyalty rewards for shopping at any one of the dozens of participating stores.”
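The practical difference between the two approaches can be boiled down to a toy sketch. Every class and method name below is invented for illustration and is not a real Apple Pay, CurrentC or MCX API; the sketch simply contrasts an NFC token routed through a card network, with its swipe fee, against a QR-code-initiated debit from a linked checking account.

```python
import uuid

class CardNetwork:
    FEE = 0.025  # the two- to three-percent fee merchants pay per transaction

    def charge(self, token, amount):
        # the token stands in for the card number, which never travels in the clear
        return {"customer_pays": amount, "merchant_gets": amount * (1 - self.FEE)}

class BankACH:
    def debit(self, account, amount):
        # a direct pull from a linked checking account bypasses card networks
        return {"customer_pays": amount, "merchant_gets": amount}

def apple_pay_style(amount):
    token = f"nfc-token-{uuid.uuid4()}"   # phone hands the terminal a one-time token
    return CardNetwork().charge(token, amount)

def currentc_style(amount):
    qr_code = f"qr-{uuid.uuid4()}"        # terminal displays a one-time QR code
    scanned = qr_code                     # customer scans it with the store app
    assert scanned == qr_code
    return BankACH().debit("linked-checking", amount)

print(apple_pay_style(100.0))  # merchant nets ~$97.50
print(currentc_style(100.0))   # merchant nets $100.00, hence MCX's interest
```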
Follow redOrbit on Twitter, Facebook and Pinterest.

Researchers Confirm That The Megalodon Died Out Over Two Million Years Ago

Chuck Bednar for redOrbit.com – Your Universe Online
An ancient 45- to 60-foot shark that resembled a larger version of the great white shark became extinct more than two million years ago, and its demise may have allowed whales to grow to their current sizes, research published last week in the journal PLOS ONE claims.
In the study, researchers from the University of Florida and the University of Zurich set out to dismiss claims that Carcharocles megalodon (megalodon), the largest shark that ever existed, was still alive. They reviewed the most recent records of the megalodon from literature and scientific collections, and through a novel use of the Optimal Linear Estimation (OLE) model, determined that the massive predator died out 2.6 million years ago.
In a statement, lead author Catalina Pimiento, a doctoral candidate at the Florida Museum of Natural History on the UF campus, said that she was drawn to the subject because “it is fundamental to know when species became extinct to then begin to understand the causes and consequences of such an event.”
“I also think people who are interested in this animal deserve to know what the scientific evidence shows, especially following Discovery Channel specials that implied megalodon may still be alive,” she added. The research represents the first phase of Pimiento’s ongoing reconstruction of megalodon’s extinction – research that she believes will lead to a better understanding of the consequences that could result if current predators suffer the same fate.
Modern predators, especially large sharks, are currently experiencing significant global population declines due to the ongoing biodiversity crisis, the university explained. Recent estimations indicate that large, shallow-water sharks are at the greatest risk among all types of marine animals, and if they die out, smaller sharks become more abundant and wind up consuming more of the invertebrates typically consumed by humans.
The situation may have been similar 2.6 million years ago. It was around this time, between the Pliocene and Pleistocene epochs, that baleen whales began to grow to their current, gigantic sizes, explained BBC News online science editor Paul Rincon. While there is no conclusive evidence that megalodons fed on baleen whales, the fossils of both species are often found together, and the removal of the predator might have allowed the prey to flourish and grow larger than ever before.
“When we calculated the time of megalodon’s extinction, we noticed that the modern function and gigantic sizes of filter feeder whales became established around that time,” Pimiento explained. “Future research will investigate if megalodon’s extinction played a part in the evolution of these new classes of whales.”
“When we found out when that happened, we noticed it coincided with the pattern mentioned in whales. Now we need to find out if one event – megalodon’s extinction – caused the other – evolution of gigantism in whales,” she told BBC News. “From modern sharks, it is known that larger individuals have a broader range of prey size, including larger prey. That means that the larger prey will be predated mostly by larger sharks.”
Pimiento and her University of Zurich colleague Dr. Christopher Clements applied OLE to 42 of the most recent fossil records of the gargantuan shark. OLE, Rincon explained, is a mathematical technique used to assess the spacing between fossil dates. It allows researchers to come up with a statistical inference of the date at which the species could be considered extinct – in this case, that point was 2.6 million years ago.
“It’s not exact, not least because there is a margin of error on the dates of the last fossils themselves. But it represents a refinement on previous estimates of the extinction date for this fearsome species,” the BBC reporter added. “The cause of Megalodon’s disappearance, however, remains a mystery.”
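For readers curious how a date of extinction can be squeezed out of a fossil record at all, the sketch below implements a simpler classical estimator in the same spirit, the Strauss and Sadler (1989) range extension, rather than the OLE method the authors actually used: the average spacing of the fossils suggests how far beyond the youngest one the true extinction likely lies. The fossil ages are hypothetical, not the study’s 42 megalodon records.

```python
def range_extension_extinction(ages_ma):
    """Strauss & Sadler (1989) endpoint estimate: assuming fossil ages are
    spread uniformly over the species' true range, extend the range past
    the youngest fossil by one average inter-fossil gap."""
    ages = sorted(ages_ma)                  # ascending; ages[0] is youngest
    gap = (ages[-1] - ages[0]) / (len(ages) - 1)
    return ages[0] - gap                    # estimated extinction age, Ma

ages = [2.8, 3.0, 3.3, 3.7, 4.0, 4.4, 4.9, 5.3, 5.8, 6.2]  # hypothetical, Ma
print(f"estimated extinction: {range_extension_extinction(ages):.2f} Ma")
# -> about 2.4 Ma with these made-up dates. OLE refines the same idea by
#    optimally weighting the youngest dates instead of assuming even spacing.
```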
—–
Follow redOrbit on Twitter, Facebook and Pinterest.


Elon Musk Ups The Anti-AI Rhetoric By Comparing The Technology To Demons

Chuck Bednar for redOrbit.com – Your Universe Online
SpaceX and Tesla Motors founder Elon Musk has never been shy about sharing his concerns over the potential hazards of artificial intelligence (AI), and he pulled no punches again on Friday when asked about the topic at the MIT Aeronautics and Astronautics Department’s 2014 Centennial Symposium.
According to SlashGear’s Brittany Hillen, Musk took part in a one-on-one question and answer session at the event, and Matt McFarland of the Washington Post said that he spoke for more than an hour and even took the time to ask one MIT student what his favorite sci-fi books were.
His comments about AI were what made headlines, though. Musk, who previously tweeted that AI could pose a greater threat to humanity than nuclear weapons, said Friday that people “should be very careful about artificial intelligence” and that he believed the technology could be the “biggest existential threat” the citizens of Earth currently face.
“Increasingly scientists think there should be some regulatory oversight maybe at the national and international level, just to make sure that we don’t do something very foolish,” the South African-born inventor and entrepreneur added. “With artificial intelligence we are summoning the demon. In all those stories where there’s the guy with the pentagram and the holy water, it’s like yeah he’s sure he can control the demon. Didn’t work out.”

As Hillen pointed out, Musk is far from the only person who sees potential danger in an AI-dominated future. Earlier this year, the United Nations held a meeting about intelligent “killer robots,” she said, and in May, Stephen Hawking wrote in a column for a UK newspaper that artificial intelligence could be “the worst thing to happen to humanity.”
“The potential benefits are huge; everything that civilization has to offer is a product of human intelligence; we cannot predict what we might achieve when this intelligence is magnified by the tools that AI may provide, but the eradication of war, disease, and poverty would be high on anyone’s list,” Hawking added. “Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks.”
Musk’s original comments on the matter came in August, shortly after he said he had finished reading the book “Superintelligence” by Swedish philosopher and Oxford professor Nick Bostrom. The book, which explores what will happen if and when machines become more intelligent than humans, was released in the US on September 1.
The issue is clearly still weighing heavily on Musk’s mind – so much so that, according to McFarland and Mashable’s Adario Strange, after the next questioner asked him about telecommunications, Musk hesitated briefly before asking the individual to repeat the question. Musk said he hadn’t heard it because he was still mulling over the whole AI issue.
By “invoking the one thing even those with little interest in technology fear the most,” Musk “actually managed to raise the bar in terms of making AI seem scary,” Strange said. “Sure, it’s just a colorful metaphor, but coming from such an accomplished technologist, those comments are pretty startling. Musk is scared. Really scared.”
Related Links:
> Experts Divided On What Impact Robots And AI Will Have On Human Employment
> AI Algorithm Correctly Detects Subtle Changes In The Music Of The Beatles
> Google DeepMind Adds Oxford University Experts To Bolster AI Research
—–
Follow redOrbit on Twitter, Facebook and Pinterest.


Astronomers Capture First-Ever Image Of A Nova’s Exploding Fireball Stage

Chuck Bednar for redOrbit.com – Your Universe Online
Observations of the expanding thermonuclear fireball from a nova that erupted last year have resulted in the first ever images of an exploding star during this stage and revealed how the ejected material’s structure evolves as the gas cools and expands, researchers from Georgia State University reported on Sunday.
The observations, which were conducted by researchers working at GSU’s Center for High Angular Resolution Astronomy (CHARA), reveal that the expansion is more complex than previous models had predicted. Lead author Gail Schaefer, an astronomer working at Georgia State, and 37 colleagues representing 17 institutions report their findings in the latest edition of the journal Nature.
According to the university, an amateur astronomer by the name of Koichi Itagaki first discovered a “new” star on August 14, 2013. That star, which was named Nova Delphini 2013, was “quickly confirmed as an optical transient by a number of amateur and professional observers, and also confirmed spectroscopically as a nova by amateurs and professionals who observed independently at nearly the same time,” according to the American Association of Variable Star Observers (AAVSO).
“A nova occurs following the buildup of a thin layer of hydrogen on the surface of a white dwarf, a highly evolved star with the diameter of the Earth and the mass of the sun,” the university said in a statement. “The hydrogen is provided by a close companion, which is a normal star in a binary star system, where the two stars orbit about their center of mass.”
When this so-called hydrogen ocean grows to a depth of approximately 650 feet, the white dwarf’s surface gravity produces enough pressure at the bottom of the hydrogen layer to trigger thermonuclear fusion, the researchers explained. The light from the resulting explosion will be far brighter than the star’s normal appearance and could make it so that the object is suddenly visible to the naked eye in a location not previously known to be home to a bright star.
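A rough calculation shows why such a thin layer is enough. The pressure at the base of the layer is approximately the layer’s density times the white dwarf’s surface gravity times its depth, and with a solar mass packed into an Earth-sized sphere, that gravity is enormous. The figures below are generic textbook values, not numbers from the study.

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 2.0e30      # white dwarf mass, kg (~1 solar mass)
R = 6.4e6       # white dwarf radius, m (~Earth's radius)
h = 200.0       # hydrogen layer depth, m (~650 feet)
rho = 1.0e7     # assumed density near the base of the layer, kg/m^3

g = G * M / R**2      # surface gravity
P = rho * g * h       # pressure at the base of the hydrogen layer

print(f"g ~ {g:.1e} m/s^2 (~{g / 9.8:.0e} times Earth's gravity)")
print(f"P ~ {P:.1e} Pa (sea-level pressure on Earth is ~1e5 Pa)")
```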
“Within 15 hours of the discovery of Nova Del 2013 and within 24 hours of the actual explosion, astronomers pointed the telescopes of the CHARA Array toward the nova to image the fireball and measure its size and shape,” GSU said. “The size of Nova Del 2013 was measured on 27 nights over the course of two months. The first measurement represents the earliest size yet obtained for a nova event.”
The CHARA array, which is located on the grounds of Mount Wilson Observatory in California, uses the principles of optical interferometry to combine the light from six telescopes to create images in extremely high resolution, equal to a telescope with a diameter of over 300 meters. As a result, it can be used to observe details of objects the size of an American nickel on top of the Eiffel Tower from as far away as Los Angeles, the study authors noted.
By measuring Nova Del 2013’s expansion, they were able to determine that it was located 14,800 light years from the sun, meaning that while the explosion was witnessed here on Earth last August, it actually occurred roughly 15,000 years ago. During the first CHARA observation, the fireball’s physical size was roughly as large as the Earth’s orbit, but measurements conducted 43 days after detonation revealed that it had grown nearly 20-fold.
At that time, the size of the fireball was roughly equal to the orbit of Neptune, and it had a velocity in excess of 600 km per second (over 370 miles per second). Pictures of the fireball were created from the interferometric measurements thanks to the University of Michigan’s Infrared Beam Combiner (MIRC), an instrument which combines all six telescopes of the CHARA Array simultaneously to create images.
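The distance quoted above follows from a technique often called expansion parallax: spectra give the fireball’s growth rate in kilometers, interferometry gives the growth rate in angle, and the ratio of the two is the distance. The sketch below runs numbers taken loosely from the article to show the scale of the angular measurement involved.

```python
import math

KM_PER_LY = 9.461e12

v = 600.0                        # expansion velocity from spectra, km/s
days = 43.0
growth_km = v * days * 86400     # radial growth over 43 days, km

d_km = 14800.0 * KM_PER_LY       # the published distance
growth_rad = growth_km / d_km    # the matching angular growth, radians
growth_mas = math.degrees(growth_rad) * 3.6e6   # in milliarcseconds

print(f"radial growth  ~ {growth_km:.2e} km")
print(f"angular growth ~ {growth_mas:.1f} milliarcseconds")
# Inverting the logic: distance = (growth in km) / (growth in radians),
# so measuring both quantities pins down how far away the nova is.
```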
“The observations reveal the explosion was not precisely spherical and the fireball had a slightly elliptical shape. This provides clues to understanding how material is ejected from the surface of the white dwarf during the explosion,” the university said. The researchers also reported that the outer layers “became more diffuse and transparent” as the fireball expanded, and 30 days later, they found evidence “for a brightening in the cooler, outer layers, potentially caused by the formation of dust grains that emit light at infrared wavelengths.”
“Thousands of novae have been discovered since the first one was recorded in 1670, but it has only become possible in the last decade or so to image the earliest stages of the explosion thanks to the high resolution achieved through interferometry,” GSU concluded. “Studying how the structure of novae changes at the earliest stages brings new insights to theoretical models of novae eruptions.”
Related Links:
> Animation of Fireball expansion
—–
Follow redOrbit on Twitter, Facebook and Pinterest.


Hinode Satellite Captures X-ray Footage Of Solar Eclipse

Provided by Harvard-Smithsonian Center for Astrophysics

The moon passed between the Earth and the sun on Thursday, Oct. 23. While avid stargazers in North America looked up to watch the spectacle, the best vantage point was several hundred miles above the North Pole.

The Hinode spacecraft was in the right place at the right time to catch the solar eclipse. What’s more, because of its vantage point Hinode witnessed a “ring of fire” or annular eclipse.

An annular eclipse occurs when the moon passes directly in front of the sun but doesn’t cover it completely because the moon appears too small. (The apparent size of the moon depends on its distance from Earth or, in this case, the spacecraft.) About one-third of all solar eclipses are annular.
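The geometry behind this is a simple comparison of apparent sizes, as the quick calculation below shows using round textbook figures for the sun and moon.

```python
import math

def angular_diameter_deg(radius_km, distance_km):
    return math.degrees(2 * math.atan(radius_km / distance_km))

sun = angular_diameter_deg(696_000, 149.6e6)          # ~0.53 degrees
moon_apogee = angular_diameter_deg(1_737, 405_500)    # ~0.49 degrees
moon_perigee = angular_diameter_deg(1_737, 363_300)   # ~0.55 degrees

print(f"sun: {sun:.3f} deg, moon at apogee: {moon_apogee:.3f} deg, "
      f"moon at perigee: {moon_perigee:.3f} deg")
# Near apogee the moon looks smaller than the sun, so a ring of the solar
# disk stays visible (annular); near perigee it can cover the sun entirely.
```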

“This is only the second annular eclipse Hinode has witnessed since it launched in 2006,” says astrophysicist Patrick McCauley of the Harvard-Smithsonian Center for Astrophysics.

A movie of the eclipse as observed by Hinode’s X-ray Telescope (XRT) is online. The XRT was developed and built by the Smithsonian Astrophysical Observatory and the Japan Aerospace Exploration Agency. Hinode’s X-ray Telescope is the highest resolution solar X-ray telescope ever flown.

The XRT collects X-rays emitted from the sun’s corona – the hot, tenuous outer layer that extends from the sun’s visible surface into the inner solar system. Gas in the solar corona reaches temperatures of millions of degrees. The energy source that heats the corona is a puzzle. The sun’s surface is only 10,000 degrees Fahrenheit, while the corona is more than 100 times hotter.

“We are very interested in studying solar flares,” adds McCauley. “Flares are most dramatic in X-rays and we’re using the X-ray Telescope to better understand the physical mechanisms that drive flares so that they might someday be forecasted.”

A question-and-answer with McCauley is available on the Smithsonian Science website.

Related Links:

> Original Statement
> Solar Eclipse – Solar System Reference Library
> 20-second time-lapse movie of the eclipse

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

Ebola’s Evolutionary Roots More Ancient Than Previously Thought, Study Finds

Provided by Charlotte Hsu, University at Buffalo

The family of viruses housing Ebola and Marburg is ancient, and the two viruses last shared a common ancestor millions of years ago, scientists say

A new study is helping to rewrite Ebola’s family history.

The research shows that filoviruses — a family to which Ebola and its similarly lethal relative, Marburg, belong — are at least 16-23 million years old.

Filoviruses likely existed in the Miocene Epoch, and at that time, the evolutionary lines leading to Ebola and Marburg had already diverged, the study concludes.

The research was published in the journal PeerJ in September. It adds to scientists’ developing knowledge about known filoviruses, which experts once believed came into being some 10,000 years ago, coinciding with the rise of agriculture. The new study pushes back the family’s age to the time when great apes arose.

“Filoviruses are far more ancient than previously thought,” says lead researcher Derek Taylor, PhD, a University at Buffalo professor of biological sciences. “These things have been interacting with mammals for a long time, several million years.”

According to the PeerJ article, knowing more about Ebola and Marburg’s comparative evolution could “affect design of vaccines and programs that identify emerging pathogens.”

The research does not address the age of the modern-day Ebolavirus. Instead, it shows that Ebola and Marburg are each members of ancient evolutionary lines, and that these two viruses last shared a common ancestor sometime prior to 16-23 million years ago.

Clues in ‘fossil genes’

Taylor and co-author Jeremy Bruenn, PhD, UB professor of biological sciences, research viral “fossil genes” — chunks of genetic material that animals and other organisms acquire from viruses during infection.

In the new study, the authors report finding remnants of filovirus-like genes in various rodents. One fossil gene, called VP35, appeared in the same spot in the genomes of four different rodent species: two hamsters and two voles. This meant the material was likely acquired in or before the Miocene Epoch, prior to when these rodents evolved into distinct species some 16-23 million years ago.

In other words: It appears that the known filovirus family is at least as old as the common ancestor of hamsters and voles.

“These rodents have billions of base pairs in their genomes, so the odds of a viral gene inserting itself at the same position in different species at different times are very small,” Taylor says. “It’s likely that the insertion was present in the common ancestor of these rodents.”
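Taylor’s argument can be made quantitative with a one-line calculation. Using a rough rodent genome size, an illustrative figure rather than one from the paper, the chance of three additional independent insertions all landing at the same position is effectively nil.

```python
genome_sites = 2.5e9     # rough number of possible insertion sites, base pairs
extra_lineages = 3       # the other three rodent species carrying the element

p_same_site = (1.0 / genome_sites) ** extra_lineages
print(f"chance of three independent hits at one site ~ {p_same_site:.1e}")
# ~6e-29, which is why a single insertion in the rodents' common ancestor
# is the only plausible explanation.
```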

The genetic material in the VP35 fossil was more closely related to Ebola than to Marburg, indicating that the lines leading to these viruses had already begun diverging from each other in the Miocene.

The new study builds on Taylor’s previous work with Bruenn and other biologists, which used viral fossil genes to estimate that the entire family of filoviruses was more than 10 million years old. However, those studies used fossil genes only distantly related to Ebola and Marburg, which prevented the researchers from drawing conclusions about the age of these two viral lines.

The current PeerJ publication fills this viral “fossil gap,” enabling the scientists to explore Ebola’s historical relationship with Marburg.

Possible relevance to disease prevention

The first Ebola outbreak in humans occurred in 1976, and scientists still know little about the virus’ history. The same dearth of information applies to Marburg, which was recognized in humans in 1967 and implicated in the death of a Ugandan health worker this month.

Understanding the virus’ ancient past could aid in disease prevention, Taylor says. He notes that if a researcher were trying to create a single vaccine effective against both Ebola and Marburg, it could be helpful to know that their evolutionary lineages diverged so long ago.

Knowing more about filoviruses in general could provide insight into which host species might serve as “reservoirs” that harbor undiscovered pathogens related to Ebola and Marburg, Taylor says.

“When they first started looking for reservoirs for Ebola, they were crashing through the rainforest, looking at everything — mammals, insects, other organisms,” Taylor says. “The more we know about the evolution of filovirus-host interactions, the more we can learn about who the players might be in the system.”

Collaborators: Taylor and Bruenn’s co-authors on the PeerJ study include UB students Matthew Ballinger, Laura Hanzly and Jack Zhan, all in the UB Department of Biological Sciences.

Related Links:

> Original Statement
> The Facts About Ebola: How The Disease Can (And Can’t) Be Spread
> US Response To Ebola Scrutinized As International Situation Worsens

—–

Follow redOrbit on Twitter, Facebook and Pinterest.

Despite Awareness Campaigns, New Report Reveals That Many Children Are Still Being Bullied

Chuck Bednar for redOrbit.com – Your Universe Online
Even though more and more public figures, including NASA astronaut Scott Kelly, are getting involved in anti-bullying awareness campaigns nationwide, it remains one of the most pressing issues facing the youth of America, according to a new report by researchers from Clemson University and Professional Data Analysts Inc.
“Bullying continues to affect a great number of children in all age groups, with the highest prevalence observed in third and fourth grades, where roughly 22 percent of schoolchildren report that they are bullied two or three times or more per month,” report co-author and Clemson psychology professor Sue Limber said in a statement.
Overall, the report found that 15 percent of students reported being bullied, while six percent said that they had bullied others. The percentage of students who report being bullied decreases steadily with each increasing grade level. While 23 percent of third graders said they were being bullied two to three times a month or more, this decreased to 15 percent by seventh grade and eight percent by twelfth grade, Limber and her colleagues wrote.
Conversely, the percentage of students who report bullying others varied very little across grade levels, fluctuating between five percent and six percent from third through twelfth grade. The research demonstrated that bullying affects students of all ethnic groups, genders, grades, socioeconomic statuses and community types, and can have a serious impact on students which could even last into adulthood, they explained.
The study authors used data from the Olweus Bullying Questionnaire, analyzing a representative sample of over 200,000 questionnaires from schools that intended to implement (but had not yet implemented) the Olweus Bullying Prevention Program. The sample used in the preparation of the report included 1,000 girls and 1,000 boys from each grade between third and twelfth, and the results were broken down by grade level and gender.
Surprisingly, Limber said that cyberbullying “was one of the least common forms of bullying experienced,” and the study found that a substantial percentage of the victimized students did not confide in anyone about being bullied. Boys were less likely to confide in others than girls, and while over 90 percent of girls and more than 80 percent of boys said that they felt sorry for students who were being bullied, most did not make the effort to help out.
“Many students also lacked confidence in the administrative and teaching staff to address bullying and, by high school, less than one-third of bullied students had reported bullying to adults at school,” Limber explained. “Although half of students in grades three to five believed that school staff often tried to put a stop to it when a student was being bullied, this percentage dropped to just 36 percent by high school.”
“We hope that this report helps teachers, administrators, parents, policymakers and concerned citizens raise national awareness about bullying and improve school environments so every child can feel safe at school,” she added.
[ Watch the Video: Astronaut Scott Kelly Speaks Out Against Bullying ]
Kelly, who is scheduled to take part in a one-year spaceflight mission in 2015, echoed those sentiments in a special video message he recorded Friday as part of Bullying Prevention Awareness Month. In that video, Kelly encouraged both kids and adults to “be more than just a bystander. Take action and do something to stop bullying.”
Kelly, who grew up in New Jersey and is the father of two daughters, said that he “felt compelled to act after hearing about the various cases of bullying around the country last year. I thought of my own daughters, and I recalled my experiences as a child watching other kids bully others without accountability.”
“Bullying affects not only the child adversely but also stunts our growth as a society. It is everyone’s responsibility to stand up against bullying,” he added. His video message will be part of a larger cross-federal agency prevention effort which includes the White House and several US cabinet departments, NASA added.
Follow redOrbit on Twitter, Facebook and Pinterest.

Century-Old Notebook Belonging To Member Of Scott’s Antarctic Expedition Team Discovered

Chuck Bednar for redOrbit.com – Your Universe Online
A photographer’s notebook that had been left behind by a member of Captain Robert Falcon Scott’s ill-fated expedition to Antarctica more than a century ago has been recovered from the British explorer’s base in Cape Evans, officials at the Antarctic Heritage Trust in New Zealand announced last week.
According to LiveScience News Editor Megan Gannon, the book belonged to a photographer, zoologist and surgeon named George Murray Levick, who was a member of Scott’s team during the Terra Nova expedition of 1910-1913. Levick, Gannon said, is also known for his observations of Cape Adare’s Adélie penguins and their “depraved” sexual behaviors.
“The newly discovered book also shows he kept fastidious notes, scrawled in pencil, about the photographs he took at Cape Adare,” Gannon said. “The book has notes detailing the date, subjects and exposure details from his photographs. In his notes, Levick refers to a self-portrait he took while shaving in a hut… and shots he took of his fellow crewmembers as they set up theodolites (instruments for surveying) and fish traps and sat in kayaks.”

“It’s an exciting find,” Nigel Watson, executive director of the Antarctic Heritage Trust, said in a statement. “The notebook is a missing part of the official expedition record. After spending seven years conserving Scott’s last expedition building and collection, we are delighted to still be finding new artifacts.”
Identified as the ‘Wellcome Photographic Exposure Record and Diary 1910,’ the notebook’s binding had been dissolved by 100 years of ice and water damage, the Antarctic Heritage Trust officials explained. The pages were separated, stabilized and digitized before being sewn back together and the cover rebuilt. The notebook has since been returned to Antarctica, where it takes its place among the approximately 11,000 artifacts at Cape Evans.
Levick, who died in 1956, was not a member of the team that accompanied Scott during his attempt to reach the South Pole, said Ed Mazza of The Huffington Post, and his notebook contains dates, subjects and exposure details – all written before he and his colleagues were forced to spend a harsh Antarctic winter living in an ice cave in 1912.

Scott, meanwhile, was able to lead his expedition team to the South Pole on January 17, 1912, but discovered that their treacherous 10-week journey was all for naught as Norwegian explorer Roald Amundsen had made it there first, Gannon said. On the way back to their base camp, however, Scott and four other members of his crew succumbed to bad weather and dwindling supplies. Scott died on March 29 or 30, 1912, at the age of 43, Mazza added.
Last December, a team from the Antarctic Heritage Trust discovered undeveloped photos taken during one leg of Ernest Shackleton’s Imperial Trans-Antarctic Expedition in an expedition hut used by Scott during his 1912 expedition. The negatives were developed and produced 22 pictures taken during Shackleton’s 1914-1917 Ross Sea Party and found in Scott’s hut, where expedition members were forced to live when their ship blew out to sea.
Follow redOrbit on Twitter, Facebook and Pinterest.

ESA’s Rosetta Comet-Landing Mission Featured In New UK Short Film Ambition

Chuck Bednar for redOrbit.com – Your Universe Online
An innovative short film with special effects worthy of a big-budget blockbuster isn’t the average way a space agency promotes an upcoming mission, but then again, ESA’s Rosetta mission is hardly your ordinary mission.
Rosetta, which on November 12 is scheduled to have its Philae probe land on 67P/Churyumov–Gerasimenko (67P/C-G), is the subject of Tomek Bagiński’s short film Ambition, which stars Aidan Gillen and Aisling Franciosi as a futuristic world-creating mentor-and-apprentice duo that pauses to reflect on the mission’s success.

After the apprentice attempts and seemingly fails at assembling planets, moons and other objects out of rubble, her mentor offers a lesson focusing on comets – “one trillion celestial balls of dust, ice, complex molecules, left over from the birth of our Solar System. Once thought of as messengers of doom and destruction, and yet so enchanting.”
“And we were to catch one: a staggeringly ambitious plan,” he added.
The movie, which was filmed on location in Iceland and first screened on October 24 as part of the British Film Institute’s celebration of Sci-Fi: Days of Fear and Wonder in London, is nearly as ambitious – and in its universe, the successful outcome of the Rosetta mission is presented as a historical fact, and a precursor to bigger and better things.
In mere weeks, ESA hopes that science fiction will become science fact, as Philae attempts to become the first spacecraft ever to make a soft landing on a comet. Earlier this month, the mission successfully completed a comprehensive readiness review, and the primary landing site, located on the smaller of the comet’s lobes, was approved.
“A mission that began as a dream, but that after decades of planning, construction and flight through the Solar System, has arrived at its goal,” the ESA explained in a statement Friday. “Its aim? To unlock the secrets hidden within the icy treasure chest for 4.6 billion years. To study its make-up and its history. To search for clues as to our own origins.”
“From 100 km distance, to 50, 30 and then, defying all expectations, to just 10 km, Rosetta continues to captivate and intrigue with every image and every data packet returned,” the agency added. “It will rewrite the textbooks of cometary science. But there is more, an even greater challenge, another ambitious first: to land on the comet.”
“As a science fiction writer, it’s hard to think of a more stirring theme than the origin and ultimate destiny of life in the Universe,” noted Alastair Reynolds, a science fiction writer from Wales. “With the arrival of Rosetta at 67P/Churyumov–Gerasimenko – an astonishing, audacious technical achievement, literally the stuff of science fiction – we are on the brink of a bold new chapter in our understanding of our place in the Universe.”
ESA Rosetta project scientist Matt Taylor said that the orbiter is currently less than 10 kilometers (6.2 miles) from the comet, with both traveling at more than 60,000 km/h (37,000 mph). Previous reports indicate that the Philae lander will be ejected shortly after 08:30 GMT and that its descent should last about seven hours; roughly thirty minutes after touchdown – the time the radio signal needs to reach Earth – the mission team will know whether the landing attempt succeeded.
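That half-hour wait is essentially just the light travel time between the comet and Earth. A back-of-the-envelope check in Python, assuming a separation of roughly 510 million kilometers – an approximate figure for the Earth-comet distance around the landing date:

    # One-way radio signal delay between 67P/C-G and Earth.
    # The ~510 million km separation is an approximation for
    # November 2014; the true value varies from day to day.
    SPEED_OF_LIGHT_M_S = 299_792_458
    distance_m = 510e9  # ~510 million km, in meters

    delay_minutes = distance_m / SPEED_OF_LIGHT_M_S / 60
    print(f"One-way signal delay: {delay_minutes:.1f} minutes")  # ~28.4

At that distance, no real-time piloting from the ground is possible, which is why Philae’s descent has to run automatically.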
Once Philae touches down on 67P/C-G, it will secure itself using harpoons and ice screws to ensure that it doesn’t get thrown off of the low-gravity comet. It will then begin recharging its solar-powered batteries, and will eventually begin using a suite of 10 instruments for analysis of the comet. It will most likely stop working next March.
Rosetta, on the other hand, is expected to follow the comet through its closest approach to the sun in August 2015 and then back towards the outer reaches of the solar system. The mission’s goal is to study how comets evolve and, in the process, to learn more about the origins of Earth’s water and perhaps even of life itself.
“All of this is new and unique and has never been done before,” Taylor said. “It may sound like science fiction, but it’s a reality for the teams that have dedicated their entire lives to this mission, driven to push the boundaries of our technology for the benefit of science and to seek answers to the biggest questions regarding our Solar System’s origins.”

Synthetic Biology On Ordinary Paper, Results Off The Page

Provided by Kat J. McAlpine, Wyss Institute for Biologically Inspired Engineering

By combining efforts and innovations, Wyss Institute scientists develop synthetic gene controls for programmable diagnostics and biosensors, delivered out of the lab on pocket-sized slips of paper

New achievements in synthetic biology, announced October 23 by researchers at the Wyss Institute for Biologically Inspired Engineering, allow complex cellular recognition reactions to proceed outside of living cells – and they will dare scientists to dream big. One day there could be inexpensive, shippable and accurate test kits that use saliva or a drop of blood to identify a specific disease or infection, a feat that could be accomplished anywhere in the world, within minutes and without laboratory support, just by using a pocket-sized paper diagnostic tool.

That once far-fetched idea seems within closer reach as a result of two new studies describing the advances, published October 23 in Cell and accomplished through extensive cross-team collaboration between two Wyss Institute teams headed by Core Faculty Members James Collins, Ph.D., and Peng Yin, Ph.D.

“In the last fifteen years, there have been exciting advances in synthetic biology,” said Collins, who is also Professor of Biomedical Engineering and Medicine at Boston University, and Co-Director and Co-Founder of the Center of Synthetic Biology. “But until now, researchers have been limited in their progress due to the complexity of biological systems and the challenges faced when trying to re-purpose them. Synthetic biology has been confined to the laboratory, operating within living cells or in liquid-solution test tubes.”

The conventional process can be thought of through an analogy to computer programming. Synthetic gene networks are built to carry out functions, similar to software applications, within a living cell or in a liquid solution, which is considered the “operating system”.

“What we have been able to do is to create an in vitro, sterile, abiotic operating system upon which we can rationally design synthetic, biological mechanisms to carry out specific functions,” said Collins, senior author of the first study, “Paper-Based Synthetic Gene Networks”.

Leveraging an innovation for chemistry-based paper diagnostics previously devised by Wyss Institute Core Faculty Member George Whitesides, Ph.D., the new in vitro operating system is ordinary paper.
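If the computing analogy is helpful, here is a deliberately loose Python sketch of it – the gene network as an “application” and the substrate it runs on as the “operating system.” Every name below is illustrative, not something from the studies:

    # A loose software rendering of the analogy above: a gene network
    # is the "application," and whatever substrate hosts it (living
    # cell, test tube or freeze-dried paper) is the "operating
    # system." Names are purely illustrative.

    class Substrate:                      # the "operating system"
        def __init__(self, name):
            self.name = name

    class GeneNetwork:                    # the "application"
        def __init__(self, function):
            self.function = function

        def run_on(self, substrate):
            return f"Running '{self.function}' on {substrate.name}"

    sensor = GeneNetwork("color-changing biosensor")
    print(sensor.run_on(Substrate("living cell")))
    print(sensor.run_on(Substrate("freeze-dried paper")))
    # Same "program," different operating system.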

“We’ve harnessed the genetic machinery of cells and embedded them in the fiber matrix of paper, which can then be freeze dried for storage and transport — we can now take synthetic biology out of the lab and use it anywhere to better understand our health and the environment,” said lead author and Wyss Staff Scientist Keith Pardee, Ph.D.

Biological programs on paper

Using standard equipment at his lab bench and commercially available, cell-free systems, Pardee designed and built a wide range of paper-based diagnostics and biosensors, using common fluorescent and color-changing proteins to provide a visible indication that the mechanisms were working. Once built, the paper-based tools can be freeze dried for safe room-temperature storage and shipping, maintaining their effectiveness for up to one year. To be activated, the freeze-dried paper simply needs to be rehydrated with water.

[ Watch the Video: Synthetic Gene Controls On Ordinary Slips Of Paper ]

The paper-based platform can also be used in the lab to save a huge amount of time and cost as compared to conventional in vivo methods of validating tools for cell-based research. “Where it would normally take two or three days to validate a tool inside of a living cell, this can be done using a synthetic biology paper-based platform in as little as 90 minutes,” Pardee said.

As proof of concept, Collins and Pardee demonstrated a variety of effective paper-based tools, ranging from small molecule and RNA actuation of genetic switches, to rapid design and construction of complex gene circuits, to programmable paper-based diagnostics that can detect antibiotic-resistant bacteria and even strain-specific Ebola virus.

The Ebola sensor was created by using the paper-based method and utilized a novel gene regulator called a “toehold switch”, a new system for gene expression control with unparalleled programmability and flexibility, reported in the second study in Cell. Although its inventors had designed the toehold switch to regulate genes inside living cells, its function was easily transferred to the convenience of ordinary freeze-dried paper, showcasing the true robustness of both the freeze-dried paper technique and the toehold switch.

The Ebola sensor was conceived by Wyss Institute Postdoctoral Fellow Alex Green, Ph.D., co-inventor of the toehold switch regulator and lead author of its report, after the ongoing West Africa crisis brought the deadly pathogen into the global spotlight. Because the paper-based platform allows easy assembly and fast prototyping, Green was eager to test it as an operating system for the toehold switch, which he had initially developed for programming gene expression in living cells. Green reached out to Pardee, and together they assembled the prototype Ebola sensor in less than a day, then developed an assay that can differentiate between the Sudan and Zaire virus strains within an hour of exposure.

Putting the “synthetic” in “synthetic biology”

The toehold switch works as such an accurate biosensor because it can be programmed to only react with specific, intended targets, producing true “switch” behavior with an unprecedented ability to turn on targeted gene expression. It can be programmed to precisely detect an RNA signature of virtually any kind and then turn on production of a specific protein.
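As a rough mental model of that programmability – and only a mental model, since real toehold switches work through RNA folding and strand displacement rather than simple string matching – the trigger logic can be sketched as a sequence-complementarity check. The sequences below are invented for illustration:

    # Toy model of toehold-switch logic: the switch turns a gene ON
    # only when the sample contains a trigger RNA complementary to
    # the switch's toehold sequence. Sequences are made up; real
    # switches depend on RNA secondary structure, not substring tests.

    COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

    def reverse_complement(rna):
        return "".join(COMPLEMENT[base] for base in reversed(rna))

    def switch_is_on(toehold, sample_rna):
        """True (reporter gene expressed) if the trigger is present."""
        return reverse_complement(toehold) in sample_rna

    toehold = "AUGGCUAAUC"                  # hypothetical toehold
    sample = "GGGAUUAGCCAUCCUAG"            # contains the trigger
    print(switch_is_on(toehold, sample))    # True -> gene switched on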

Reported in the paper “Toehold Switches: De-Novo-Designed Regulators of Gene Expression”, Green developed the toehold switch gene regulator with senior author Yin, who is Associate Professor in the Department of Systems Biology at Harvard Medical School in addition to being a Wyss Core Faculty Member.

[ Watch the Video: Animation Of Synthetic Toehold Switch Gene Regulator ]

“While conventional synthetic biology complicates accuracy and functionality because it relies on re-purposing and re-wiring existing biological parts, the toehold switch is inspired by Nature but is an entirely novel, de-novo-designed gene expression regulator,” said Yin.

“We looked at our progress to rationally design dynamic DNA nanodevices in test tubes and applied that same fundamental principle to solve problems in synthetic biology,” said Yin. The resulting toehold switch, an RNA-based organic nanodevice, is a truly “synthetic” synthetic gene regulator with 40-fold better ability to control gene expression than conventional regulators.

The toehold switch functions so precisely that many different toehold switches can operate simultaneously in the same cell. This allows several toehold switches to be linked together, creating a complex circuit, which could be programmed to carry out multiple–step functions such as first detecting a pathogen and then delivering an appropriate therapy.
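In the same toy style, that chaining can be pictured as ordinary conditionals, with each hypothetical “switch” reduced to a sequence check (again, the names and sequences are invented):

    # Two hypothetical switches chained into a two-step circuit:
    # detect a pathogen first, then pick a response. Real circuits
    # do this with linked toehold switches, not Python conditionals.

    def detect(trigger_rna, sample):
        return trigger_rna in sample

    def circuit(sample):
        if not detect("GAUUAGCCAU", sample):   # step 1: pathogen RNA?
            return "no pathogen detected"
        if detect("CCGGAAUU", sample):         # step 2: strain marker?
            return "pathogen found: express therapy A"
        return "pathogen found: express therapy B"

    print(circuit("GGGAUUAGCCAUCCGGAAUUAG"))   # therapy A branch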

“Instead of re-purposing an existing part that was evolved by Nature, we wanted to change our way of thinking, leverage naturally occurring principles, and build from scratch,” Green said. His Ph.D. in materials science and strong computer programming skills allowed him to approach biology with a fresh perspective and start from the ground up to engineer the toehold switch, rather than merely rewiring existing natural parts.

By combining forces, the two Wyss Institute teams showed that the toehold switch, so effective in living cells for its dynamic control of in vivo gene expression, is also fully capable of functioning in vitro on freeze-dried paper. With its impressive gene regulation functions able to be transported out of the lab for easy delivery of diagnostics and gene therapies, paper-based toehold switches promise a profound impact on human and environmental health.

“Whether used in vivo or in vitro, the ability to rationally design gene regulators opens many doors for increasingly complex synthetic biological circuits,” Green said.

The Wyss effect

Standing on their own, both paper-based synthetic gene networks and toehold switch gene regulators could each have revolutionary impacts on synthetic biology: the former brings synthetic biology out of the traditional confinement of a living cell, the latter provides a rational design framework to enable de-novo design of both the parts and the network of gene regulation. But combining the two technologies together could truly set the stage for powerful, multiplex biological circuits and sensors that can be quickly and inexpensively assembled for transport and use anywhere in the world.

“The level of idea sharing and collaboration that occurred to achieve these results is evidence of the teamwork that is the lifeblood of the Wyss,” said Institute Founding Director Don Ingber, M.D., Ph.D., Professor of Vascular Biology at Harvard Medical School and Boston Children’s Hospital, and Professor of Bioengineering at Harvard School of Engineering and Applied Science. “But we go beyond collaboration, to ensure that these great ideas are translated into useful technologies that can have transformative impact in the real world.”


—–

Colon Hydrotherapy: An Effective Tool for Battling Fibromyalgia

Fibromyalgia is a difficult disease to live with. It affects your entire body and changes your way of life. Its symptoms include muscle pain, fatigue, and exhaustion.

All of those things can make it very difficult to live your life the way you want to. It may prevent you from doing things that you love to do. That can make you feel discouraged or depressed about your chances of living a pain-free life. However, there are options for you.

You should talk to your doctor about which medications you could be taking to help with your symptoms, and try to lead a lifestyle that doesn’t worsen your symptoms.

Besides the right medications, a healthy diet, regular exercise routine, and a well-balanced lifestyle, you may want to look into colon hydrotherapy.

Colon hydrotherapy is a colon cleanse: tubes are used to flush water through your colon. The cleaning process is thought to wash away any toxins, dirt and allergens that have accumulated there.

This is believed to lead to an increase in overall health, as well as giving a boost to your immune system. Once your immune system is functioning more efficiently, it will be better able to fight off any infection or illness that comes your way.

There are many different types of viruses and bacteria out there that can cause infection and illness, and the stronger your immune system is, the better the chances are that you will not get sick.

Keep Illness and Infection at Bay

How Colon Hydrotherapy can Eliminate Fibromyalgia Symptoms

An infection or illness like a cold or the flu can trigger your fibromyalgia symptoms. Even someone without fibromyalgia experiences aches and pains from these illnesses; if you already suffer from fibromyalgia, a cold or the flu will likely leave you in a great deal more pain.

Colon hydrotherapy is thought to reduce your susceptibility to illness; while it does not make you immune, it may improve your chances of not getting sick.

Besides making you better able to defend against illnesses, colon hydrotherapy is thought to reduce inflammation. Your digestive tract plays a huge role in the health and wellness of your body. Everything that you eat and drink is processed by your digestive system. Much of your immune system lives in your digestive tract.

There are a lot of factors that play into your health, and perhaps the most significant one is the health of your digestive system. If your digestive system is healthy, then you will be healthier and subsequently, feel better.

Get Rid of the Toxins

Colon hydrotherapy can flush out the toxins and impurities that are found in your colon. Allergens, toxins, germs and impurities in your colon can cause inflammation, and this inflammation can lead to pain, stiffness and fatigue – all symptoms of fibromyalgia. So, if you cleanse your colon, you can reduce the inflammation, and oftentimes reduce the severity of your fibromyalgia symptoms.

It is thought that people suffering from fibromyalgia have high levels of toxins in their bodies, which collect in the muscles and lead to the symptoms commonly associated with the disease. Muscle stiffness, fatigue and pain can be severely debilitating. Many people find that removing these toxins from the body brings significant relief from their fibromyalgia symptoms, allowing them to live pain free, or at least in less pain than before.

The colon and digestive tract are the primary places where toxins accumulate: nearly everything that enters the body, through food and interaction with the outside world, passes through the digestive system. Sometimes the body takes in too many toxins, and they accumulate in other parts of the body, which can lead to the symptoms of fibromyalgia. By removing as many of these toxins as possible, you decrease the toxin load on the rest of the body, thereby reducing your symptoms.

Give your Gut a Boost

People suffering from fibromyalgia often also suffer from irritable bowel syndrome, a condition directly connected to the health of the digestive tract. Colon hydrotherapy can significantly alleviate this by cleaning out the colon and allowing the body to reset its digestive process. The cleanse acts like a helping hand to a body already working hard to flush out toxins and harmful substances, clearing them out more quickly and helping get your system back on track.

Once your digestive tract is healthy and working properly, this will permeate through the rest of your body. Your blood vessels will be better able to carry nutrients to all the various parts of your body, including your muscles. As your muscles receive more nutrients from your blood stream, you will feel relief from your symptoms.

Fibromyalgia is a difficult disease to deal with. The symptoms can be debilitating and difficult to live with. The constant pain and stiffness can keep you from doing things that you love to do. It can affect your happiness and mood. When you are in pain, you don’t feel good. You don’t feel like socializing or doing the things that you love to do.

Try various holistic forms of therapy. Have a healthy lifestyle. Exercise. Exercising might be painful, but it is good for your muscles, and will loosen them up a little. Just be sure to keep within your limits and not go overboard. In conjunction with a healthy lifestyle, colon hydrotherapy can help your body battle fibromyalgia. By trying these therapies until you find the combination that works for you, you will be able to see some relief from your symptoms and regain your way of life. Do your research into the various things that can alleviate your symptoms, understand the disease as much as you can, and have a working relationship with your doctor.

Move Over, Felix – There’s A New Supersonic Skydiving Record-Holder In Town

Chuck Bednar for redOrbit.com – Your Universe Online
Google senior VP Alan Eustace has shattered Felix Baumgartner’s skydiving record, jumping from over 130,000 feet and topping 800 mph during a four-and-a-half-minute freefall, various media outlets reported on Friday.
The 57-year-old Eustace, who was assisted on the project by officials at the Paragon Space Development Corporation’s Stratospheric Explorer (StratEx) team, “exceeded the speed of sound” during Friday’s historic leap in New Mexico, “setting off a small sonic boom, and set several skydiving records in the process,” according to BBC News.
Eustace was carried to a height of more than 25 miles above the ground by a large balloon filled with 35,000 cubic feet of helium, wrote John Markoff of The New York Times. Eustace, wearing a specially designed spacesuit with a built-in life-support system, jumped from an altitude of 135,890 feet (near the top of the stratosphere) at 9:09 local time, returning to Earth just 15 minutes after beginning his fall.
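The helium figure permits a rough sanity check on the lift involved. A sketch using standard sea-level densities for air and helium – a ballpark only, since it ignores the balloon’s own mass and the thinning atmosphere during ascent:

    # Approximate gross lift of 35,000 cubic feet of helium at
    # launch, from the density difference between air and helium.
    CUBIC_FT_TO_M3 = 0.0283168
    RHO_AIR = 1.225       # kg/m^3 at sea level
    RHO_HELIUM = 0.1786   # kg/m^3 at sea level

    volume_m3 = 35_000 * CUBIC_FT_TO_M3              # ~991 m^3
    lift_kg = (RHO_AIR - RHO_HELIUM) * volume_m3
    print(f"Gross lift at launch: ~{lift_kg:.0f} kg")  # ~1,037 kg

That is comfortably more than one person plus a pressurized suit and life-support module – exactly the margin such a flight requires.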
“It was amazing. It was beautiful. You could see the darkness of space and you could see the layers of atmosphere, which I had never seen before,” he told the New York Times reporter afterwards. “It was a wild, wild ride. I hugged on to the equipment module and tucked my legs and I held my heading.”

Image Above: Alan Eustace is lifted to a record-breaking 135,908 ft via high-altitude balloon, the same technology used by World View (PRNewsFoto/World View Enterprises)
The Google executive’s maximum altitude, initially reported as 135,908 feet, was officially submitted to the World Air Sports Federation as 135,890 feet based on information from two data loggers. The previous record, established during Baumgartner’s leap, was 128,100 feet, according to The New York Times.
Eustace, a veteran pilot and parachutist, broke the world records for vertical speed reached during freefall, with a peak velocity of 822 mph, and for total freefall distance, at 123,414 feet, the BBC said. Jim Hayhurst, director of competition at the United States Parachute Association, was the official observer for the record-setting attempt, according to the Associated Press (AP).
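The 822 mph figure is supersonic only because of where it was reached: the speed of sound falls with temperature, and the stratosphere is bitterly cold. A quick check, assuming a typical stratospheric temperature of about 220 K (the actual value during the jump is not reported here):

    # Speed of sound a = sqrt(gamma * R * T) for dry air. The 220 K
    # temperature is an assumed, typical stratospheric value.
    from math import sqrt

    GAMMA, R_AIR = 1.4, 287.05    # dry-air constants
    MPH_PER_M_S = 2.23694

    a = sqrt(GAMMA * R_AIR * 220.0)   # ~297 m/s (~665 mph)
    v = 822 / MPH_PER_M_S             # peak freefall speed, in m/s

    print(f"Local speed of sound: {a * MPH_PER_M_S:.0f} mph")
    print(f"Mach number: {v / a:.2f}")   # ~1.24

At sea level, 822 mph would barely clear Mach 1; in the cold stratosphere it works out to roughly Mach 1.2.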
“The supersonic skydive happened with little fanfare, out of the media spotlight, unlike the 2012 attempt by daredevil Baumgartner and the Red Bull Stratos team,” the AP pointed out. Baumgartner, who reached altitudes of 128,000 feet before beginning his freefall, had been outfitted with video cameras in order to capture the event for a BBC documentary.
Eustace, on the other hand, “planned his jump in secrecy, working for almost three years with a small group of technologists skilled in spacesuit design, life-support systems, and parachute and balloon technology,” Markoff said. “He carried modest GoPro cameras aloft, connected to his ground-control center by an off-the-shelf radio.” Unlike Baumgartner, who “was widely known for death-defying feats,” Markoff said that Eustace was “an engineer… with a deep commitment to teamwork” and that co-workers call him “a risk-taker with a passion for details.”
The StratEx program, which worked with the Google VP on the freefall attempt, was created to develop a self-contained spacesuit and recovery system that would make it possible for people to explore portions of the stratosphere above 100,000 feet, according to the outfit’s website. Such a system would allow for the development of new ways to allow astronauts to egress, new high-altitude aircraft suits, and more.
“The technology that has gone into developing the balloon, the spacesuit and the other systems that were used in Friday’s launch will be used to advance commercial spaceflight, namely efforts by Arizona-based World View Enterprises to take paying tourists up in a high-altitude balloon and luxury capsule starting in late 2016,” the AP said.

Facebook’s Newest Member Is Smarter Than You, Because His Name Is Stephen Hawking

Chuck Bednar for redOrbit.com – Your Universe Online
Theoretical physicist, cosmologist, author, longtime mathematics professor, honorary Royal Society of Arts fellow, lifetime Pontifical Academy of Sciences member, and Presidential Medal of Freedom recipient Stephen Hawking has added a new item to his impressive and ever-growing list of credentials: Facebook member.
The 72-year-old Hawking joined the social network on October 7, and as of Friday night, Jacob Kastrenakes of The Verge reported that his page was “already approaching 1 million followers after being publicized just six hours ago.” It has since surged past that milestone to more than 1.15 million.
According to the page itself, the professor’s team will handle most of the day-to-day operations of his social media activities – though Hawking himself will contribute to the page on occasion, and those posts will be marked with his initials, ‘-SH’. While Kastrenakes noted that the new Facebook page “in part appears to be a marketing move for the upcoming Hawking biopic The Theory of Everything,” it was still “an exciting moment for science enthusiasts.”
In his first post, Hawking wrote: “I have always wondered what makes the universe exist. Time and space may forever be a mystery, but that has not stopped my pursuit. Our connections to one another have grown infinitely and now that I have the chance, I’m eager to share this journey with you. Be curious, I know I will forever be. Welcome, and thank you for visiting my Facebook Page. –SH”
The page also features nearly two dozen photos, a video of an ALS Ice Bucket Challenge that he made along with his family several months ago, and a message supporting the aforementioned Focus Features biopic, which according to Mashable’s Josh Dickey “became an instant Oscar contender when it debuted at the Toronto International Film Festival” last month.
Dickey noted that the producers of the film, which stars Eddie Redmayne as Hawking and Felicity Jones as his wife Jane, said during a question and answer session following its Toronto premiere that Hawking had a tear in his eye following a private screening of the film. He also reportedly briefly commented that the film was “broadly true.”
Despite the clear tie-in with the forthcoming motion picture, Wired’s Issie Lapowsky, who quipped that Hawking is “Facebook’s smartest new member,” said that the theoretical physicist “doesn’t strike us as the type to join Facebook just to cash in on the buzz surrounding the movie… He generates a substantial amount of buzz on his own.” Lapowsky added that “fans from Pakistan to Canada flooded his page with welcome messages and notes about what an inspiration the 72-year-old scientist is.”
Hawking’s most recent post on Facebook came on Friday, and in it he wrote that he “greatly enjoyed the STARMUS festival. It is a combination of science and rock music, both of which I love.” He noted that he was “interested in the talks by the astronauts and why the Soviet Union didn’t beat Neil Armstrong to the Moon,” but said that he would have to “read the transcript” later because he “didn’t understand the translation.”

Researchers Developing Patch That Can Collect Health-Related Data From Sweat

Chuck Bednar for redOrbit.com – Your Universe Online

Experts from the University of Cincinnati and the US Air Force Research Laboratory at Wright-Patterson Air Force Base in Ohio are joining forces to develop a lightweight, wearable device that can analyze sweat using a smartphone in order to perform a variety of health-related diagnostic tests.

The new gadget uses sensors that are as light and flexible as an adhesive bandage and capable of gathering vital medical information in almost real time. The technology is similar to that used by home pregnancy tests, collecting electrolytes, metabolites, proteins, small molecules, amino acids and other biomarkers that are carried with sweat, the researchers explained.

“With that kind of information, athletes could avoid the killer cramps that could cause them to be carried off the field at the peak of their game or competition,” the university said. “Preemies’ vitals could be monitored without drawing blood – the pain and blood loss causing even more stress on a physically-stressed infant. One day, diabetics could maybe even avoid those painful sticks as well, as they check their glucose levels.”

Jason Heikenfeld, a UC professor of electrical engineering and computing systems and a member of the team working on the device, explained in the latest issue of IEEE Spectrum Magazine that the ultimate goal is to make sweat an alternative to needle sticks for blood tests and to urine samples for diagnostics, and to determine whether perspiration can provide constant updates on how a person’s body is responding to a specific drug or injury.

Image Above: The next generation of sweat sensor pad and flexible Bluetooth circuit. Credit: UC/USAF

“Sweat contains a trove of medical information and can provide it in almost real time,” Heikenfeld said. “Researchers have understood the richness of the information carried in sweat for some 50 years, but they have been unable to take advantage of it because of the difficulty of collecting, transporting, and analyzing the samples.”

“With the many recent advances in sensing, computing, and wearable technology providing inspiration – and with more than a little perspiration in the laboratory – we are on the verge of a true revolution in wearable diagnostics,” he added. “Ultimately, sweat analysis will offer minute-by-minute insight into what is happening in the body, with on-demand sampling in a manner that is convenient and unobtrusive.”

The joint UC/USAF team started work on the device five years ago as part of a search for an easy-to-use, convenient method of monitoring how an airman responds to disease, medication, diet, injury, stress and other physical changes during both training and missions. During their research, they developed patches that stimulate perspiration, then measure and transmit health-related information based on those sweat samples.

He explained that paper in the patch wicks sweat in a tree-root pattern in order to maximize the amount of perspiration collected while minimizing the volume of paper required. It comes with a built-in sodium sensor, voltage meter, communications antenna, as well as microfluidics and a controller chip externally powered by the smartphone.
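That sodium-sensor-and-voltage-meter pairing behaves much like a conventional ion-selective electrode, whose output voltage shifts with the logarithm of ion concentration – the Nernst relation, roughly 59 mV per tenfold concentration change for a singly charged ion at room temperature. A minimal sketch with hypothetical calibration numbers; nothing below comes from the UC/USAF design itself:

    # Converting an ion-selective electrode voltage into a sodium
    # concentration via the Nernst relation. The calibration point
    # and the reading are hypothetical, for illustration only.
    NERNST_SLOPE_V = 0.05916   # volts per decade for z = +1 at 25 C

    def sodium_mM(v_measured, v_cal, c_cal_mM):
        """Concentration from voltage, given one calibration point."""
        decades = (v_measured - v_cal) / NERNST_SLOPE_V
        return c_cal_mM * 10 ** decades

    # Calibrated at 0.100 V in a 10 mM Na+ standard; the patch
    # later reads 0.145 V during exercise.
    print(f"{sodium_mM(0.145, 0.100, 10.0):.0f} mM Na+")   # ~58 mM

Sweat sodium typically falls in the tens of millimolar, so a value in that range is the kind of signal a patch like this would report.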

“Right now our industry partners are preparing to use standard flexible-electronic manufacturing processes to produce several hundred patches for more extensive human trials, which are expected to start before the end of the year,” Heikenfeld wrote. “We’re also adding about a half dozen other sensors that will detect additional ions besides sodium and chloride and use them to predict things like exertion level and muscle injury or damage.”

He noted that the initial results “look promising” and that pilot program testing on college athletes could begin as early as next year. If those trials go well, he said, “it’s not a far stretch to imagine using the patch” in conjunction with the radio frequency identification (RFID) reading mats currently used to record marathoners’ split times, in order to identify runners who might be at risk of a potentially dangerous electrolyte imbalance.

“If all goes well, we could have sweat-sensing patches – at least sensors for athletics – on the market in low volume next year. These do not have to go through a lengthy approval process with the US Food and Drug Administration because they are not meant to be used for diagnosis or treatment of disease,” Heikenfeld said, adding that a second-generation patch that includes Bluetooth communication, data storage and more is “nearly complete.”


High-Altitude Methane Cloud Detected In The Stratosphere Of Saturn’s Moon Titan

Chuck Bednar for redOrbit.com – Your Universe Online
Scientists have unexpectedly identified a high-altitude methane ice cloud floating above the north pole of Saturn’s moon Titan – a cloud that is similar to the exotic clouds found far above Earth’s poles, NASA announced on Friday.
The cloud was first spotted by the Cassini spacecraft in 2006, and while methane clouds were already known to exist in the lowest layer of Titan’s atmosphere, the troposphere, the discovery of such clouds forming this high up is “completely new,” Cassini mission scientist Carrie Anderson said in a statement.
Anderson, who works at the Goddard Space Flight Center and is lead author of a study detailing the findings currently available online in the journal Icarus, explained that scientists at the space agency had never considered the possibility that the moon’s stratosphere could contain methane clouds.
The methane cloud was part of the winter cap of condensation over Titan’s north pole. Like rain and snow clouds on Earth, it formed through a cycle of condensation and evaporation, with vapor rising from the surface, encountering increasingly cooler temperatures and returning to the ground as precipitation – except that on Titan, the vapor involved is methane, not water.
Instead of developing in the troposphere, this newly identified cloud formed in the layer above it, the stratosphere. Earth also has polar stratospheric clouds, which typically form above the North Pole and South Pole between 49,000 and 82,000 feet. These clouds are rare and only form when temperatures drop to minus 108 degrees Fahrenheit (minus 78 degrees Celsius) or below.
“Other stratospheric clouds had been identified on Titan already, including a very thin, diffuse cloud of ethane, a chemical formed after methane breaks down,” NASA said. “Delicate clouds made from cyanoacetylene and hydrogen cyanide, which form from reactions of methane byproducts with nitrogen molecules, also have been found there.”

Image Above: Earth’s polar stratospheric clouds. Credit: Left: NASA/JPL/U. of Ariz./LPGNantes; Right: NASA/GSFC/M. Schoeberl
“But methane clouds were thought unlikely in Titan’s stratosphere,” the agency added. “Because the troposphere traps most of the moisture, stratospheric clouds require extreme cold. Even the stratosphere temperature of minus 333 degrees Fahrenheit (minus 203 degrees Celsius), observed by Cassini just south of the equator, was not frigid enough to allow the scant methane in this region of the atmosphere to condense into ice.”
Anderson and co-author Robert Samuelson, also from Goddard, reported in their new study that the temperatures in Titan’s lower stratosphere differ based on their latitudes. Using data from Cassini’s Composite Infrared Spectrometer and its radio science instrument, they were able to demonstrate that the high-altitude temperature near the north pole was actually far colder than the conditions just south of the moon’s equator.
“It turns out that this temperature difference – as much as 11 degrees Fahrenheit (6 degrees Celsius) – is more than enough to yield methane ice,” NASA explained. While early observations of the cloud system were said to be “consistent with small particles composed of ethane ice,” further analysis revealed that some regions tended to be denser and clumpier, suggesting that there could be multiple forms of ice present in the clouds.
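A quick note on the conversion: an absolute temperature converts with the 32-degree offset, but a temperature difference converts with the 5/9 factor alone, which is why an 11-degree-Fahrenheit gap corresponds to about 6 degrees Celsius:

    # Absolute temperatures vs. temperature differences: the
    # 32-degree offset cancels when converting a difference.
    def f_to_c(temp_f):
        return (temp_f - 32) * 5 / 9

    def f_diff_to_c(delta_f):
        return delta_f * 5 / 9

    print(f"{f_to_c(-333):.0f} C")      # -203 C, an absolute reading
    print(f"{f_diff_to_c(11):.1f} C")   # 6.1 C, a difference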
“The team confirmed that the larger particles are the right size for methane ice and that the expected amount of methane – one-and-a-half percent, which is enough to form ice particles – is present in the lower polar stratosphere,” the agency added. It noted that the mechanism for forming these stratospheric clouds appears to differ from the one at lower altitudes: global circulation patterns carry warm air from the summer hemisphere up from the surface, into the stratosphere and towards the winter pole, where it sinks, cools and forms methane clouds.
Like high-altitude clouds on Earth, Titan’s methane cloud was located near the winter pole, above 65 degrees north latitude, according to Anderson and Samuelson. The study authors estimate that this type of cloud system, which is known as subsidence-induced methane clouds (SIMCs), could develop between 98,000 to 164,000 feet above the moon’s surface.
“Cassini has been steadily gathering evidence of this global circulation pattern, and the identification of this new methane cloud is another strong indicator that the process works the way we think it does,” said Michael Flasar, Goddard scientist and principal investigator for Cassini’s Composite Infrared Spectrometer (CIRS).
“Titan continues to amaze with natural processes similar to those on the Earth, yet involving materials different from our familiar water,” added Scott Edgington, Cassini deputy project scientist at NASA’s Jet Propulsion Laboratory (JPL) facility in Pasadena, California. “As we approach southern winter solstice on Titan, we will further explore how these cloud formation processes might vary with season.”
—–

Chemicals Causing Comet 67P/C-G To Reek Of Urine, Rotten Eggs, Formaldehyde

Chuck Bednar for redOrbit.com – Your Universe Online
If recent chemical signatures detected from 67P/Churyumov–Gerasimenko by the Rosetta Orbiter Sensor for Ion and Neutral Analysis (ROSINA) are any indication, then it’s a good thing that the forthcoming attempt to land a probe on the surface of the comet is an unmanned mission, because the astronauts might not be able to handle the odor.
The orbiter’s instrument has been using its two mass spectrometers to detect the mixture of molecules contained in the comet’s coma, according to an October 23 blog post by ROSINA science team member Kathrin Altwegg from the University of Bern in Switzerland.
As of September 11, ROSINA had detected water (H2O), carbon monoxide (CO), carbon dioxide (CO2), ammonia (NH3), methane (CH4) and methanol (CH3OH), but a more recent scan had also detected formaldehyde (CH2O), hydrogen sulfide (H2S), hydrogen cyanide (HCN), sulfur dioxide (SO2) and carbon disulphide (CS2).
That combination of chemicals would not make for a very pleasant fragrance.
“The odor of rotten eggs (hydrogen sulfide), horse stable (ammonia), and the pungent, suffocating odor of formaldehyde… is mixed with the faint, bitter, almond-like aroma of hydrogen cyanide,” Altwegg said. “Add some whiff of alcohol (methanol) to this mixture, paired with the vinegar-like aroma of sulfur dioxide and a hint of the sweet aromatic scent of carbon disulphide, and you arrive at the ‘perfume’ of our comet.”
The ESA team said the detection of this many different types of molecules at this stage of the mission has been something of a surprise, according to the AFP news agency. ROSINA team scientists had believed that only carbon dioxide and carbon monoxide, the most volatile of the molecules found in the comet, would be released through sublimation as its icy surface slowly began to grow warmer.
[ Watch the Video: The Rosetta Mission Asks: What Can We Learn From Comets? ]
“While this is unlikely to be a particularly attractive perfume, remember that the density of these molecules is very low, and that the main part of the coma is made up of water and carbon dioxide, mixed with carbon monoxide,” the ESA said. “The key point, however, is that a detailed analysis of this mixture and how it varies as 67P/C-G grows more active will allow scientists to determine the comet’s composition.”
“Further work will show how 67P/C-G compares with other comets, for example by revealing differences between comets originating from the Kuiper Belt (like 67P/C-G) and comets that hail from the distant Oort cloud (like Comet Siding Spring, which recently flew past Mars),” the agency added. “The goal is to gain insights into the fundamental chemical make-up of the solar nebula from which our Solar System and, ultimately, life itself emerged.”
Earlier this month, the ESA announced that it had completed a comprehensive readiness review and had officially green-lighted the Rosetta mission’s November 12 landing attempt, during which the orbiter’s Philae lander will attempt to touch down at a site on the smaller of 67P/C-G’s two lobes, known as Site J.
The final readiness review took place on October 14, more than two months after Rosetta moved within 100 kilometers of the icy comet. At the time, the spacecraft was just 10 kilometers away from the center of 67P/C-G’s four-kilometer-wide body, giving ESA scientists a closer look at both the primary and backup landing sites and allowing them to complete a full hazard assessment less than one month before Philae’s historic landing attempt.
“Now that we know where we are definitely aiming for, we are an important step closer to carrying out this exciting – but high-risk – operation,” Fred Jansen, ESA’s Rosetta mission manager, said in a statement last week. “However, there are still a number of key milestones to complete before we can give the final Go for landing.”
—–

Booze And Your Brain – Moderate Alcohol Consumption By Seniors May Boost Episodic Memory

Brett Smith for redOrbit.com – Your Universe Online

Although alcohol consumption is typically associated with negative effects on the brain, a new study from a team of American researchers has found that people over 60 who are moderate drinkers have better episodic memory, meaning they are better at recalling specific events.

Published in the American Journal of Alzheimer’s Disease and Other Dementias, the new study also found moderate alcohol consumption is associated with a larger volume of the hippocampus, a brain region vital for episodic memory.

Study author Faika Zanjani, a behavioral and community health professor at the University of Maryland, noted that the study findings were significant considering the fact that episodic memory is typically lost with dementia.

“Over time, you don’t necessarily lose memory for [how to do] things, like driving or having coffee,” Zanjani told Laura Tedesco of Yahoo Health. “You usually lose memory of events — memories that you have to retrieve, instead of just use. It’s not just forgetting your keys. It’s forgetting key moments in your life.”

The study was based on data for over 660 patients in the Framingham Heart Study Offspring Cohort. Participants in the study provided information on their alcohol intake, underwent a battery of neuropsychological tests, had MRIs taken of their brains and were tested for the genetic Alzheimer’s disease risk factor APOE e4.

The scientists learned that light and moderate alcohol usage in the over-60 set is connected with greater episodic memory and a larger hippocampus. The quantity of alcohol consumed did not influence executive function or overall mental ability.

Results from animal analyses indicate that moderate alcohol intake may help with preserving hippocampal volume by supporting the generation of nerve cells in the hippocampus. Additionally, subjecting the brain to moderate amounts of alcohol may boost the discharge of brain chemicals involved with mental processing, the researchers said.

“There were no significant differences in cognitive functioning and regional brain volumes during late life according to reported midlife alcohol consumption status,” said study author Brian Downer, a postdoctoral researcher at the University of Texas Medical Branch at Galveston. “This may be due to the fact that adults who are able to continue consuming alcohol into old age are healthier, and therefore have higher cognition and larger regional brain volumes, than people who had to decrease their alcohol consumption due to unfavorable health outcomes.”

Even though the advantages of light-to-moderate alcohol intake for later-life learning and memory have been reported before, long periods of alcohol abuse – typically defined as five or more alcoholic beverages in a single session – are proven to be damaging to the brain, the study team pointed out.

“We constantly recommend that people not consume more than one drink a day,” Zanjani told Tedesco. “So when we actually find benefits for the moderate level, we’re pretty surprised.”

“As long as you don’t get intoxicated and stop doing the things you need to do, drinking alcohol seems to be okay,” she added.

—–