Scientists Solve The Mystery Of Death Valley’s "Sailing Stones"

Chuck Bednar for redOrbit.com – Your Universe Online
For decades, scientists have been puzzled by the “sailing stones” of Death Valley National Park – massive rocks, some weighing 500 pounds or more, that inexplicably leave long trails behind them as though they had been pushed across the dry, flat mud surface of the region known as Racetrack Playa.
According to National Geographic’s Jason Bittel, these “sailing stones” (also known as the “sliding rocks” of Death Valley) were first documented by miners more than a century ago and appear to change location all on their own, with the lengthy, sometimes zigzag pattern trailing after them the only evidence that the boulders had even moved.

Bittel said that scientists have been working to solve this mystery since 1948, and had proposed numerous theories suggesting that “dust devils, flooding, ice sheets, hurricane-force winds, and algal films” could have been to blame. Now, the authors of a new study published Wednesday in the journal PLOS ONE have discovered the real reason for this unusual phenomenon.
Richard Norris, a paleobiologist at the Scripps Institution of Oceanography in La Jolla, California, and his colleagues traveled to the site in 2011 in order to monitor the rocks remotely using a combination of a high-resolution weather station, time-lapse cameras and motion-activated GPS units. The researchers said that they were prohibited from using native rocks by the US National Park Service, so they brought in similar boulders from an external source.
Since the stones can remain in place for more than a decade without moving, Norris and his colleagues did not expect to actually observe the motion in person. In fact, in a statement, co-author Ralph Lorenz of the Johns Hopkins University’s Applied Physics Laboratory predicted that their work would be “the most boring experiment ever.”
In December 2013, however, the research team’s efforts paid off. A day of rain caused a thin sheet of ice to form on the desert surface, said NPR’s Christopher Joyce. As the sun came up the following day, the ice started melting in the center of the playa, popping and cracking until finally the sheets of ice began to move.

Joyce said that the sheets were thin and approximately 40 to 50 feet across. The ice slid on top of a film of meltwater, pushing the rocks along the muddy desert surface at a rate of several feet per minute. By the end of the day, some rocks had moved hundreds of feet, and because the ice and water had evaporated by the afternoon, the trails the rocks had taken were left behind in the now-dry surface, he added.
“Science sometimes has an element of luck. We expected to wait five or ten years without anything moving, but only two years into the project, we just happened to be there at the right time to see it happen in person,” Norris said. As he explained to Joyce, it was just a case of lucking into the right conditions – rain, followed by cold and sunshine, accompanied by a wind that was steady but not too blustery and mud that had the right amount of slipperiness.
Norris and his team later returned to the site along with Lorenz, who had been studying the site since 2007, and managed to capture video footage and photographic evidence of the event, said Hannah Marsh of The Telegraph. In all, the scientists reported observing five movement events involving hundreds of rocks in a span of approximately 10 weeks.
“It’s so much fun!” Norris told NPR. “Pretty much everybody was out there because it was a neat problem, and it was fun to do. And I think there’s no purer form of science than that.”

Twitter Opens Up Analytics Dashboard Access For Social Media Stat Buffs

Chuck Bednar for redOrbit.com – Your Universe Online
If you’ve ever wanted to know just how many people are reading that extremely cool, witty tweet you just sent out, Twitter has good news for you: the microblogging website has made its analytics tool available to all users.
The announcement was tweeted out Thursday by Twitter engineer Ian Chan, who wrote that he was “absolutely thrilled” to open up access to the website’s analytics dashboard. Previously, only advertisers and verified users could access the tool.
According to Trevor Mogg of Digital Trends, Twitter originally launched the analytics dashboard back in July. It allows users to check the number of times their tweets have been viewed by other users, the number of times someone engaged with a tweet (by clicking on, retweeting or replying to it, for example) and the engagement rate.
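As a rough illustration of how that last metric works (a minimal sketch based on the common definition of engagement rate as engagements divided by impressions, not code from Twitter):

```python
# Engagement rate as commonly defined: total engagements / total impressions.
def engagement_rate(engagements, impressions):
    """Return the engagement rate as a fraction (0.015 means 1.5%)."""
    return engagements / impressions if impressions else 0.0

# A tweet viewed 2,400 times that drew 36 clicks, retweets and replies:
print(f"{engagement_rate(36, 2400):.1%}")  # prints 1.5%
```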
Furthermore, the Washington Post’s Jiaxi Lu reported Thursday that the dashboard also includes demographics about your followers, such as what percentage of them are male or female, what their interests are, and where they are located. It will take a little bit of digging to find those more detailed statistics, however.
Lu explains that opening the analytics tool takes users to a default page showing how many people viewed the tweets in their feed, and that in-depth analytics require clicking on the engagement tab. After doing so, you can see the total number of times that people clicked on your tweets, the number of clicks on your username or avatar, clicks on included links and hashtags, and real-time tracking of retweets and favorites, she added.
Finally, PC Magazine writer Angela Moscaritolo said that users can also download their tweet metrics if they want to spend extra time analyzing the data. Currently, the dashboard is available to users who tweet in English, French, Japanese, and Spanish and have been active in the past two weeks. However, “Twitter is working to roll it out to all users soon.”
“If you use your Twitter account to promote your business, the tool is certain to prove useful as it offers the opportunity to discover which kind of messages work best on the social media site,” Mogg said. “Twitter will be hoping for a knock-on effect too – as businesses and other high profile users utilize the tool to help them make more effective use of the microblogging service, this should lead to more engagement, which in turn should keep advertisers interested.”
Not everyone is 100 percent thrilled about the new feature, however. Martin Bryant of The Next Web said that while he was “impressed” by the new feature, he was also “a little concerned” that it could eventually cause “the human touch of Twitter” to be “stripped away as users regularly check their stats, seeing what tweets are most popular and tweaking their ‘strategy’ to get more ‘engagement’ and reach a wider audience.”
“It’s like video games – who doesn’t want to get a higher score?” he added. “Sure, we’ve had retweet and Favorite counts for some time (and some people’s tweets are definitely influenced by those), but the new analytics take data about our personal Twitter accounts to a whole new level. Even if we don’t deliberately change the way we tweet, subconsciously it could be a different story.”

Asteroid Smashup Observed By Spitzer Could Result In Planet Formation

Chuck Bednar for redOrbit.com – Your Universe Online
An eruption of dust around a young solar-analog star discovered using the Spitzer Space Telescope could be the result of the type of massive collision between asteroids that ultimately leads to the formation of planets, astronomers report in Thursday’s online edition of the journal Science.
Scientists began tracking the star in question, identified as NGC 2547-ID8, after it first surged with a large amount of fresh dust between August 2012 and January 2013. While Spitzer has been used to observe the aftermath of suspected asteroid smashups before, NASA officials explained that this new research marks the first time scientists have successfully collected data both before and after such a collision in a planetary system.
In their study, lead author and University of Arizona graduate student Huan Meng and his colleagues said that they detected “a debris-producing impact in the terrestrial planet zone” as it happened around the 35-million-year-old star. They went on to explain that there was a “substantial brightening of the debris disk at a wavelength of 3 to 5 micrometers,” which was followed by decay over the course of a year with “quasi-periodic” disk flux modulations.
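To picture what such a light curve looks like, here is a minimal toy sketch of a brightening that fades exponentially with a quasi-periodic wobble on top; all numbers are invented for illustration and are not the study’s data or model:

```python
import numpy as np

t = np.linspace(0, 365, 500)                 # days since the impact
fading = 1.0 + 2.5 * np.exp(-t / 120.0)      # excess flux decaying over ~a year
wobble = 1.0 + 0.1 * np.sin(2 * np.pi * t / 25.0)  # quasi-periodic modulation
flux = fading * wobble                       # relative infrared brightness

print(f"peak {flux.max():.2f}, after one year {flux[-1]:.2f}")
```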
Meng and experts from the California Institute of Technology (Caltech), the University of Tokyo and several other institutions explained that this type of behavior was “consistent with the occurrence of a violent impact that produced vapor out of which a thick cloud of silicate spherules condensed that were then ground into dust by collisions,” and suggested that their observations offer a sneak peek into the process of forming rocky planets such as Earth.
Rocky planets, the US space agency explains, start out as dusty material orbiting young stars. This material clumps together, forming asteroids that collide with one another. In many cases, those asteroids are destroyed, but in some instances they actually grow over time and become proto-planets. After approximately 100 million years, they mature into full-grown terrestrial planets. Our own moon is believed to have formed this way, as the result of a giant impact between the proto-Earth and another object roughly the size of Mars.
As part of their research, the astronomers focused Spitzer’s heat-seeking infrared instruments on NGC 2547-ID8, a star that lies roughly 1,200 light-years away in the constellation Vela. While previous observations had recorded variations in the amount of dust around the star, suggesting the presence of ongoing asteroid collisions, Meng’s team was hoping to view an even larger, potentially planet-forming impact.
“Beginning in May 2012, the telescope began watching the star, sometimes daily,” NASA said. “A dramatic change in the star came during a time when Spitzer had to point away from NGC 2547-ID8 because our sun was in the way. When Spitzer started observing the star again five months later, the team was shocked by the data they received.”
“We not only witnessed what appears to be the wreckage of a huge smashup, but have been able to track how it is changing — the signal is fading as the cloud destroys itself by grinding its grains down so they escape from the star,” co-author Kate Su, also of the University of Arizona, said in a statement. “Spitzer is the best telescope for monitoring stars regularly and precisely for small changes in infrared light over months and even years.”
Currently, there is an extremely thick cloud of dusty debris orbiting NGC 2547-ID8 in the zone where rocky planets typically form, and the scientists’ observations indicate that the infrared signal from this cloud changes based on how much of it is visible from Earth. When the cloud faces us broadside, more of its surface area is exposed and the signal becomes stronger, but when its head or tail is visible, less infrared light is observed.
“By studying the infrared oscillations, the team is gathering first-of-its-kind data on the detailed process and outcome of collisions that create rocky planets like Earth,” NASA explained, with co-author George Rieke noting that he and his colleagues had “a unique chance” to study the process of rocky planet formation taking place in near-real time.
“The team is continuing to keep an eye on the star with Spitzer,” the US space agency added. “They will see how long the elevated dust levels persist, which will help them calculate how often such events happen around this and other stars. And they might see another smashup while Spitzer looks on.”

New Study Throws Into Question Long-Held Belief About Depression

Michael Bernstein, American Chemical Society

New evidence puts into doubt the long-standing belief that a deficiency in serotonin — a chemical messenger in the brain — plays a central role in depression. In the journal ACS Chemical Neuroscience, scientists report that mice lacking the ability to make serotonin in their brains (mice that, by conventional wisdom, should have been “depressed”) did not show depression-like symptoms.

Donald Kuhn and colleagues at the John D. Dingell VA Medical Center and Wayne State University School of Medicine note that depression poses a major public health problem. More than 350 million people suffer from it, according to the World Health Organization, and it is the leading cause of disability across the globe. In the late 1980s, the now well-known antidepressant Prozac was introduced. The drug works mainly by increasing the amounts of one substance in the brain — serotonin. So scientists came to believe that boosting levels of the signaling molecule was the key to treating depression. Based on this idea, many other drugs to treat the condition entered the picture. But researchers now know that 60 to 70 percent of patients taking these drugs continue to feel depressed. Kuhn’s team set out to study what role, if any, serotonin played in the condition.

To do this, they developed “knockout” mice that lacked the ability to produce serotonin in their brains, then ran a battery of behavioral tests. Interestingly, the mice were compulsive and extremely aggressive, but did not show depression-like symptoms. Another surprising finding was that, when put under stress, the knockout mice behaved much as most of the normal mice did. A subset of the knockout mice also responded therapeutically to antidepressant medications in a similar manner to the normal mice. These findings suggest that serotonin is not a major player in the condition and that different factors must be involved. The results could dramatically alter how the search for new antidepressants moves forward, the researchers conclude.

The authors acknowledge funding from the Department of Veterans Affairs and the Department of Psychiatry and Behavioral Neurosciences at Wayne State University.

Researchers Model Yellowstone Super-Eruption Ash Cloud

April Flowers for redOrbit.com – Your Universe Online

In 1883, a volcanic eruption in a small archipelago of the Dutch East Indies (now Indonesia) changed the world. The Krakatau eruption and the tsunamis it triggered resulted in over 36,000 deaths, including all 3,000 souls on the island. The ocean floor was altered, and temperature and weather patterns did not return to normal for five years. And although the ash in the atmosphere produced spectacular sunsets, the lowered temperatures and acid rains devastated crops around the world.

In 2010, a similar event occurred in Iceland with the eruption of Eyjafjallajokull. Though the eruption was far smaller in relative terms, its ash cloud grounded about 10 million travelers in Europe for six days.

Neither of these volcanoes is comparable in size to the volcano at Yellowstone. A new study from the United States Geological Survey (USGS) suggests that the ash cloud from a Yellowstone supereruption would blanket the Rocky Mountains several meters deep and deposit millimeters of ash as far away as New York, Los Angeles and Miami. The results have been published in a recent issue of Geochemistry, Geophysics, Geosystems.

The research team used an improved computer model to develop their predictions. They believe that the large hypothetical eruption would create an umbrella ash cloud — one that expands in all directions evenly — sending ash across North America.

During a supereruption (the largest kind of eruption known), more than 240 cubic miles of material can be ejected from a volcano. This sort of eruption is highly unlikely, but if it should occur, electronic communications and air travel throughout the continent would be shut down, and the climate would be altered.

The underground reservoir of hot and partly molten rock beneath Yellowstone National Park is enormous. We know of three past eruptions, at approximately 2.1 million, 1.3 million and 640,000 years ago. According to the University of New Mexico, one of those eruptions formed the 24-by-40-mile caldera in which Yellowstone Lake now sits. Current geological activity at the park shows no sign that any volcanic eruption will occur in the near, or even far, future. The most recent volcanic activity, a relatively non-explosive lava flow near the Pitchstone Plateau, occurred about 70,000 years ago.

The model, called Ash3D, projects that cities near the supereruption would be covered by a few feet of ash, the Midwest would receive a few inches, and cities on both coasts would see at least a fraction of an inch.

Scientists can use the findings from this study to understand past eruptions at Yellowstone and the widespread ash deposits left behind. Ash3D is also being used by other USGS researchers to forecast possible ash deposit hazards from restless volcanoes in Alaska.

Typical smaller eruptions deposit ash in a fan formation. A supereruption, however, produces a deposit that resembles a bull’s-eye: dense in the center and thinning fairly uniformly in all directions. The researchers say that this type of formation is less affected by the prevailing winds than the fan formation.
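A toy, radially symmetric deposit model captures the bull’s-eye geometry the researchers describe; this is purely illustrative, with invented parameters, and is not Ash3D:

```python
import math

T0_M = 3.0        # assumed thickness near the vent, in meters (invented)
SCALE_KM = 300.0  # assumed e-folding distance of the fallout (invented)

def ash_thickness_cm(distance_km):
    """Bull's-eye deposit: the same falloff in every direction from the vent."""
    return 100 * T0_M * math.exp(-distance_km / SCALE_KM)

for city, km in [("Billings", 250), ("Des Moines", 1400), ("New York", 3100)]:
    print(f"{city}: ~{ash_thickness_cm(km):.3f} cm")
```

With these made-up parameters the output falls off from meters near the park to a trace on the coasts, echoing the pattern (though not the exact numbers) reported above.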

“In essence, the eruption makes its own winds that can overcome the prevailing westerlies, which normally dominate weather patterns in the United States,” said Larry Mastin, a geologist at the USGS Cascades Volcano Observatory in Vancouver, Washington. “This helps explain the distribution from large Yellowstone eruptions of the past, where considerable amounts of ash reached the west coast,” he added.

The three large past eruptions deposited ash over many tens of thousands of square miles. The deposits have been found across central and western Canada and the US.

Accurately estimating the ash deposits from these past eruptions has been challenging because of erosion, and because previous computer models could not accurately capture how the ash was transported.

Depending on the length of the eruption, Ash3D showed that the leading edge of the ash cloud from a supereruption could expand faster than the ambient wind for hours or days. Such an expansion could drive ash both upwind (westward) and crosswind (north to south) more than 932 miles, producing the distinctive bull’s-eye pattern.

The simulation showed that modern cities near the park – like Billings, Montana, and Casper, Wyoming – would be covered by a few inches to more than three feet of ash. Cities in the upper Midwest – like Minneapolis, Minnesota, and Des Moines, Iowa – would receive at least a few inches. The East Coast and Gulf Coast would receive only fractions of an inch, California cities would receive between one and two inches, and Pacific Northwest cities might receive just over an inch.

Although this might not sound like much, given that some of these cities receive more than this in snow each year, the effects of even an inch or less of volcanic ash could be severe. Previous research shows that such a blanketing could reduce traction on roadways, short out electrical transformers and cause respiratory problems. Other studies have demonstrated that multiple inches of ash could damage infrastructure, block sewer and water lines, disrupt livestock and damage crops.

The research team discovered that other eruptions smaller than a Yellowstone supereruption, yet still powerful, could produce an umbrella ash cloud as well.

“These model developments have greatly enhanced our ability to anticipate possible effects from both large and small eruptions, wherever they occur,” said Jacob Lowenstern, USGS Scientist-in-Charge of the Yellowstone Volcano Observatory.

Image 2 (below): An example of the possible distribution of ash from a month-long Yellowstone supereruption. The distribution map was generated by a new model developed by the U.S. Geological Survey using wind information from January 2001. The improved computer model finds that the hypothetical, large eruption would create a distinctive kind of ash cloud known as an umbrella, which expands evenly in all directions, sending ash across North America. Credit: USGS


Toke Up For A Better Marriage? New Research Suggests Marijuana Use Could Reduce Incidents Of Domestic Violence

Chuck Bednar for redOrbit.com – Your Universe Online
In a finding that could forever alter our perception of bonding with our significant others, researchers from the University at Buffalo School of Public Health and Health Professions and Research Institute on Addictions (RIA) have found that married couples who smoke pot together report fewer incidents of domestic violence.
The study, which appears in the August online edition of Psychology of Addictive Behaviors, used the marijuana habits of both husbands and wives to predict the frequency of intimate partner violence (IPV) incidents perpetrated by husbands. The researchers found that couples who used pot at least two or three times per month reported the fewest IPV incidents.
According to Taryn Hillin of the Huffington Post, the Buffalo researchers, along with colleagues from Yale and Rutgers, recruited 634 couples applying for marriage licenses in New York between 1996 and 1999. After an initial interview, the authors followed the couples over a nine-year period using mail-in surveys to measure marijuana’s impact on domestic violence.
For the purposes of the study, incidents of domestic violence or IPV were defined as acts of physical aggression, such as slapping, hitting, beating and choking, Hillin said. The study authors measured incidents by asking couples to report violence committed by them or toward them in the last year, and at the end of the first year, they found that 37.1 percent of all husbands had committed acts of domestic violence against their wives.
Marijuana use was measured by asking participants how often they used marijuana over the last year, she added. The researchers also asked the participants about other drug use, including alcohol, which they noted is often used in conjunction with marijuana. They suspected that, since alcohol and other substance abuse are known to increase IPV rates, the same would be true of pot use. However, that was not the case.
Over the course of the first nine years of marriage, the researchers found that more frequent marijuana use by husbands and wives predicted less frequent incidents of intimate partner violence perpetration by husbands, and that the male’s marijuana use also predicted less frequent IPV perpetration by wives. Couples in which both spouses frequently used pot reported the least frequent domestic violence perpetration, and the link was most evident in women who did not have histories of prior antisocial behavior.
“These findings suggest that marijuana use is predictive of lower levels of aggression towards one’s partner in the following year,” lead investigator Dr. Kenneth Leonard, director of the UB Research Institute on Addictions, said in a statement. “As in other survey studies of marijuana and partner violence, our study examines patterns of marijuana use and the occurrence of violence within a year period. It does not examine whether using marijuana on a given day reduces the likelihood of violence at that time.”
“Although this study supports the perspective that marijuana does not increase, and may decrease, aggressive conflict, we would like to see research replicating these findings, and research examining day-to-day marijuana and alcohol use and the likelihood of IPV on the same day before drawing stronger conclusions,” he added. “While couples who reported marijuana use also reported less marital aggression, previous research with these couples found that couples who smoked marijuana were not less likely to divorce.”
The authors emphasize that while the findings are predictive, they do not necessarily indicate that there is a causal relationship between the two behaviors, said Washington Post reporter Christopher Ingraham. It could simply be that smoking pot makes couples happy, and therefore less likely to fight, or it could be that chronic cannabis use decreases the likelihood of aggressive behavior.
Dr. Leonard and his colleagues note that their paper “does not address the potential impact of parental marijuana use on children in the family and other problems associated with daily marijuana use.” Furthermore, it also does not explore other areas of marijuana use, including abuse, dependence and withdrawal – all of which could impact how spouses interact with one another, Ingraham noted.

Tomato-Rich Diets May Help Reduce The Risk Of Developing Prostate Cancer

Chuck Bednar for redOrbit.com – Your Universe Online

Men looking to reduce their risk of developing prostate cancer could benefit by consuming at least 10 servings of tomatoes per week, according to new research appearing in a recent edition of the journal Cancer Epidemiology, Biomarkers & Prevention.

Doing so can decrease the risk of developing the disease by 18 percent, Vanessa Er of the University of Bristol’s School of Social and Community Medicine and her colleagues claim in the UK National Institute for Health Research (NIHR)-funded study. The findings follow an in-depth analysis of the diets and lifestyles of 1,806 men between the ages of 50 and 69 with prostate cancer, as well as those of 12,005 cancer-free male patients.

According to BBC News online health editor Helen Briggs, not only did they find that men consuming a total of at least 10 portions of fresh tomatoes, tomato juice, baked beans and other similar products experienced the 18 percent reduction in prostate cancer risk, but they also discovered that eating five servings of fruits or vegetables per day decreased the risk by 24 percent versus men eating half that amount.

“Our findings suggest that tomatoes may be important in prostate cancer prevention,” Er, a PhD student at the university, said in a statement. “However, further studies need to be conducted to confirm our findings, especially through human [clinical] trials. Men should still eat a wide variety of fruits and vegetables, maintain a healthy weight and stay active.”

Er, who also worked with scientists from the University of Cambridge and Oxford University, explained that tomato products appeared to be the most beneficial in reducing prostate cancer risk because of lycopene, an antioxidant that helps combat toxins capable of causing DNA and cell damage. The researchers note that this is the first study of its kind to develop an index of dietary components that have been linked to prostate cancer.

“Only the recommendation on plant foods – high intake of fruits, vegetables and dietary fiber – was found to be associated with a reduced risk of prostate cancer,” the university explained. “As these recommendations are not targeted at prostate cancer prevention, researchers concluded that adhering to these recommendations is not sufficient and that additional dietary recommendations should be developed.”

In addition, Anna Hodgekiss of the Daily Mail reported that Er believes the best way to get lycopene – as well as other cancer-fighting dietary components such as selenium and calcium – is directly from food, not through supplements. In all, the study calls for men to make sure they ingest between 750mg and 1,200mg of calcium and between 105mcg and 200mcg of selenium daily.

Er and her colleagues are now calling for additional research to aid in the development of additional dietary recommendations to help prevent prostate cancer. She also told Hodgekiss that it was important to be cautious with the study because while they have found “a link,” their findings do not indicate “a proof of causation.”

Similarly, Tom Stansfeld of Cancer Research UK told Briggs that while “eating foods rich in lycopene – such as tomatoes – or selenium may be associated with a reduction in the risk of prostate cancer, this has not been proven, and this study can’t confirm whether there is a link between diet and prostate cancer risk. Diet and cancer prevention is a complex issue with few black and white answers; we encourage everyone to eat a balanced diet which is high in fruit and vegetables and low in red and processed meat, fat and salt.”


Unique Meteorite Helps Researchers Uncover The Climate History Of Mars

Chuck Bednar for redOrbit.com – Your Universe Online

A meteorite discovered in the Sahara Desert three years ago could hold the secrets to the climate history of Mars, and may ultimately help answer the question as to whether or not the now cold, dry Red Planet was once home to a warm environment capable of supporting life, an international team of researchers claim in a new study.

That meteorite, which is known as both Black Beauty and NWA 7533, is currently being analyzed by Florida State University professor Munir Humayun and his colleagues at the National High Magnetic Field Laboratory in Tallahassee. Chemical clues contained within that meteorite could be the key to unraveling the planet’s climatic history, and could hold key evidence supporting the existence of surface water on ancient Mars.

In research published online Sunday in the journal Nature Geoscience, the scientists report that they’ve discovered evidence for the climate shift in minerals known as zircons that are embedded deep inside the dark, glossy object. Zircons form when lava cools and are extremely long-lasting – a feature that allows scientists to use them as a sort of record of the passage of time.

Last year, researchers tested samples from Black Beauty and confirmed that it did indeed come from Mars, and Humayun and his colleagues determined that the zircons found in the meteorite were 4.4 billion years old. That is significant, the geochemistry professor explained, because it means the object formed during the Red Planet’s earliest stages, and might date from a time when the planet was capable of sustaining life.

“First we learned that, about 4.5 billion years ago, water was more abundant on Mars, and now we’ve learned that something dramatically changed that,” Humayun explained in a statement Wednesday. “Now we can conclude that the conditions that we see today on Mars, this dry Martian desert, must have persisted for at least the past 1.7 billion years. We know now that Mars has been dry for a very long time.”

The zircons (ZrSiO4) found in NWA 7533 contain oxygen, an element with three isotopes. Isotopes are atoms of the same element that have the same number of protons but varying numbers of neutrons, the researchers explain. On Mars, oxygen is distributed in the atmosphere (as carbon dioxide, molecular oxygen and ozone), in the hydrosphere (as water) and in rocks. Since the planet’s atmosphere is thin and dry, the sun’s UV light causes unorthodox shifts in the proportions in which each of those three oxygen isotopes occurs in different atmospheric gases.
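For readers unfamiliar with the bookkeeping, isotope compositions are conventionally reported as delta values relative to a standard, and the UV-driven, mass-independent shifts described above are usually summarized by the Δ17O anomaly. This is the standard notation of the field, not a formula quoted from the paper:

\[
\delta^{18}\mathrm{O} = \left(\frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{sample}}}{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{standard}}} - 1\right) \times 1000, \qquad \Delta^{17}\mathrm{O} \approx \delta^{17}\mathrm{O} - 0.52\,\delta^{18}\mathrm{O}
\]

Ordinary mass-dependent processes keep Δ17O near zero, so a nonzero Δ17O in a zircon is the fingerprint of oxygen that has cycled through the photochemically active atmosphere.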

“So when water vapor that has cycled through the Martian atmosphere condenses into the Martian soil, it can interact with and exchange oxygen isotopes with zircons in the soil, effectively writing a climate record into the rocks,” Florida State’s Kathleen Laufenberg explained. “A warm, wet Mars requires a dense atmosphere that filters out the ultraviolet light, making the unique isotope shifts disappear.”

In order to measure the proportions of the oxygen isotopes in the zircons, Humayun and colleagues – including Alexander Nemchin of the Swedish Museum of Natural History and researchers at institutions in the US, Australia and France – used a device called an ion microprobe. Using that instrument, the researchers applied a focused beam of particles to the sample, obtaining precise measurements that they said helped them craft an accurate isotopic record of the atmospheric changes that have taken place on Mars, complete with dates.


Common Community-Acquired Methicillin-Resistant Staphylococcus Aureus Originated In Africa

Jim Sliwa, American Society for Microbiology

The predominant strain of community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) infecting people in Europe, the Middle East and northern Africa derived from a single sub-Saharan ancestor, a team of international researchers reported this week in mBio®, the online open-access journal of the American Society for Microbiology.

CA-MRSA refers to MRSA infections occurring in healthy people with no recent hospitalizations. The infections, which are typically skin infections, can be transmitted through close person-to-person contact or contact with a contaminated item like a towel or clothing.

“With increasing levels of CA-MRSA reported from most parts of the Western world, there is a great interest in understanding the origin and factors associated with the emergence of these epidemic lineages,” said lead study author Marc Stegger, PhD, of the Department of Microbiology and Infection Control at the Statens Serum Institut in Denmark. “Our study determined that a single descendant of a methicillin-sensitive ancestor circulating in sub-Saharan Africa rose to become the dominant CA-MRSA clone in Europe, the Middle East and north Africa.”

In Europe, the predominant CA-MRSA strain belongs to a family called clonal complex 80 (CC80), which is resistant to the antibiotics kanamycin/amikacin, tetracycline and fusidic acid, in addition to beta-lactams. It was first identified sporadically in the late 1990s, but has since been identified throughout northern Africa, the Middle East and Europe, with only sporadic reports from Asia, Australia and South America.

For the study, Stegger and colleagues at 19 other institutions around the world analyzed 97 S. aureus CC80 samples from 22 countries in Europe, North Africa, sub-Saharan Africa, the Middle East and Asia isolated between 1993 and 2010. Twenty-three samples were sensitive to methicillin while 74 were resistant to methicillin. The investigators performed whole genome sequencing, a technique that determines the complete DNA sequence of an organism’s genetic material at a single time, and other tests to trace the origin, evolution and dissemination pattern of the European CA-MRSA clone CC80.

Within the samples, the team identified two distinct groups of S. aureus: a methicillin-sensitive clone from sub-Saharan Africa that was susceptible to all antibiotics, and the rest, from all other areas, that were MRSA and most often resistant to other antibiotics. Studying family trees among the bacteria, they found that the European CC80 clone evolved from the sub-Saharan African strain. They also noted that in the transition from a methicillin-sensitive line to a CA-MRSA clone, the bacteria simultaneously acquired two highly specific genetic elements that made them resistant to both methicillin and fusidic acid.
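Those family trees are built, in essence, by clustering isolates on how many mutations separate their genomes. Here is a toy sketch of that idea with invented SNP distances, not the study’s data or its actual phylogenetics pipeline:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

labels = ["sub-Saharan MSSA", "Europe MRSA 1", "Europe MRSA 2", "Mideast MRSA"]
# Condensed pairwise SNP-distance matrix (invented numbers for illustration):
# the African isolate is far from all three MRSA isolates, which are close kin.
snp_dist = np.array([120.0, 130.0, 125.0,   # sub-Saharan vs the three MRSA
                      15.0,  40.0,          # Europe 1 vs Europe 2, Mideast
                             35.0])         # Europe 2 vs Mideast
tree = linkage(snp_dist, method="average")
print(dendrogram(tree, labels=labels, no_plot=True)["ivl"])
```

With distances like these, the three MRSA isolates group together on one branch, with the methicillin-sensitive African clone as the outgroup, mirroring the ancestry described above.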

The methicillin-sensitive S. aureus resided in sub-Saharan West Africa, potentially as a result of local human migration patterns, Stegger said. The investigators hypothesize that CC80 moved to other countries starting in the mid-1980s due to several factors, including increased migration from sub-Saharan Africa in search of better economic opportunities and an increase in European tourism to this region of Africa, he said. The simultaneous acquisition of methicillin and fusidic acid resistance determinants, and their stability in the European CA-MRSA, could be a result of a higher selective pressure in North Africa and Europe.

The study was supported by Pfizer and the Fondation pour la Recherche Médicale. A copy of the article can be found online at http://bit.ly/asmtip0814d.


Benefits For Humanity: Found At Sea – NASA Benefits For Humanity Video Series

Laura Niles, International Space Station Program Science Office and Public Affairs Office, NASA’s Johnson Space Center
In the first installment of the International Space Station Benefits for Humanity video series, NASA showed how the station’s water purification technology is used to provide clean, safe water to an area plagued by a contaminated drinking source. We also met doctors who use a transformative tool in neurosurgery adapted from the station’s complex robotic arm, and we introduced the next generation of explorers inspired to learn more because of a virtual connection to the astronauts living and working in space.
[ Watch the Video: ISS Benefits For Humanity: Found At Sea ]
Now, join us on the high seas of the frigid Atlantic for a glimpse at how technology aboard the space station is helping to make travel on the world’s oceans safer.
The Vessel-ID System investigation on the space station demonstrated that an orbit-based radio receiver can track ships’ Automatic Identification System (AIS) signals; AIS is the marine equivalent of the air traffic control system. The Norwegian User Support and Operation Centre in Trondheim, Norway, receives the data for near-continuous evaluation. The Vessel-ID System is installed on the European Space Agency’s Columbus module.
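AIS broadcasts are compact digital packets, and the encoding they use is public. As a minimal illustration of what a receiver decodes (real systems, Vessel-ID included, handle checksums, multi-part messages and dozens of message types), here is a sketch that pulls the message type and ship identifier (MMSI) out of a standard AIVDM payload:

```python
def ais_payload_to_bits(payload):
    """Expand the 6-bit ASCII armoring used in AIVDM sentences."""
    bits = ""
    for ch in payload:
        value = ord(ch) - 48
        if value > 40:        # second block of the 6-bit alphabet
            value -= 8
        bits += format(value, "06b")
    return bits

def decode_type_and_mmsi(payload):
    """Return (message type, MMSI) from an AIS position-report payload."""
    bits = ais_payload_to_bits(payload)
    return int(bits[0:6], 2), int(bits[8:38], 2)

# A type 1 position report widely used as an example in AIS documentation:
print(decode_type_and_mmsi("177KQJ5000G?tO`K>RA1wUbN0TKH"))  # (1, 477553000)
```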
Since being turned on in 2010, Vessel-ID has relayed more than 400,000 ship position reports from more than 22,000 ships in a single day, greatly advancing the ship-tracking ability of coast guards around the world. This ability, coupled with multiple AIS tracking satellites launched since, is providing safer travel for thousands of ships around the globe. The ship identification and tracking technology has already helped direct rescue services to a lone survivor stranded in the North Sea, giving new hope in once-impossible situations.
“This brought a whole new dimension to the monitoring of ship traffic on the open oceans,” said Terje Wahl, of the Norwegian Space Centre. “This project demonstrates that the International Space Station is not just for science and astronauts, but it really benefits mankind with down-to-Earth applications.”

Study Calls Into Question Link Between Prenatal Antidepressant Exposure And Autism Risk

Noah Brown, Massachusetts General Hospital
Previously reported autism risk appears to be attributable to mother’s illness, not medication
Previous studies that have suggested an increased risk of autism among children of women who took antidepressants during pregnancy may actually reflect the known increased risk associated with severe maternal depression. In a study receiving advance online publication in Molecular Psychiatry, investigators from Massachusetts General Hospital (MGH) report that – while a diagnosis of autism spectrum disorder was more common in the children of mothers prescribed antidepressants during pregnancy than in those with no prenatal exposure – when the severity of the mother’s depression was accounted for, that increased risk was no longer statistically significant. An increased risk for attention-deficit hyperactivity disorder (ADHD), however, persisted even after controlling for factors relating to a mother’s mental health.
“We know that untreated depression can pose serious health risks to both a mother and child, so it’s important that women being treated with antidepressants who become pregnant, or who are thinking about becoming pregnant, know that these medications will not increase their child’s risk of autism,” says Roy Perlis, MD, MSc, MGH Department of Psychiatry, senior author of the report.
The authors note that, while genetic factors are known to play a substantial role in autism, exactly how that risk may be exacerbated by environmental factors is not well understood. While animal studies and investigations based on health records have suggested an increased risk associated with prenatal antidepressant exposure, others found no such association. And since discontinuing antidepressant treatment significantly increases the risk of relapse – including an increased risk of postpartum depression – the current study was designed to clarify whether or not any increased autism risk could actually be attributed to the medication.
To investigate this possibility, the research team analyzed electronic health record data for children born at MGH, Brigham and Women’s Hospital, or Newton Wellesley Hospital – hospitals belonging to Partners HealthCare System – for whom a diagnostic code for pervasive developmental disorder, a category that includes autism, was entered at least once between 1997 and 2010. They matched data for almost 1,400 such children with that of more than 4,000 controls with no autism diagnoses, born the same years and matched for a variety of demographic factors.
The children’s information was paired with that of their mothers, noting any factors related to the diagnosis and treatment of major depression or other mental illness, including prescriptions for antidepressants and other psychotropic drugs. A similar analysis was done for almost 2,250 children with an ADHD diagnosis, compared with more than 5,600 matched controls with no ADHD diagnoses.
While prenatal exposure to antidepressants appeared at first to increase the risk for both conditions, in the autism-focused comparison, adjusting for factors indicating more severe maternal depression reduced the strength of that association to an insignificant level. Taking antidepressants with stronger action in the serotonin pathway, which has been suspected of contributing to a possible autism risk, did not increase the incidence of the disorder. In addition, the children of mothers who took a serotonin-targeting non-antidepressant drug for severe morning sickness had no increased autism incidence. Prescriptions for antipsychotic drugs sometimes used to treat severe, treatment-resistant depression, as well as psychotic disorders, did appear to increase the risk for autism. For ADHD, however, the increased risk associated with prenatal antidepressant exposure remained significant, although reduced, even after adjustment for the severity of maternal depression.
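The statistical move at the heart of the paper, adjusting an exposure-outcome association for a confounder, can be sketched generically. The simulation below is illustrative only (invented data, not the MGH team’s model): the outcome is driven by illness severity, sicker mothers are more often treated, and the apparent effect of treatment shrinks once severity enters the regression:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
severity = rng.normal(size=n)                       # confounder: illness severity
exposure = (severity + rng.normal(size=n) > 0.5).astype(float)  # treatment
p = 1 / (1 + np.exp(3 - 0.8 * severity))            # risk depends on severity only
outcome = (rng.random(n) < p).astype(float)

crude = sm.Logit(outcome, sm.add_constant(exposure)).fit(disp=0)
X = sm.add_constant(np.column_stack([exposure, severity]))
adjusted = sm.Logit(outcome, X).fit(disp=0)
print(f"crude log-odds: {crude.params[1]:.2f}, "
      f"adjusted: {adjusted.params[1]:.2f}")        # adjusted term is near zero
```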
“There are a range of options – medication and non-medication – for treating depression and anxiety in pregnancy,” says Perlis, an associate professor of Psychiatry at Harvard Medical School. “But if antidepressants are needed, I hope parents can feel reassured about their safety.”
Caitlin Clements of the MGH Department of Psychiatry is lead author of the Molecular Psychiatry paper. Additional coauthors are Sarah Blumenthal, Hannah Rosenfield, Maurizio Fava, MD, Alysa Doyle, PhD, and Jordan Smoller, MD, ScD, MGH Psychiatry; Victor Castro, MD, and Shawn Murphy, MD, PhD, Partners Research Computing; Anjali Kaimal, MD, MAS, MGH Obstetrics and Gynecology; Elise Robinson, PhD, MGH Center for Human Genetic Research; Jane Erb, MD, and Isaac Kohane, MD, Brigham and Women’s Hospital; and Susanne Churchill, PhD, Partners Information Systems. Support for the study includes National Institute of Mental Health grant R01MH086026 and support from the Stanley Center for Psychiatric Research.

Everest Expedition Provides First Evidence Of Effects Of Altitude On Blood Pressure

Emma Mason, European Society of Cardiology

Cardiovascular patients should be cautious when exposed to high altitudes for leisure or work

An expedition to Mount Everest by Italian researchers has shown for the first time that blood pressure monitored over a 24-hour period rises progressively as people climb to higher altitudes. The researchers also found that while a drug used for lowering blood pressure, called telmisartan, was effective in counteracting the effects of altitude up to 3400 meters, it was not effective at 5400 meters above sea level – the height of the Everest base camp.

The study is published online Aug. 27 in the European Heart Journal, and its findings have implications not just for people who live, work or undertake recreational activities such as skiing and trekking at high altitudes, but also for people at lower altitudes who may be temporarily deprived of an adequate oxygen supply – a condition known as hypoxia. Hypoxia can lead to altitude sickness at high altitudes, but is also seen at sea level in people who suffer from sleep apnea when their breathing is temporarily interrupted by a blocked airway.

For the study, 13 of the 15 authors of the EHJ paper joined an expedition of 47 volunteers to the Mount Everest south base camp (altitude 5400 meters). They flew from Milan, Italy (altitude 120 meters) to Kathmandu, Nepal (1355 meters) where they stayed for three days. Then they went to Namche Bazaar (3400 meters) where they stayed for another three days before spending the next five days climbing to the Everest base camp where they stayed for 12 days.

During the expedition, the volunteers had their blood pressure taken in the conventional way over a five-minute period in the morning, but they also wore a device that measured their blood pressure every 15-20 minutes over a 24-hour period – giving readings for ambulatory blood pressure, which is a much more accurate measure of a person’s true blood pressure. It also has the advantage of being able to measure night-time blood pressure, which is normally 10-20% lower than daytime blood pressure, and which is a better predictor of outcome than other blood pressure measurements. When the blood pressure does not “dip” at night despite the person being asleep, this may indicate a problem in the regulation of the heart and blood vessels.
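The night-time “dip” mentioned here has a standard definition in ambulatory monitoring (a conventional formulation, not one quoted from the paper):

\[
\text{dip}\ (\%) = 100 \times \frac{\overline{\mathrm{BP}}_{\text{day}} - \overline{\mathrm{BP}}_{\text{night}}}{\overline{\mathrm{BP}}_{\text{day}}}
\]

A fall of roughly 10-20% is the normal “dipping” pattern; values near zero, like the blunted dips the team observed at altitude, can flag a problem with cardiovascular regulation.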

The participants were randomized to receive either 80 mg of a blood pressure lowering drug, telmisartan, or placebo. Telmisartan is known as an “angiotensin receptor blocker” (ARB) because it blocks the effects of a peptide called angiotensin II, which causes blood vessels to narrow. The researchers collected blood samples as well.

The researchers found that exposure to the very high altitude of 5400 meters was responsible for an increase of 14 mmHg in ambulatory systolic blood pressure and 10 mmHg in ambulatory diastolic blood pressure, averaged over a 24-hour period. They also found that telmisartan significantly reduced ambulatory blood pressure at sea level and at 3400 meters, while no effects could be seen soon after arriving at 5400 meters.

Professor Gianfranco Parati, professor of cardiovascular medicine at the University of Milano-Bicocca and Director of the Cardiology Research Laboratory at the Istituto Auxologico Italiano (Milan, Italy), who led the research, said:

“Our study provides the first systematic demonstration that exposure to progressively higher altitudes is associated with a progressive and marked increase in ambulatory blood pressure. The increase occurred immediately after the high altitude was reached, persisted during prolonged altitude exposure, was seen throughout the 24-hour period but was particularly pronounced at night when there was a reduction in the night-time ‘dip’, and disappeared after return to sea level. After reaching Everest base camp, the effect of high altitude was greater on systolic blood pressure in people aged 50 and over compared with younger people.

“The blood pressure increase seen with exposure to progressively more severe oxygen deprivation at higher altitude among volunteers in the placebo group and in both groups at 5400 meters may have implications for the management of patients with chronic diseases, including chronic heart failure in which breathing is interrupted periodically, acute worsening of chronic obstructive pulmonary disease, obstructive sleep apnea, and severe obesity. Together, these conditions affect more than 600 million people worldwide, making our results highly significant from a clinical perspective.

“This blood pressure increase is due to several factors, the most important being the effects of oxygen deprivation in increasing activity in the body’s sympathetic nervous system. This leads to the heart working harder and the peripheral blood vessels constricting.

“Our paper also provides the first demonstration of the efficacy at high altitude of one of the most common drugs used to treat high blood pressure. The ability of telmisartan in blocking the angiotensin II receptors is preserved during acute exposure to moderately high altitudes up to 3400 meters, but is impaired when people move to the very high altitudes of 5400 meters.

“At a practical level this implies that for people already being treated with angiotensin II receptor blockers such as telmisartan, the treatment will remain effective at altitudes reached by trekkers, climbers, skiers and workers, but will not work at very high altitudes where their blood pressure will probably become uncontrolled more easily.

“Our findings will also enable us to take appropriate action to warn cardiovascular patients of the need for caution whenever they are going to be exposed to high altitudes for leisure or work. In addition, this study emphasizes the importance of ambulatory blood pressure monitoring as compared to conventional blood pressure measurements in characterizing blood pressure levels in people’s real lives; this is particularly important when focusing on the effects of hypoxia, which can be much less evident at rest than during daily life activities.”


Researchers Discover Fever’s Origin

David Engblom, Linköping University
Fever is a response to inflammation, triggered by a surge of the signaling substance prostaglandin E2. Researchers at Linköping University can now see precisely where these substances are produced – a discovery that paves the way for smarter drugs.
When you take an aspirin, all production of prostaglandins in the body is suppressed. All symptoms of inflammation are eased simultaneously, including fever, pain and loss of appetite. But it might not always be desirable to get rid of all symptoms – there is a reason why they appear.
“Perhaps you want to inhibit loss of appetite but retain fever. In the case of serious infections, fever can be a good thing,” says David Engblom, senior lecturer in neurobiology at Linköping University.
Eleven years ago he had his first breakthrough as a researcher when he uncovered the mechanism behind the formation of prostaglandin E2 during fever. These signaling molecules cannot pass the blood-brain barrier, the purpose of which is to protect the brain from hazardous substances. Engblom showed that they could instead be synthesized by two enzymes in the blood vessels on the inside of the brain before moving to the hypothalamus, where the body’s thermostat is located.
Previous work from the research team described a very simple mechanism, but there was not yet proof that it mattered in a living animal. The new study, published in The Journal of Neuroscience with David Engblom and his doctoral student Daniel Wilhelms as lead authors, is based on tests with mice that lack the enzymes COX-2 and mPGES-1 in the brain’s blood vessels. When these mice were given bacterial toxins, the fever did not appear, while other signs of inflammation were unaffected.
“This shows that those prostaglandins which cause fever are formed in the blood-brain barrier – nowhere else. Now it will be interesting to investigate the other inflammation symptoms. Knowledge of this type can be useful when developing drugs that ease certain symptoms, but not all of them,” explains David Engblom.
For many years there has been debate as to where the fever signaling originates. Three alternative ideas have been proposed. Firstly, that it comes from prostaglandins circulating in the blood, secondly that it comes from immune cells in the brain, and thirdly Engblom’s theory, which stresses the importance of the brain’s blood vessels. The third proposal can now be considered confirmed.
Article: Deletion of prostaglandin E2 synthesizing enzymes in brain endothelial cells attenuates inflammatory fever, by Daniel Björk Wilhelms, Milen Kirilov, Elahe Mirrasekhian, Anna Eskilsson, Unn Örtegren Kugelberg, Christine Klar, Dirk A. Ridder, Harvey R. Herschman, Markus Schwaninger, Anders Blomqvist and David Engblom. The Journal of Neuroscience, 27 August 2014. DOI: 10.1523/JNEUROSCI.1838-14.2014 http://www.jneurosci.org/

Chemical Changes In Crohn’s Disease Patients Could Help Screen For The Disease

Jen Middleton, University of Edinburgh

Genetic changes that occur in patients with the bowel condition Crohn’s disease could hold clues to fighting the illness.

Scientists have identified chemical changes in the DNA of patients with Crohn’s disease that could help to screen people for the disease.

These changes can be detected in blood samples, opening the door to a simple test for Crohn’s disease.

The findings also offer clues to how the condition develops and reveal possible targets for new treatments.

Several genes have been linked to Crohn’s disease but not everybody who inherits these genes will develop the condition. The discovery sheds light on how environmental factors that vary between individuals – such as diet and gut bacteria – can trigger Crohn’s disease in some people who have inherited these genes.

A study involving children with Crohn’s disease in Edinburgh, Aberdeen, and Glasgow – led by the University of Edinburgh – identified chemical changes in their DNA that affect how their genes work.

The genes that are affected by these changes could represent useful targets for new treatments, the scientists say.

A DNA test alone would not be enough to diagnose the disease but it could pinpoint those at most risk and help to reduce the number of people who are put forward for further tests, researchers say.

It could also help to monitor progression of the disease and how patients respond to treatment.

Crohn’s disease is a type of inflammatory bowel disease and a common cause of chronic ill-health in the UK. It is a particular problem in children in Scotland, where the incidence of the disease has increased by 500 percent in the past 50 years.

At present there is no way to prevent Crohn’s disease and therapy is focused on treating the symptoms, which may include abdominal pain, diarrhea and severe weight loss.

Professor Jack Satsangi, from the Centre for Genomic and Experimental Medicine at the University of Edinburgh, said: “Our study gives the strongest evidence yet that epigenetic changes are involved in Crohn’s disease. The findings provide a potential mechanism whereby diet or other environmental factors may modify genetic material to cause Crohn’s disease. We hope the findings will help to identify much-needed treatment opportunities for this debilitating condition.”

Reference: A.T. Adams et al. Two-stage Genome-wide Methylation Profiling in Childhood-onset Crohn’s Disease Implicates Epigenetic Alterations at the VMP1/MIR21 and HLA Loci. Inflammatory Bowel Diseases, August 2014.


World’s Smallest Modem: Intel Unveils New Penny-Sized 3G Technology

Chuck Bednar for redOrbit.com – Your Universe Online
Intel has launched what it is calling the world’s smallest modem for the Internet of Things – a roughly 300 sq mm device that can connect to a cellular network and link to interconnected smart devices.
According to the company, the XMM 6255 is a 3G modem that was custom-designed to work with things like connected appliances, wearable technology, security devices and smart meters. The size makes it “perfect for devices with small, unconventional form factors,” they said, adding that the model is “an example” of the Santa Clara, California-based company’s “efforts to provide network connectivity for the billions of connected devices.”
As mentioned above, the XMM 6255 has an area of approximately 300 sq mm. To put that into perspective, BBC News said that is just slightly larger than a penny. While the modem might be small, it is also made to withstand harsh conditions and is built to protect against overheating.
In addition, VentureBeat’s Dean Takahashi reported that the XMM 6255 chip includes a SMARTi UE2p transceiver component that uses only a tiny amount of electrical power (provided by an embedded source) to function. The modem also combines its transmit and receive functions with an integrated power amplifier, he added.
The smaller size of the chip and its components, as well as the reduced electrical power required to operate it, will help the modem “survive in conditions where Internet of things sensors are deployed,” Takahashi explained. For example, he said that a farmer could deploy sensors to detect ground moisture in his or her fields. Those sensors would transmit data over the modem to a computer, which generates a report on which areas need to be watered.
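A sensor node along those lines is simple to sketch. The code below is hypothetical: there is no public Python API for the XMM 6255, so read_moisture_percent() and send_over_cellular() are stand-ins for whatever interfaces a real sensor board and modem would expose:

```python
import random
import time

def read_moisture_percent():
    """Stand-in for a real soil-moisture probe reading."""
    return random.uniform(5.0, 45.0)

def send_over_cellular(report):
    """Stand-in for an uplink through the board's 3G modem."""
    print("uplink:", report)

for hour in range(24):                 # one simulated day, sampled hourly
    level = read_moisture_percent()
    if level < 15.0:                   # wake the radio only when soil is dry
        send_over_cellular({"field": "A7", "hour": hour,
                            "moisture_pct": round(level, 1)})
    time.sleep(0.1)                    # a real node would sleep for ~an hour
```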

Sergis Mushell, a research director at analytics firm Gartner, told BBC News that the product announcement was evidence Intel was looking to become a player in the mobile connectivity market. He also said that the connectivity of the XMM 6255, not just its size, shows that the company is “going after a significant stake in the Internet of Things market. Getting connectivity right is essential for their entire product portfolio.”
While 3G modems have been available since 2001, Takahashi explained that the chips have been consistently growing smaller as the market for them grows larger. He said that over one billion units have shipped over the past decade or so, and that by 2020, some analysts predict there will be as many as 50 billion Internet-connected devices in the world. Those devices, the VentureBeat writer points out, “will need a lot of modems.”
In a Tuesday blog post, Stefan Wolff, Intel’s vice president of the Mobile and Communications Group and general manager of Multicommunications, said the XMM 6255 is currently available in the u-blox SARA-U2 module, and that the company expected to have “updates on additional partnerships in the coming months.”
He also said the tiny new modem “provides reliable communication when it comes to transmitting information in low signal zones like a parking garage or a home basement,” and that the “integration of the power amplifier and transceiver” features in the XMM 6255 “simplifies the design and minimizes device development costs, which means developers can launch more products, more quickly, and in a more cost-effective manner.”

Genetic Modification, Invasive Species Overlooked In Calculation Of Biomass Production Limits

Chuck Bednar for redOrbit.com – Your Universe Online
Recent increases in human population and economic growth have driven up the demand for land-plant biomass for food, fuel and other purposes, but according to scientists, the supply of that biomass – the sum of leaf, stem, root, fruit and other terrestrial plant-based materials – is constrained by a limit on what can naturally be produced.
However, in new research appearing in a recent edition of the journal Environmental Science and Technology, University of Illinois plant biology professor Evan DeLucia and his colleagues reveal that this so-called theoretical limit of terrestrial plant productivity is actually much higher than previously believed.
“When you try to estimate something over the whole planet, you have to make some simplifying assumptions,” DeLucia said in a statement. “And most previous research assumes that the maximum productivity you could get out of a landscape is what the natural ecosystem would have produced. But it turns out that in nature very few plants have evolved to maximize their growth rates.”
DeLucia and a team of experts from the University of Illinois, Colorado State University and the USDA Agricultural Research Service’s Photosynthesis Research Unit explained that, based on estimates derived from satellite images of vegetation and modeling, roughly “54 gigatons of carbon is converted into terrestrial plant biomass each year.”
Over the past several decades, that biomass value has remained stable, which has led many scientists to conclude that it represents an upper limit on the amount that can be produced worldwide. However, the authors of the new study suggest that these assumptions overlook key factors such as human attempts that could increase overall plant productivity – efforts such as genetic manipulation, plant breeding and land management.
According to DeLucia, some work in this field has already enjoyed great success. For instance, Miscanthus x giganteus, a hybrid grass produced in Illinois without the use of fertilizers or irrigation, produced between 10 and 16 tons of above-ground biomass per acre – more than doubling the output of native prairie vegetation or corn. In addition, genetically modified no-till corn produces five times more total biomass per acre than restored prairie.
Overall biomass production could also be increased by introducing invasive or non-native species to new areas. While DeLucia and his colleagues caution this could be harmful to some ecosystems, non-native species nonetheless demonstrate the potential for increasing the overall productivity levels of plants.
For example, a non-native species introduced in Iceland (the nootka lupine) produces four times more biomass than the species it displaces (the boreal dwarf birch), while Indian bamboo plantations produce roughly 40 percent more biomass than native dry, deciduous tropical forests. These examples show that the net primary production (NPP) of plants has not yet been maxed out, the study authors explain in their paper.
The research team used what is known as a simple light-use efficiency model in combination with the theoretical maximum efficiency with which solar radiation is converted to biomass by plant canopies to estimate the theoretical NPP limit on a worldwide scale. Their newly calculated limit was approximately “two orders of magnitude higher” than the biomass productivity levels of most current managed or natural ecosystems.
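For readers curious what a “simple light-use efficiency model” looks like in practice, the sketch below shows its general shape: productivity as incoming radiation, times the fraction the canopy absorbs, times a conversion efficiency. Every number here is an illustrative assumption for a single hypothetical site, not a value from the study:

    # A minimal light-use efficiency sketch: NPP = PAR x fraction absorbed x efficiency.
    # All values below are assumptions for illustration, not figures from the paper.
    PAR = 2000.0        # photosynthetically active radiation, MJ per m^2 per year (assumed)
    F_ABSORBED = 0.8    # fraction of PAR absorbed by the plant canopy (assumed)
    EFFICIENCY = 4.6    # theoretical max conversion efficiency, g carbon per MJ (assumed)

    npp = PAR * F_ABSORBED * EFFICIENCY  # grams of carbon per m^2 per year
    print(f"Theoretical NPP for this hypothetical site: {npp:.0f} g C/m^2/yr")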
“We’re not saying that this is even approachable, but the theory tells us that what is possible on the planet is much, much higher than what current estimates are,” DeLucia explained. “Taking into account global water limitations reduced this theoretical limit by more than 20 percent in all parts of the terrestrial landscape except the tropics, but even that water-limited NPP is many times higher than we see in our current agricultural systems.”
Image 2 (below): Scientists have historically underestimated the potential productivity of the earth’s land plants, researchers report in a new study. Credit: NASA Earth Observatory image by Jesse Allen

Education And A Dog-Friendly Environment Could Help Tackle Obesity In Both Owner And Pet

April Flowers for redOrbit.com – Your Universe Online

Pets are our companions, our friends and our caregivers. They are good for our mental well-being, but what about our physical well-being? A new study from the University of Liverpool indicates that the well-being of both the pet and the owner could be improved with education and pet-friendly facilities.

The study, published in the International Journal of Behavioral Nutrition and Physical Activity, suggests that communities invest in dog owner education and facilities to target inactivity and obesity in both pets and their owners.

The research team, led by population health scientist Dr. Carri Westgarth from Liverpool’s Department of Epidemiology and Population Health, conducted a literature review of studies published since 1990. They found that access to dog-friendly walking environments and better education about dogs’ physical needs motivate people to get out and take more exercise with their pets.

Studies estimate that at least 40 percent of dog owners do not walk their pets. Nearly one-quarter of all households in the UK own a dog, but less than 50 percent of adults meet the recommended level of 150 minutes of physical activity a week.

The research team from the University’s Institute of Infection and Global Health analyzed the results of 31 studies from the UK, US, Australia and Japan to understand how to motivate people to exercise using dog walking.

One common thread throughout the studies was the wide variation in dog owners’ understanding of how much exercise their animals need. This understanding had a direct effect on how much they walked their dogs. The research team believes this could be addressed with education programs.

The researchers also found that people without access to high-quality areas that support dog walking (dog parks that allow dogs off leash and provide waste disposal facilities, for example) are less likely to walk with their dog. Both the owner and the pet miss out on the associated health benefits.

Westgarth said, “It is easy to assume that people who own dogs are more likely to take exercise, but the reality can be very different. If all people who owned a dog walked with it every day, physical activity levels would be much improved, benefiting the health of both the owners and their canine companions.”

“There are a large number of reasons why people do or don’t walk their dog and it is worth considering how we can address this when designing strategies for reducing obesity, or when planning urban areas and public open space. Not being able to let their dog off the leash is a particular put-off.”

One finding that did not surprise the researchers is that the strength of the dog-owner bond is important. Dog owners with high attachments to their pets, and those who felt a great degree of support from their pets, were more likely to walk with them.

Dr. Westgarth said, “The study also found that some people are worried about their dogs’ behavior and may be less likely to take it out to the park – potentially out of embarrassment or worry about how it might act – but lack of walks may also be causing this bad behavior, due to boredom, frustration or lack of socialization.

“There aren’t many studies in this area at the moment, but with such a large proportion of people having a dog, it seems that better education, facilities and improved relationships with our pets could be a great way for a large portion of the population to feel encouraged to exercise.”


ALMA, Hubble Help Astronomers Obtain Best Ever View Of Early Merging Galaxies

Chuck Bednar for redOrbit.com – Your Universe Online
Using a battery of observatories that included the Atacama Large Millimeter/submillimeter Array (ALMA) and the Hubble Space Telescope, an international team of astronomers has obtained the best view to date of a collision between two galaxies that took place when the universe was just a fraction of its current age.
According to ESA, the team also utilized a gravitational lens to magnify galaxy H-ATLAS J142935.3-002836, revealing otherwise undetectable details and finding that this distant and complex object is similar in appearance to a local galaxy collision known as the Antennae Galaxies. Their work is detailed in the latest edition of the journal Astronomy & Astrophysics.
[ Watch the Video: Zooming In On A Gravitationally Lensed Galaxy Merger In The Distant Universe ]
“While astronomers are often limited by the power of their telescopes, in some cases our ability to see detail is hugely boosted by natural lenses created by the Universe,” lead author Hugo Messias of the Universidad de Concepción in Chile and the Centro de Astronomia e Astrofísica da Universidade de Lisboa in Portugal explained in a statement. “Einstein predicted in his theory of General Relativity that, given enough mass, light does not travel in a straight line but will be bent in a similar way to a normal lens.”
Massive structures such as galaxies and galaxy clusters help form these cosmic lenses, deflecting the light from objects obscured behind them due to their strong gravity. This effect, which is known as gravitational lensing, magnifies the properties of these hidden objects, allowing scientists to analyze them when such research would ordinarily have been impossible and allowing them to compare local galaxies with far more distant ones.
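Einstein’s prediction can be made quantitative: for a simple point-mass lens, light passing at impact parameter b is deflected through an angle alpha = 4GM / (c^2 b). The sketch below plugs assumed, illustrative values for a generic galaxy-scale lens into that textbook formula – these are not figures for H1429-0028:

    import math

    # Point-mass deflection angle from general relativity: alpha = 4GM / (c^2 b).
    # The lens mass and impact parameter are assumptions for illustration.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8            # speed of light, m/s
    M_SUN = 1.989e30       # solar mass, kg

    lens_mass = 1e12 * M_SUN   # assumed ~10^12 solar-mass lensing galaxy
    impact_b = 3.086e20        # assumed impact parameter, ~10 kiloparsecs in meters

    alpha = 4 * G * lens_mass / (C ** 2 * impact_b)   # radians
    arcsec = math.degrees(alpha) * 3600
    print(f"Deflection angle: ~{arcsec:.1f} arcseconds")   # ~4 arcseconds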
For this technique to work, however, the lensing galaxy in the foreground and the one being magnified need to be precisely aligned, which Messias said is a rare occurrence that is often hard to identify. However, he noted that recent studies have demonstrated they can be found more easily by conducting observations at far-infrared and millimeter wavelengths.
Galaxy H-ATLAS J142935.3-002836 (also known as H1429-0028), which was discovered in the Herschel Astrophysical Terahertz Large Area Survey (H-ATLAS), falls into this category. The study authors explain it is one of the brightest gravitationally lensed objects in the far-infrared regime discovered thus far, and is being observed at a time when the universe was approximately half of its current age.
[ Watch the Video: Artist’s Impression Of Gravitational Lensing Of A Distant Merger ]
After originally locating H1429-0028, the astronomers launched an extensive follow-up campaign that required the use of ALMA, Hubble, the Keck Observatory, the Karl G. Jansky Very Large Array (VLA) and other telescopes. By using so many different instruments, the research team was able to obtain several different views of the object, which they report could be combined to provide the best insight yet into the nature of these merging galaxies.
“ALMA enabled us to solve this conundrum because it gives us information about the velocity of the gas in the galaxies, which makes it possible to disentangle the various components, revealing the classic signature of a galaxy merger,” said Rob Ivison, ESO’s Director of Science and a co-author of the new study. “This beautiful study catches a galaxy merger red handed as it triggers an extreme starburst.”
“With the combined power of Hubble and these other telescopes we have been able to locate this very fortunate alignment, take advantage of the foreground galaxy’s lensing effects and characterize the properties of this distant merger and the extreme starburst within it,” he added. “It is very much a testament to the power of telescope teamwork.”
Image 2 (below): This diagram shows how the effect of gravitational lensing around a normal galaxy focuses the light coming from a very distant star-forming galaxy merger to create a distorted, but brighter view. Credit: ESO/M. Kornmesser
—–
Dragon Models 1/400 NASA Space Shuttle Discovery With Hubble Space Telescope

Ever Growing Number Of Women With Gestational Diabetes Suggests Future Will Be Filled With Children With Early Diabetes

Sonia Caprio (Yale University), Diabetologia

New research published in Diabetologia (the journal of the European Association for the Study of Diabetes) shows that children exposed to gestational diabetes in the wombs of their mothers are themselves around six times more likely to develop diabetes or prediabetes than children not exposed. The research is by Dr Sonia Caprio, Yale University School of Medicine, New Haven, CT, USA, and colleagues.

With the increase in gestational diabetes (GDM), there is a growing need to understand the effects of glucose exposure on the newborn in the womb, at birth and later in life. The risk of developing impaired glucose tolerance (IGT) (prediabetes) in individuals exposed to diabetes in the womb has not, say the authors, been adequately investigated. Thus in this new study, the authors examined the risk in obese youths of developing IGT after exposure to GDM in the womb. The authors say: “We hypothesized that prenatal exposure to GDM in obese children with normal glucose tolerance (NGT) would be associated with development of altered glucose metabolism over time, driven by an impairment of beta cell secretion relative to the insulin sensitivity.”

A total of 255 obese adolescents with normal glucose tolerance were selected for the study. All were investigated for in utero exposure to GDM and underwent an oral glucose tolerance test (OGTT), which was repeated after approximately 3 years. The authors found that 210 participants (82%) had not been exposed to GDM (the NGDM group), while 45 (18%) had been exposed (the EGDM group). In the NGDM group, only 9% (n=18) developed either IGT or type 2 diabetes, compared with 31% (n=14) of the EGDM group, a statistically significant difference. “Exposure to GDM was the most significant predictor of developing IGT or type 2 diabetes, with an increased risk of almost six times for those children exposed to GDM in the womb,” say the authors.
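The raw counts above translate directly into crude incidence rates. Note that the authors’ “almost six times” figure comes from their adjusted statistical model; the unadjusted ratio computed from the counts alone is lower, as this minimal sketch (using only the numbers in the preceding paragraph) shows:

    # Counts as reported in the study summary above
    ngdm_total, ngdm_cases = 210, 18   # not exposed to gestational diabetes
    egdm_total, egdm_cases = 45, 14    # exposed to gestational diabetes

    ngdm_rate = ngdm_cases / ngdm_total
    egdm_rate = egdm_cases / egdm_total

    print(f"NGDM incidence: {ngdm_rate:.1%}")   # 8.6%, the ~9% quoted above
    print(f"EGDM incidence: {egdm_rate:.1%}")   # 31.1%, the ~31% quoted above
    # Crude (unadjusted) ratio; the study's ~6x estimate is covariate-adjusted.
    print(f"Crude risk ratio: {egdm_rate / ngdm_rate:.1f}x")   # ~3.6x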

At baseline, the EGDM group showed a reduction in beta cell function (the cells that produce insulin), and, at follow-up, they also displayed a reduction in insulin sensitivity compared with the NGDM group.

“Our study demonstrates that obese normal glucose-tolerant children of GDM mothers have pre-existing defects in beta cell function,” say the authors. “This is in turn a strong risk factor for these children to develop prediabetes or diabetes.”

They add: “The ever growing number of women with gestational diabetes (18%) suggests that the future will be filled with children with early diabetes at a rate that far exceeds the current prevalence.”

They conclude: “Offspring of GDM mothers ought to be screened for IGT and/or impaired fasting glucose (another form of prediabetes), and preventive and therapeutic strategies should be considered before the development of full clinical manifestation of diabetes. While we cannot use this analysis for development of definitive screening guidelines, we strongly suggest that, among obese children and adolescents exposed to GDM, specifically if additional risk factors are present—such as severe obesity or being of an ethnic minority at higher risk—oral glucose tolerance tests should be performed at baseline (specifically in mid-pubertal adolescents) and potentially repeated based on clinical judgment. Furthermore, the need for studies aimed at unravelling the role of genetic or epigenetic factors and environmental postnatal factors that might be causing functional defects in the beta cell has never been more urgent.”

Aspirin May Reduce The Risk Of Blood Clots Reoccurring

American Heart Association
Aspirin may be a promising alternative for those who can’t take long-term anticoagulant drugs that prevent clots from reoccurring in the veins, according to new research in the American Heart Association journal Circulation.
In a combined analysis of two similar independent studies, 1,224 patients who received 100 mg of aspirin a day to treat blood clots were monitored for at least two years. In the International Collaboration of Aspirin Trials for Recurrent Venous Thromboembolism or INSPIRE analysis, researchers found that aspirin reduced the risk of recurring blood clots by up to 42 percent.
Venous thromboembolism (VTE) refers to blood clots in the veins. These clots can form in the deep veins of the legs (deep vein thrombosis) and can break apart and travel to the lungs, where they block arteries (pulmonary embolism).
According to researchers, without treatment, people who have blood clots in their veins with no obvious cause have on average a 10 percent risk of another clot within the first year and a 5 percent risk per year thereafter.
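Those annual figures compound over time. As a back-of-the-envelope illustration – treating the quoted rates as independent annual probabilities and applying the 42 percent reduction directly to them, both simplifications – the five-year picture looks like this:

    def cumulative_risk(first_year, later_years, n_years, reduction=0.0):
        """Probability of at least one recurrence within n_years, assuming
        independent annual risks; `reduction` scales both rates down."""
        p1 = first_year * (1 - reduction)
        p2 = later_years * (1 - reduction)
        return 1 - (1 - p1) * (1 - p2) ** (n_years - 1)

    print(f"Untreated, 5 years:      {cumulative_risk(0.10, 0.05, 5):.0%}")        # ~27%
    print(f"With 42% risk reduction: {cumulative_risk(0.10, 0.05, 5, 0.42):.0%}")  # ~16%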
“The treatment is warfarin or a newer anticoagulant usually given for at least six to 12 months to prevent a further blood clot,” said John Simes, M.D., lead author of the study and director of the National Health and Medical Research Council Clinical Trials Centre and professor at the University of Sydney in Australia. “However, these people continue to be at risk.”
Co-author Cecilia Becattini, M.D., added, “Aspirin does not require laboratory monitoring, and is associated with about a 10-fold lower incidence of bleeding compared with oral anticoagulants. We are convinced that it will be an alternative for extended prevention of venous thromboembolism after 6–12 months of anticoagulant treatment.”
Although the study yielded clear results, researchers advise patients to talk to their doctor about taking aspirin after stopping treatment with anticoagulants.
“It is not recommended that aspirin be given instead of anticoagulant therapy, but rather be given to patients who are stopping anticoagulant therapy or for whom such treatments are considered unsuitable,” Simes said.
“Although less effective, aspirin is inexpensive, easily obtainable, safe and familiar to patients and clinicians worldwide. If cost is the main consideration, aspirin is a particularly useful therapy. The costs of treating future thromboembolic events are greater than the cost of the preventive treatment.”
Other co-authors are Giancarlo Agnelli, M.D.; John W. Eikelboom, M.B., B.S.; Adrienne C. Kirby, M.Sc.; Rebecca Mister, M.Sc.; Paolo Prandoni, M.D.; and Timothy A. Brighton, M.B., B.S. Author disclosures and funding information are on the manuscript.

Cancer Leaves A Common Fingerprint On DNA

Shawna Williams, Johns Hopkins Medicine

Chemical alterations to genes appear key to tumor development

Regardless of their stage or type, cancers appear to share a telltale signature of widespread changes to the so-called epigenome, according to a team of researchers. In a study of a broad variety of cancers, published online in Genome Medicine on Aug. 26, the investigators say they have found widespread and distinctive changes to chemical marks known as methyl groups attached to DNA. Those marks help govern whether genes are turned “on” or “off,” and ultimately how the cell behaves. Such reversible chemical marks on DNA are known as epigenetic, and together they make up the epigenome.

“Regardless of the type of solid tumor, the pattern of methylation is much different on the genomes of cancerous cells than in healthy cells,” says Andrew Feinberg, M.D., M.P.H., a professor of medicine, molecular biology and genetics, oncology, and biostatistics at the Johns Hopkins University School of Medicine. Feinberg led the new study along with Rafael Irizarry, Ph.D., a professor of biostatistics at Harvard University and the Dana-Farber Cancer Institute. “These changes happen very early in tumor formation, and we think they enable tumor cells to adapt to changes in their environment and thrive by quickly turning their genes on or off,” Feinberg says.

Feinberg, along with Johns Hopkins University School of Medicine oncology professor Bert Vogelstein, M.D., first identified abnormal methylation in some cancers in 1983. Since then, Feinberg’s and other research groups have found other cancer-associated changes in epigenetic marks. But only recently, says Feinberg, did researchers gain the tools needed to find out just how widespread these changes are.

For their study, the research team took DNA samples from breast, colon, lung, thyroid and pancreas tumors, and from healthy tissue, and analyzed methylation patterns on the DNA. “All of the tumors had big blocks of DNA where the methylation was randomized in cancer, leading to loss of methylation over big chunks and gain of methylation in smaller regions,” says Winston Timp, Ph.D., an assistant professor of biomedical engineering at Johns Hopkins. “The changes arise early in cancer development, suggesting that they could conspire with genetic mutations to aid cancer development,” he says.

The overall effect, Feinberg says, appears to be that cancers can easily turn genes “on” or “off” as needed. For example, they often switch off genes that cause dangerous cells to self-destruct while switching on genes that are normally only used very early in development and that enable cancers to spread and invade healthy tissue. “They have a toolbox that their healthy neighbors lack, and that gives them a competitive advantage,” Feinberg says.

“These insights into the cancer epigenome could provide a foundation for development of early screening or preventive treatment for cancer,” Timp says, suggesting that the distinctive methylation “fingerprint” could potentially be used to tell early-stage cancers apart from other, harmless growths. Even better, he says, would be to find a way to prevent the transition to a cancerous fingerprint from happening at all.

Other authors on the paper are Hector Corrada Bravo of the University of Maryland, College Park, and Oliver G. McDonald, Michael Goggins, Chris Umbricht and Martha Zeiger, all of The Johns Hopkins University.

The study was funded by the National Human Genome Research Institute (grant number HG003223), the National Cancer Institute (grant number CA054358), the National Institute of General Medical Sciences (grant numbers GM083084 and GM103552) and the National Center for Research Resources (grant number RR021967).


Biomimetic Photodetector ‘Sees’ In Color

Jade Boyd, Rice University
Rice lab uses CMOS-compatible aluminum for on-chip color detection
Rice University researchers have created a CMOS-compatible, biomimetic color photodetector that directly responds to red, green and blue light in much the same way the human eye does.
The new device was created by researchers at Rice’s Laboratory for Nanophotonics (LANP) and is described online in a new study in the journal Advanced Materials. It uses an aluminum grating that can be added to silicon photodetectors with the silicon microchip industry’s mainstay technology, “complementary metal-oxide semiconductor,” or CMOS.
Conventional photodetectors convert light into electrical signals but have no inherent color-sensitivity. To capture color images, photodetector makers must add color filters that can separate a scene into red, green and blue color components. This color filtering is commonly done using off-chip dielectric or dye color filters, which degrade under exposure to sunlight and can also be difficult to align with imaging sensors.
“Today’s color filtering mechanisms often involve materials that are not CMOS-compatible, but this new approach has advantages beyond on-chip integration,” said LANP Director Naomi Halas, the lead scientist on the study. “It’s also more compact and simple and more closely mimics the way living organisms ‘see’ colors.”
Biomimicry was no accident. The color photodetector resulted from a $6 million research program funded by the Office of Naval Research that aimed to mimic cephalopod skin using “metamaterials,” compounds that blur the line between material and machine.
Cephalopods like octopus and squid are masters of camouflage, but they are also color-blind. Halas said the “squid skin” research team, which includes marine biologists Roger Hanlon of the Marine Biological Laboratory in Woods Hole, Mass., and Thomas Cronin of the University of Maryland, Baltimore County, suspects that cephalopods may detect color directly through their skin.
Based on that hypothesis, LANP graduate student Bob Zheng, the lead author of the new Advanced Materials study, set out to design a photonic system that could detect colored light.
“Bob has created a biomimetic detector that emulates what we are hypothesizing the squid skin ‘sees,’” Halas said. “This is a great example of the serendipity that can occur in the lab. In searching for an answer to a specific research question, Bob has created a device that is far more practical and generally applicable.”
Zheng’s color photodetector uses a combination of band engineering and plasmonic gratings, comb-like aluminum structures with rows of parallel slits. Using electron-beam evaporation, which is a common technique in CMOS processing, Zheng deposited a thin layer of aluminum onto a silicon photodetector topped with an ultrathin oxide coating.
Color selection is performed by utilizing interference effects between the plasmonic grating and the photodetector’s surface. By carefully tuning the oxide thickness and the width and spacing of the slits, Zheng was able to preferentially direct different colors into the silicon photodetector or reflect them back into free space.
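In simplified form, that kind of wavelength selection rests on interference between light reflected at the top and bottom of a thin film. The toy calculation below uses the textbook thin-film condition for constructive interference at normal incidence (2 x n x d = m x lambda) – a deliberate simplification of the actual plasmonic grating physics, with assumed values throughout:

    # Textbook thin-film interference as a simplified stand-in for the
    # grating/photodetector interference described above. Values are assumed.
    N_OXIDE = 1.46         # refractive index of a silicon dioxide film
    THICKNESS_NM = 180.0   # assumed oxide thickness, nanometers

    # Constructive interference at normal incidence: 2 * n * d = m * wavelength
    for m in (1, 2):
        wavelength = 2 * N_OXIDE * THICKNESS_NM / m
        print(f"Order {m}: reinforced wavelength ~{wavelength:.0f} nm")
    # Order 1 lands near 526 nm (green); a thinner or thicker film shifts the peak.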
The metallic nanostructures use surface plasmons — waves of electrons that flow like a fluid across metal surfaces. Light of a specific wavelength can excite a plasmon, and LANP researchers often create devices where plasmons interact, sometimes with dramatic effects.
“With plasmonic gratings, not only do you get color tunability, you can also enhance near fields,” Zheng said. “The near-field interaction increases the absorption cross section, which means that the grating sort of acts as its own lens. You get this funneling of light into a concentrated area.
“Not only are we using the photodetector as an amplifier, we’re also using the plasmonic color filter as a way to increase the amount of light that goes into the detector,” he said.
Co-authors include Rice graduate student Yumin Wang and Peter Nordlander, professor of physics and astronomy at Rice. The research was supported by the Office of Naval Research, the Department of Defense’s National Security Science and Engineering Faculty Fellowship Program and the Robert A. Welch Foundation.

Common Gut Bacteria May One Day Help Protect Against Food Allergies

April Flowers for redOrbit.com – Your Universe Online

There has been a great deal of research into food allergies lately — ranging from the effect of race on food allergies to how tap water might be linked to the development of allergies. The latest study, from the University of Chicago, shows that the presence of a common class of gut bacteria, Clostridia, may help protect against food allergies.

The findings, published in Proceedings of the National Academy of Sciences, show that by inducing immune responses that prevent food allergens from entering the bloodstream, Clostridia minimizes allergen exposure and prevents sensitization in mice. Sensitization is a necessary step in the development of food allergies. The researchers hope that their results will lead toward probiotic therapies for food allergies, which have so far been untreatable.

Food allergies are an immune system response to certain foods that can sometimes be deadly. Scientists do not know the cause of these allergies, but some studies have indicated that part of the cause might be modern hygienic and dietary practices – specifically by the way they disturb the body’s natural bacterial composition. Food allergy rates among children have risen significantly in recent years, approximately 50 percent between 1997 and 2011. Other studies have revealed a link between antibiotic and antimicrobial use and the rise of food allergies, as well.

“Environmental stimuli such as antibiotic overuse, high fat diets, caesarean birth, removal of common pathogens and even formula feeding have affected the microbiota with which we’ve co-evolved,” said Cathryn Nagler, PhD, Bunning Food Allergy Professor at the University of Chicago, in a recent statement. “Our results suggest this could contribute to the increasing susceptibility to food allergies.”

Nagler and her colleagues used food allergen responses in mice to test the effect of gut bacteria on food allergies. A group of germ-free mice born and raised in sterile conditions (and thus having no resident microorganisms) and a group of mice treated with antibiotics as newborns to reduce gut bacteria were both exposed to peanut allergens. Both groups showed strong immunological responses, producing significantly higher levels of antibodies against the allergen than mice with normal gut bacteria.

The team was then able to reverse the sensitization to the allergens by reintroducing a mix of Clostridia bacteria into the mice. Introducing another major gut bacteria, Bacteroides, was unsuccessful in reversing the sensitization. This suggests that Clostridia has a unique, protective role in the body’s fight against food allergens.

The team continued their investigation, trying to discover the mechanism behind Clostridia’s protective role. Studying the cellular and molecular responses that occur in the gut, they used genetic analysis to discover that Clostridia causes innate immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.

“We’ve identified a bacterial population that protects against food allergen sensitization,” Nagler said. “The first step in getting sensitized to a food allergen is for it to get into your blood and be presented to your immune system. The presence of these bacteria regulates that process.” Nagler cautions that people with food allergies should not get their hopes up yet. For now, the study findings are most likely applicable at a population level rather than an individual one.

Factors such as genetics have a great effect on an individual’s chances of developing a food allergy, as well as how those allergies manifest. The identification of a bacteria-induced barrier-protective response, however, represents a sea change in the understanding of how to prevent sensitization to food. Common in the human gut, Clostridia provides a clear target for potential therapies to prevent or treat food allergies. The researchers are continuing their work by developing and testing compositions to be used for probiotic therapy, for which they have filed a provisional patent.

“It’s exciting because we know what the bacteria are; we have a way to intervene,” Nagler said. “There are of course no guarantees, but this is absolutely testable as a therapeutic against a disease for which there’s nothing. As a mom, I can imagine how frightening it must be to worry every time your child takes a bite of food.”

“Food allergies affect 15 million Americans, including one in 13 children, who live with this potentially life-threatening disease that currently has no cure,” said Mary Jane Marchisotto, senior vice president of research at Food Allergy Research & Education. “We have been pleased to support the research that has been conducted by Dr. Nagler and her colleagues at the University of Chicago.”

Medical Marijuana Laws May Be Linked To Decrease In Prescription Overdose Deaths

April Flowers for redOrbit.com – Your Universe Online

Despite controversy and federal laws, a few states have made marijuana legal for medical use to manage chronic pain and other conditions. A new study, published in JAMA Internal Medicine, reveals that those states have a 25 percent lower rate of death by prescription drug overdose than states where marijuana remains illegal.

The research team included scientists from Johns Hopkins Bloomberg School of Public Health, the Philadelphia Veterans Affairs Medical Center, and the Perelman School of Medicine at the University of Pennsylvania. Medical marijuana laws are controversial, and opponents have raised concerns that such laws might promote the use of marijuana among children. The study findings, however, show that these laws have unintended benefits, as well. The researchers acknowledge that more research is needed, but suggest that the broader availability of medical marijuana might help reduce the alarming growth in overdose deaths attributed to prescription pills.

In particular, the study looked at opioid analgesics such as Vicodin, OxyContin and Percocet, which are all prescribed for moderate to severe pain. These drugs work by suppressing a person’s perception of pain.

“Prescription drug abuse and deaths due to overdose have emerged as national public health crises,” Colleen L. Barry, PhD, an associate professor in the Department of Health Policy and Management at the Bloomberg School, said in a statement. “As our awareness of the addiction and overdose risks associated with use of opioid painkillers such as Oxycontin and Vicodin grows, individuals with chronic pain and their medical providers may be opting to treat pain entirely or in part with medical marijuana, in states where this is legal.”

The researchers used Centers for Disease Control and Prevention (CDC) death certificate data, finding that the rate of prescription painkiller overdose deaths increased across all 50 states from 1999 to 2010. The rates of opioid analgesic overdose deaths, however, were approximately 25 percent lower in the 13 states with medical marijuana laws active during the study period.

As of June 2014, 23 states and Washington DC have enacted medical marijuana laws. These laws allow patients with chronic or severe pain from conditions such as cancer or multiple sclerosis to use marijuana to relieve their symptoms. Patients with nausea and depressed appetite have also found relief.

“In absolute terms, states with a medical marijuana law had about 1,700 fewer opioid painkiller overdose deaths in 2010 than would be expected based on trends before the laws were passed,” said Marcus Bachhuber, MD, of the Philadelphia Veterans Affairs Medical Center and the University of Pennsylvania. The mechanism underlying these results is not yet clear, but Bachhuber says that it might be due to people with chronic pain choosing alternative treatments. Medical marijuana laws could also be causing the decrease in overdose deaths by changing the way people abuse or misuse prescription pain medications.

Most deaths resulting from opioid analgesic overdoses – approximately 60 percent – occur in patients who have legitimate prescriptions. The rate of non-cancer patients who are prescribed opioids for pain has almost doubled in the last ten years. States that allow medical marijuana use also allow doctors to prescribe marijuana instead of such opioids. Trevor Hughes of USA Today reports that more than 15,000 US citizens die annually from prescription painkiller overdose.

“People already taking opioids for pain may supplement with medical marijuana and be able to lower their painkiller dose, thus lowering their risk of overdose,” Bachhuber said in a separate statement.

The benefits and risks of using medical marijuana to treat pain remain unclear, according to Brendan Saloner, PhD, an assistant professor in the Department of Health Policy and Management at the Bloomberg School. “Given the fast pace of policy change, more research is critical to understand how medical marijuana laws might be influencing both overdose deaths and the health trajectories of individuals suffering from chronic pain,” he said.

Other key findings of the study suggest that the relationship between medical marijuana laws and lower overdose rates has strengthened over time. Deaths decreased by nearly 20 percent in the first year after a state’s law was implemented, and the rate was 33.7 percent lower five years after implementation. The authors suggest that more states should enact such laws and that more studies are needed to understand the relationship between medical marijuana use and the decrease in opioid overdoses.
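To make the “fewer deaths than expected” framing concrete, the comparison is between observed deaths and the count projected from pre-law trends. In the sketch below, the projected baseline of 10,000 deaths is invented purely for illustration; only the percentage reductions come from the study:

    # Illustrative only: `projected` is a made-up pre-law trend projection;
    # the percentage reductions are the ones reported in the study.
    projected = 10_000        # hypothetical deaths expected with no law in place
    reduction_year1 = 0.20    # ~20% lower one year after implementation
    reduction_year5 = 0.337   # 33.7% lower five years after implementation

    print(f"Observed, year 1: {projected * (1 - reduction_year1):,.0f}")  # 8,000
    print(f"Observed, year 5: {projected * (1 - reduction_year5):,.0f}")  # 6,630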

In a related commentary, Marie J. Hayes, Ph.D., of the University of Maine, Orono, and Mark S. Brown, M.D., of the Eastern Maine Medical Center, Bangor, write, “If medical marijuana laws afford a protective effect, it is not clear why. If the decline in opioid analgesic-related overdose deaths is explained, as claimed by the authors, by increased access to medical marijuana as an adjuvant medication for patients taking prescription opioids, does this mean that marijuana provides improved pain control that decreases opioid dosing to safer levels?”

Kevin Sabet, director of the Drug Policy Institute at the University of Florida College of Medicine, expressed concern over the method of data collection and analysis in this study. He told Hughes that the study should have differentiated between states with strict and lax medical marijuana laws, as well as examined emergency-room admissions and prescription data. He would also like to have seen the researchers include data on the impact of methadone clinics, finding it hard to believe that there is such a sweeping reduction in predicted deaths.

“In today’s supercharged discussions, it could be easily misunderstood by people,” he said of the study, which he faulted for drawing distinct conclusions based on limited data. “There may be promise in marijuana-based medications but that’s a lot different than ‘here’s a joint for you to smoke.'”

Heart Association Calls For E-Cigarettes To Be Regulated As Tobacco Products

redOrbit Staff & Wire Reports – Your Universe Online
While the data suggests that e-cigarettes appear to be less harmful than traditional cigarettes and could in some cases help people kick the habit, the American Heart Association said on Sunday that since the products contain nicotine, they should be classified as tobacco products and subject to all applicable laws.
The policy statement, which has been published in the organization’s journal Circulation, explained that there has been little research into whether or not the electronic devices work as smoking cessation aids, with just two randomized controlled trials, one large cross-sectional study, and some anecdotal reports and online surveys analyzing the issue.
“The overall health effects of e-cigarettes should be considered both in the context of the intrinsic toxicity of e-cigarettes and with regard to their relative toxicity compared with the well-known injurious effects of smoking conventional cigarettes,” the panel of authors, who completed the work on the group’s Advocacy Coordinating Committee, Council on Care and Outcomes Research, wrote in their report.
“Even if there are some intrinsic adverse health effects of e-cigarettes, there would be a public health benefit if e-cigarettes proved to be much less hazardous than combustible cigarettes and if smokers could switch entirely from conventional cigarettes to e-cigarettes,” they added. “However, in general, the health effects of e-cigarettes have not been well studied, and the potential harm incurred by long-term use of these devices remains completely unknown.”
According to AP Chief Medical Writer Marilynn Marchione, the American Heart Association’s policy statement takes a similar stance as that unofficially assumed by the American Cancer Society back in May. Both groups are concerned about the products and in favor of additional regulation, especially when it comes to younger smokers, and both encourage smokers to try proven traditional cessation methods first.
However, if those established techniques fail, Heart Association president Dr. Elliott Antman said that it would be “reasonable to have a conversation” about e-cigarettes, which are battery-powered devices that vaporize nicotine, Marchione said. Similarly, the Cancer Society had previously said that e-cigarettes “may be a reasonable option” for those who had already unsuccessfully tried to quit using methods like nicotine patches, she added.
Even so, in a statement, Heart Association CEO Nancy Brown emphasized the group’s concerns over the use of the devices by adolescents and teenagers. In fact, the organization said that a recent survey of 6th through 12th grade students found that 1.78 million high school and middle school students in the US had tried e-cigarettes as of 2012, and that 76.3 percent of e-cigarette users said they also smoked conventional cigarettes.
“Over the last 50 years, 20 million Americans died because of tobacco. We are fiercely committed to preventing the tobacco industry from addicting another generation of smokers,” she said. “Recent studies raise concerns that e-cigarettes may be a gateway to traditional tobacco products for the nation’s youth, and could renormalize smoking in our society.”
“Nicotine is a dangerous and highly addictive chemical no matter what form it takes – conventional cigarettes or some other tobacco product,” added Dr. Antman. “Every life that has been lost to tobacco addiction could have been prevented. We must protect future generations from any potential smokescreens in the tobacco product landscape that will cause us to lose precious ground in the fight to make our nation 100 percent tobacco-free.”
E-cigarettes, which were created in China and first sold in 2003, are available in over 7,000 different flavors, including many that are attractive to youngsters (such as bubble gum, caramel, chocolate, fruit and mint), the Heart Association said. In April, the US Food and Drug Administration (FDA) proposed rules banning the sale of e-cigarettes to people under the age of 18 and subjecting the industry to federal regulation for the first time.
However, the organization said that the proposal “fell short” of what they had been hoping for. The Heart Association said that e-cigarettes “should be regulated under the same laws as other tobacco products and prohibited from being marketed or sold to young people,” and that the FDA’s proposal “did not go far enough in limiting online sales, advertising and flavored products, all tactics used to make e-cigarettes appealing to young people.”

Newly Discovered Atlantic Methane Vents Could Pose Global Warming Threat

redOrbit Staff & Wire Reports – Your Universe Online
Geologists from Mississippi State University, Brown University, the US Geological Survey (USGS) and Maryland-based Earth Resources Technology, Inc. have discovered more than 500 bubbling methane vents on the seafloor of the northern part of the US Atlantic margin, various media outlets reported on Sunday.
Previously, only three of these vents (which are also known as seeps) had been identified, but according to Terrence McCoy of the Washington Post, the researchers behind this new study have determined that there are actually 570 leaking methane gas seeps just off the country’s East Coast.
Their research, published online Sunday in the journal Nature Geoscience, suggests that these seeps might be emitting as much as 90 tons of greenhouse gases every year. If there are more of these vents – and the scientists predict that there might be up to 30,000 of them worldwide – they could represent a previously unknown source of greenhouse gas emissions.
“The bubble streams showed up on sonar scans of the sea floor taken between September 2011 and August 2013 during oceanographic expeditions ranging from Cape Hatteras in North Carolina to Georges Bank off Cape Cod,” explained Scientific American reporter Sid Perkins. The authors said they have analyzed data covering a 94,000 square kilometer swath of seafloor, and discovered the 570 seeps along a stretch of approximately 950 kilometers.
Co-author and Mississippi State geologist Adam Skarke told Perkins that the number was astonishing, given how few seeps scientists had detected in the region up to that point. While some of the plumes extended hundreds of meters above the ocean floor, Skarke (formerly a physical scientist with the NOAA) said that bubbles originating from deep-water sources typically become dissolved in the sea water well before they would be able to reach the surface.
He and his colleagues report they have not yet collected samples of the bubbles produced by the seeps, but USGS geophysicist Carolyn Ruppel told Perkins that the team presumes that they contain methane because many of them exist in areas that had once been methane-producing wetlands before becoming submerged. Studying those bubbles and the waters surrounding the plumes will help experts estimate the impact of the emissions.
The gas reacts with and diminishes dissolved oxygen, and this process creates carbon dioxide that will cause the waters in and around the vents to become more acidic, the researchers explained to Scientific American. Ronald Cohen, a geologist at the Carnegie Institution for Science in Washington DC who was not involved in the study, said that the study was “very careful” and “lays the groundwork for further research” in the field.
“It is the first time we have seen this level of seepage outside the Arctic that is not associated with features like oil or gas reservoirs or active tectonic margins,” Skarke told BBC News environmental correspondent Matt McGrath. “The methane is dissolving into the ocean at depths of hundreds of meters and being oxidized to CO2. But it is important to say we simply don’t have any evidence in this paper to suggest that any carbon coming from these seeps is entering the atmosphere.”
Skarke and his colleagues believe that many of these newly discovered seeps could be related to the breakdown of a frozen combination of methane and water in sediments below 500 meters of ocean water. This gas hydrate, also known as “methane ice,” can release its methane into the sediments with slight changes in ocean temperature, the researchers explained in a statement. This would allow the gas to escape at the seafloor to form plumes in the water column.
The study authors told BBC News that they estimate there might be 30,000 of these seeps worldwide, but admit that this is an unconfirmed calculation. These vents might not pose an immediate global warming threat, they explained, but if they are correct about the number of them, it could force climate scientists to revisit existing calculations on the potential sources of greenhouse gases, McGrath explained.
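Scaling the Atlantic figures up to that worldwide estimate is a simple proportion, as the rough sketch below shows – a naive linear extrapolation that ignores how widely individual seeps vary, so treat it strictly as an order-of-magnitude illustration:

    # Naive linear extrapolation from the figures reported above; individual
    # seeps vary widely, so this is an order-of-magnitude illustration only.
    atlantic_seeps = 570
    atlantic_tons_per_year = 90.0   # upper-end estimate for the Atlantic seeps
    worldwide_seeps = 30_000        # the authors' unconfirmed global estimate

    per_seep = atlantic_tons_per_year / atlantic_seeps   # ~0.16 tons/year each
    print(f"Worldwide: ~{per_seep * worldwide_seeps:,.0f} tons/yr")   # ~4,700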
Skarke said the research “does not provide sufficient evidence to draw objective conclusions about the relationship between these methane seeps and global climate change.” However, he added that the discovery “introduces a number of related questions that require further exploration and investigation to address.”

Facebook Announces News Feed Changes To Discourage Click-Baiting

redOrbit Staff & Wire Reports – Your Universe Online
Social media users: if you’re tired of clicking on sensationalistic-sounding news headlines only to be disappointed by the content (or lack thereof) in the actual article, you should be pleased with changes to the News Feed announced by Facebook officials on Monday.
In an online post, research scientist Khalid El-Arini and product specialist Joyce Tang explained that the website is looking to cut down on the practice of click-baiting, which they describe as “when a publisher posts a link with a headline that encourages people to click to see more, without telling them much information about what they will see.”
“Posts like these tend to get a lot of clicks, which means that these posts get shown to more people, and get shown higher up in News Feed,” they added. “However, when we asked people in an initial survey what type of content they preferred to see in their News Feeds, 80 percent of the time people preferred headlines that helped them decide if they wanted to read the full article before they had to click through.”
This kind of bait-and-switch style headline has become increasingly popular in recent years as online media companies have found that they can attract more users, thus increasing advertising revenue, explained CNN Money’s Brian Stelter. Critics complain that this practice tends to leave web surfers unsatisfied, he added, and Facebook apparently agrees with those individuals.
According to Chloe Albanesius of PC Magazine, Facebook plans to use a two-pronged approach to determine whether or not an article is guilty of click-baiting. To begin with, it will gauge how long people stay on the stories that they clicked on through the news feed, and then it will take into account how many clicks an article gets in comparison to the number of likes and/or comments it receives.
“If people click on an article and spend time reading it, it suggests they clicked through to something valuable. If they click through to a link and then come straight back to Facebook, it suggests that they didn’t find something that they wanted,” El-Arini and Tang explained. “With this update we will start taking into account whether people tend to spend time away from Facebook after clicking a link, or whether they tend to come straight back to News Feed.”
“Another factor we will use to try and show fewer of these types of stories is to look at the ratio of people clicking on the content compared to people discussing and sharing it with their friends,” they added. “If a lot of people click on the link, but relatively few people click Like, or comment on the story when they return to Facebook, this also suggests that people didn’t click through to something that was valuable to them.”
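Facebook has not published its ranking code, but the two signals described above – dwell time after a click-through, and clicks relative to likes, comments and shares – translate naturally into a simple scoring heuristic. The Python sketch below is a toy illustration of that idea; all thresholds and weights are invented, not Facebook’s:

    # Toy click-bait score built from the two signals described above.
    # All thresholds and weights are invented for illustration; Facebook's
    # real ranking system is unpublished and far more sophisticated.

    def clickbait_score(avg_dwell_seconds, clicks, likes, comments, shares):
        """Return a score in [0, 1]; higher suggests click-bait."""
        # Signal 1: short dwell time implies readers bounced straight back.
        dwell_signal = max(0.0, 1.0 - avg_dwell_seconds / 60.0)  # 60s = assumed real read
        # Signal 2: many clicks but little discussion or sharing.
        engagement = likes + comments + shares
        ratio_signal = clicks / (clicks + 10.0 * engagement)     # 10x weight assumed
        return 0.5 * dwell_signal + 0.5 * ratio_signal

    # A headline with 5,000 clicks, 8-second visits and almost no engagement...
    print(f"{clickbait_score(8, 5000, likes=30, comments=5, shares=2):.2f}")          # ~0.90
    # ...versus an article people read for three minutes and then discuss.
    print(f"{clickbait_score(180, 5000, likes=900, comments=250, shares=400):.2f}")   # ~0.12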
Mashable business reporter Jason Abbruzzese said that the “time spent on page” metric the website plans to start employing has become increasingly popular on the Internet. He also speculates that digital media companies will be the most affected by the new policies, as Facebook has become an increasingly influential publishing platform, responsible for driving large amounts of traffic to websites with a strong presence on the social network.
In addition, Facebook announced changes to the way shared links appear in a person’s News Feed. Moving forward, the social network said that it will prioritize links shared in what it calls the link-format (which appears when you paste a link while drafting a post, and includes a large picture, a headline and some text offering context on the link’s content) over links shared in text captions above photos or in status updates.
The inclusion of additional information about the article provided in the link-format “makes it easier for someone to decide if they want to click through,” said El-Arini and Tang. The Facebook representatives added that this display method “makes it easier for someone to click through on mobile devices, which have a smaller screen.”
“The best way to share a link after these updates will be to use the link format,” they said. “In our studies, these posts have received twice as many clicks compared to links embedded in photo captions. In general, we recommend that you use the story type that best fits the message that you want to tell – whether that’s a status, photo, link or video.”
—–
GET PRIMED! Join Amazon Prime – Watch Over 40,000 Movies & TV Shows Anytime – Start Free Trial Now

Amazon Announces Surprise Acquisition Of Video Game Streaming Service Twitch

redOrbit Staff & Wire Reports – Your Universe Online
Popular video game streaming website Twitch will soon have a new owner – not Google, as previous reports had suggested, but Amazon, which sources say swooped in after the previous deal fell through.
Back in July, various media outlets had reported that the Mountain View, California-based tech giant had finalized a buyout of Twitch – which allows video game players to upload and watch live, free footage streamed from Microsoft Xbox and Sony PlayStation 4 consoles – in order to add content to its own YouTube video service.
That deal was worth a reported $1 billion, but on Monday, Chris Welch of The Verge explained that Twitch had reversed course and reached an agreement with the Seattle, Washington-based e-commerce giant instead.
“We chose Amazon because they believe in our community, they share our values and long-term vision, and they want to help us get there faster,” Twitch CEO Emmett Shear said in a statement. “We’re keeping most everything the same: our office, our employees, our brand, and most importantly our independence. But with Amazon’s support we’ll have the resources to bring you an even better Twitch.”
Welch and the Wall Street Journal reported that it was an all-cash deal worth about $970 million, while Bloomberg writers Adam Satariano and Brad Stone called it Amazon’s “biggest acquisition ever” and said sources told them the dollar figure was closer to $1.1 billion.
“Broadcasting and watching gameplay is a global phenomenon and Twitch has built a platform that brings together tens of millions of people who watch billions of minutes of games each month… and, amazingly, Twitch is only three years old,” Amazon founder and CEO Jeff Bezos added in a separate statement. “We look forward to learning from them and helping them move even faster to build new services for the gaming community.”
Twitch was founded by Shear and Justin Kan in 2011, and according to Wall Street Journal reporters Douglas MacMillan and Greg Bensinger, network research firm DeepField Inc. said that it is the most popular online destination for watching and broadcasting video game play and the fourth-largest source of Internet traffic in the US. In February, it accounted for a reported 2 percent of all US Internet traffic, they added.
“Before today’s curveball, a pairing between YouTube and Twitch made a lot of sense,” Welch said. “Many gamers upload highlights, impressive level playthroughs, and other content to YouTube. But there’s no mistaking the fact that the game-streaming revolution is happening at Twitch. With over 55 million monthly visitors, the site has become so popular that it’s faced challenges trying to scale for its ever-growing audience.”
“YouTube’s vast infrastructure was seen as a logical answer to that problem when reports about an acquisition first surfaced in May,” he added. “At first glance, Amazon seems like a less ideal fit. Infrastructure isn’t at all a problem; Amazon’s AWS and cloud computing services can provide Twitch anything it needs and then some in terms of resources and reliability. But from a culture perspective, the link isn’t as obvious.”
However, Welch added that Amazon has long expressed an interest in the gaming industry, as demonstrated by the way it has encouraged developers to publish software for its Kindle Fire tablets and its Fire TV hardware. A Twitch app for the online retailer’s set-top box first launched in May, he added, but acquiring the company is Amazon’s boldest statement yet that it intends to become a major player in the gaming industry.

X-ray Laser Probes Tiny Quantum Tornadoes In Superfluid Droplets

Andrew Gordon, DOE/SLAC National Accelerator Laboratory

SLAC Experiment Reveals Mysterious Order in Liquid Helium

An experiment at the Department of Energy’s SLAC National Accelerator Laboratory revealed a well-organized 3-D grid of quantum “tornadoes” inside microscopic droplets of supercooled liquid helium – the first time this formation has been seen at such a tiny scale.

The findings by an international research team provide new insight on the strange nanoscale traits of a so-called “superfluid” state of liquid helium. When chilled to extremes, liquid helium behaves according to the rules of quantum mechanics that apply to matter at the smallest scales and defy the laws of classical physics. This superfluid state is one of just a few examples of quantum behavior on a large scale, which makes the behavior easier to see and study.

The results, detailed in the Aug. 22 issue of Science, could help shed light on similar quantum states, such as those in superconducting materials that conduct electricity with 100 percent efficiency or the strange collectives of particles, dubbed Bose-Einstein condensates, which act as a single unit.

“What we found in this experiment was really surprising. We did not expect the beauty and clarity of the results,” said Christoph Bostedt, a co-leader of the experiment and a senior scientist at SLAC’s Linac Coherent Light Source (LCLS), the DOE Office of Science User Facility where the experiment was conducted.

“We were able to see a manifestation of the quantum world on a macroscopic scale,” said Ken Ferguson, a PhD student from Stanford University working at LCLS.

While tiny tornadoes had been seen before in chilled helium, they hadn’t been seen in such tiny droplets, where they were packed 100,000 times more densely than in any previous experiment on superfluids, Ferguson said.

Studying the Quantum Traits of a Superfluid

Helium can be cooled to the point where it becomes a frictionless substance that remains liquid well below the freezing point of most fluids. The light, weakly attracting atoms have an endless wobble – a quantum state of perpetual motion that prevents them from freezing. The unique properties of superfluid helium, which have been the subject of several Nobel prizes, allow it to coat and climb the sides of a container, and to seep through molecule-wide holes that would have contained the same liquid at higher temperatures.

In the LCLS experiment, researchers jetted a thin stream of helium droplets, like a nanoscale string of pearls, into a vacuum. Each droplet acquired a spin as it flew out of the jet, rotating up to 2 million turns per second, and cooled to a temperature colder than outer space. The X-ray laser took snapshots of individual droplets, revealing dozens of tiny twisters, called “quantum vortices,” with swirling cores that are the width of an atom.

The fast rotation of the chilled helium nanodroplets caused a regularly spaced, dense 3-D pattern of vortices to form. This exotic formation, which resembles the ordered structure of a solid crystal and provides proof of the droplets’ quantum state, is far different from the lone whirlpool that would form in a regular liquid, such as a briskly stirred cup of coffee.
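
For a rough sense of the numbers involved, the vortex density in a rotating superfluid can be estimated from the rotation rate with the standard Feynman relation. The sketch below (Python, using the rotation rate quoted above) is a back-of-envelope illustration, not a calculation from the paper:

    import math

    # Back-of-envelope estimate (not from the paper): the Feynman relation
    # n = 2 * omega / kappa gives the areal density of quantum vortices in
    # a superfluid rotating with angular velocity omega.
    h = 6.626e-34            # Planck constant, J*s
    m_he4 = 6.646e-27        # mass of a helium-4 atom, kg
    kappa = h / m_he4        # quantum of circulation, ~1.0e-7 m^2/s

    f = 2.0e6                # rotation rate quoted above, turns per second
    omega = 2 * math.pi * f  # angular velocity, rad/s

    n = 2 * omega / kappa    # vortices per square meter
    print("%.1e vortices per m^2" % n)  # roughly 2.5e14

At that density, neighboring vortex cores would sit only tens of nanometers apart, broadly consistent with dozens of vortices fitting inside a single micron-scale droplet.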

More Surprises in Store

Researchers also discovered surprising shapes in some superfluid droplets. In a normal liquid, droplets can form peanut shapes when rotated swiftly, but the superfluid droplets took a very different form. About 1 percent of them formed unexpected wheel-like shapes and reached rotation speeds never before observed for their classical counterparts.

Oliver Gessner, a senior scientist at Lawrence Berkeley Laboratory and a co-leader in the experiment, said, “Now that we have shown that we can detect and characterize quantum rotation in helium nanodroplets, it will be important to understand its origin and, ultimately, to try to control it.”

Andrey Vilesov of the University of Southern California, the third experiment co-leader, added, “The experiment has exceeded our best expectations. Attaining proof of the vortices, their configurations in the droplets and the shapes of the rotating droplets was only possible with LCLS imaging.”

He said further analysis of the LCLS data should yield more detailed information on the shape and arrangement of the vortices: “There will definitely be more surprises to come.”

Other research collaborators were from the Stanford PULSE Institute; University of California, Berkeley; the Max Planck Society; Center for Free-Electron Laser Science at DESY; PNSensor GmbH; Chinese University of Hong Kong; and Kansas State University. This work was supported by the National Science Foundation, the U.S. Department of Energy Office of Science and the Max Planck Society.


Ice Cream Goes Southern, Okra Extracts May Increase Shelf-Life

Institute of Food Technologists
While okra has been widely used as a vegetable for soups and stews, a new study in the Journal of Food Science, published by the Institute of Food Technologists (IFT), shows how okra extracts can be used as a stabilizer in ice cream.
Ice cream quality is highly dependent on the size of ice crystals. As ice cream melts and refreezes during distribution and storage, the ice crystals grow in size, causing the ice cream to become coarser in texture, which limits shelf life. Stabilizers are used to maintain a smooth consistency, hinder melting, improve the handling properties, and make ice cream last longer.
This study found that water extracts of okra fiber can be prepared and used to maintain ice cream quality during storage. These naturally extracted stabilizers offer an alternative food ingredient for the ice cream industry as well as for other food products.
View the abstract in Journal of Food Science.

US Pediatricians Call For Later Start To School Day For Adolescents, Teens

redOrbit Staff & Wire Reports – Your Universe Online

A group of US pediatricians is encouraging middle schools and high schools to delay their start times by as much as an hour to avoid the physical and mental issues that can arise as a result of sleep deprivation.

According to Andrew M. Seaman of Reuters Health, a new policy statement issued Monday by the American Academy of Pediatrics (AAP) said that schools should start no earlier than 8:30am, as previous research suggested that students performed better and tended to be safer with the later start than when they had to be in school by 7:30am or 8:00am.

“We want to engage in at least starting a discussion in the community,” Dr. Judith Owens, a sleep medicine specialist at Children’s National Health System in Washington, DC, who led the AAP’s Adolescent Sleep Working Group, Council on School Health and Committee on Adolescence responsible for the new policy, told Reuters.

“Hopefully as a result of that the importance of sleep health as a priority will become more prominent,” she added. “I think that we definitely acknowledge that changing school start times is a challenge for many communities and that there are political, logistical and financial considerations associated with that, but at the end of the day this is something that communities can do to have a significant and definite impact on the health of their population.”

In the new policy statement, the AAP said that delaying the start of the school day would allow adolescent and teenage students’ biological sleep cycles, which change at the start of puberty, to match up better with their school schedules.

The group cites research which has found the average American adolescent is chronically sleep-deprived and pathologically tired, as well as a National Sleep Foundation poll which found that 59 percent of 6th through 8th graders and 87 percent of US high school students were getting less than the 8.5 to 9.5 hours of sleep recommended for school nights. The later start time, the AAP argues, would help rectify that problem.

“Chronic sleep loss in children and adolescents is one of the most common – and easily fixable – public health issues in the U.S. today,” Dr. Owens said. “The research is clear that adolescents who get enough sleep… have better grades, higher standardized test scores and an overall better quality of life. Studies have shown that delaying early school start times is one key factor that can help adolescents get the sleep they need to grow and learn.”

In addition to harming academic performance, the organization said that insufficient sleep could increase a teenager’s risk of becoming obese, suffering a stroke and developing type 2 diabetes, said USA Today’s Michelle Healy. A lack of sleep could also make them more likely to be involved in an automobile accident, suffer from anxiety and depression, and engage in risky behaviors while decreasing the amount of time they spend exercising.

The AAP said the reasons that teens aren’t getting enough sleep on school nights “are complex, and include homework, extracurricular activities, after-school jobs and use of technology that can keep them up late on week nights.” The organization added that it encourages parents to enforce a media curfew for their children, and said that taking naps, sleeping in on weekends and consuming caffeine were temporary measures that can help students stay awake but “do not restore optimal alertness and are not a substitute for regular, sufficient sleep.”

“The AAP is making a definitive and powerful statement about the importance of sleep to the health, safety, performance and well-being of our nation’s youth,” said Dr. Owens. “By advocating for later school start times for middle and high school students, the AAP is both promoting the compelling scientific evidence that supports school start time delay as an important public health measure, and providing support and encouragement to those school districts around the country contemplating that change.”

—–


Newly Discovered Hot Springs Bacteria Can Use Far-Red Light For Photosynthesis

redOrbit Staff & Wire Reports – Your Universe Online
A type of bacteria growing in a hot spring near Yellowstone National Park in Montana uses a previously unidentified process to harvest energy and produce oxygen from sunlight, according to new research published in a recent edition of the journal Science.
The bacteria grows in far-red light, researchers from the Pennsylvania State University, the University of California, Davis, and Montana State University report in their study, and their discovery could help scientists discover ways to improve plant growth, harvest energy from the Sun and better understand dense blooms growing on lakes.
“We have shown that some cyanobacteria, also called blue-green algae, can grow in far-red wavelengths of light, a range not seen well by most humans,” Penn State biotechnology, biochemistry and molecular biology professor Donald A. Bryant explained in a statement Thursday.
“Most cyanobacteria can’t ‘see’ this light either. But we have found a new subgroup that can absorb and use it, and we have discovered some of the surprising ways they manipulate their genes in order to grow using only these wavelengths,” he added.
The new subgroup is known as Leptolyngbya, and Bryant and his colleagues report that this cyanobacterial strain completely alters its photosynthetic apparatus in order to use far-red light that has wavelengths longer than 700 nanometers (slightly longer than the range of light visible to most humans).
Their experiments revealed that these cyanobacteria replace 17 proteins in three major light-using complexes, while manufacturing new chlorophyll pigments capable of capturing the far-red light and also using pigments known as bilins in unusual ways. They are able to accomplish all of this by quickly activating several genes to modify cellular metabolism and switching off a large number of other genes, the researchers said.
Bryant and his colleagues have dubbed this process Far-Red Light Photoacclimation (FaRLiP), and they explain that since the genes that are activated determine which proteins will be produced by the organism, the massive changes in the bacteria’s available gene profile have a dramatic impact.
“Our studies reveal that the particular cyanobacterium that we studied can massively change its physiology and metabolism, and its photosynthetic apparatus,” Bryant explained. “It changes the core components of the three major photosynthetic complexes, so one ends up with a very differentiated cell that is then capable of growing in far-red light. The impact is that they are better than other strains of cyanobacteria at producing oxygen in far-red light.”
In fact, the researchers report that cells grown in far-red light produce 40 percent more oxygen when studied in far-red light than those grown in red light assayed under the same types of conditions. This discovery was made through various biological, genetic, physical, and chemical experiments, all focused on better understanding how this unusual photosynthesis system works.
The authors explained that their study of this process included biochemical analyses, spectroscopic analyses, studies of the structures and functions of proteins, profiles of gene-transcription processes, and sequencing and comparisons of cyanobacteria genomes. Bryant said their genome-sequence analysis of the various strains found an additional 13 types of cyanobacteria that are also capable of using far-red light for photosynthesis.
The Leptolyngbya cyanobacterial strain used in the study was collected at the LaDuke hot spring in Montana, where it was living on the underside of a mat so thickly covered with microbes that only far-red wavelengths of light could penetrate to the bottom. The study suggests that it might be possible to introduce the ability to use far-red wavelengths into plants, though Bryant cautions that additional research will be necessary first.
“Our research already has shown that it would not be enough to insert a new far-red-light-absorbing pigment into a plant unless you also have the right protein scaffolds to bind it so that it will work efficiently,” he said. “In fact, it could be quite deleterious to just start sticking long-wavelength-absorbing chlorophylls into the photosynthetic apparatus.”
“We now have clearly established that photosynthesis can occur in far-red light, in a wavelength range where people previously did not think that oxygenic photosynthesis could take place, and we have provided details about many of the processes involved,” Bryant added. “Now there are a whole set of associated scientific questions that need to be answered about more of the details before we can begin to investigate any applications that may or may not be possible. Our research has opened up many new questions for basic scientific research.”
—–

Genetic Analysis Reveals How Honeybees Respond To Diseases, Climate Change

redOrbit Staff & Wire Reports – Your Universe Online

Honeybees are more genetically diverse than originally thought, and the species might have originated from Asia and not Africa as previously believed, according to new research published online Sunday in the journal Nature Genetics.

As part of their study, researchers from the Uppsala University Department of Medical Biochemistry and Microbiology and an international team of colleagues present the first global analysis of genome variation in the honeybee (Apis mellifera) – an insect that is partially responsible for pollinating one-third of our fruits, nuts and vegetables.

Despite their importance to global food supplies, however, there is great concern over the extensive loss of honeybee colonies that has taken place in recent years. Honeybees are threatened by disease, climate change and management practices, the authors said, and it is essential to gain a better understanding of their evolutionary history and learn how they managed to adapt to different environments in order to combat these threats.

“We have used state-of-the-art high-throughput genomics to address these questions, and have identified high levels of genetic diversity in honeybees,” researcher Matthew Webster of the university’s Science for Life Laboratory said in a statement.

“In contrast to other domestic species, management of honeybees seems to have increased levels of genetic variation by mixing bees from different parts of the world,” he added. “The findings may also indicate that high levels of inbreeding are not a major cause of global colony losses.”

In addition, Webster and his co-authors found that honeybees do not appear to be of African origin, as suggested in previous studies. Instead, the new study suggests that they appear to be derived from an ancient lineage of cavity-nesting bees from Asia that began spreading through Europe and Africa approximately 30,000 years ago.

Hidden in the insects’ patterns of genome variation, they found evidence of large cyclical fluctuations in population size that mirror historical patterns of glaciation – a discovery indicating that honeybee populations have been greatly affected by climate change throughout their history.

“The evolutionary tree we constructed from genome sequences does not support an origin in Africa; this gives us new insight into how honeybees spread and became adapted to habitats across the world,” Webster said. “Populations in Europe appear to have contracted during ice ages whereas African populations have expanded at those times, suggesting that environmental conditions there were more favorable.”

He and his fellow investigators also identified specific genetic mutations important to adaptation to factors such as climate and pathogens, including those involved in morphology, behavior and innate immunity. Webster said that their findings offer new insight into the evolution and genetic adaptation of the bees, while also creating a framework for studying the biological mechanisms behind disease resistance and adaptation to climate.

“We find evidence that population sizes have fluctuated greatly, mirroring historical fluctuations in climate, although contemporary populations have high genetic diversity, indicating the absence of domestication bottlenecks,” the authors wrote. “Levels of genetic variation are strongly shaped by natural selection and are highly correlated with patterns of gene expression and DNA methylation.”

“We identify genomic signatures of local adaptation, which are enriched in genes expressed in workers and in immune system- and sperm motility-related genes that might underlie geographic variation in reproduction, dispersal and disease resistance,” they added. “This study provides a framework for future investigations into responses to pathogens and climate change in honeybees.”

—–


PlayStation Network Back Online After DDoS Attacks Disrupt Weekend Service

redOrbit Staff & Wire Reports – Your Universe Online
Sony’s PlayStation Network was back online Monday following a weekend cyberattack that affected its servers, as well as those of fellow video game companies Blizzard Entertainment, Riot Games and Grinding Gear Games.
According to Brett Molina of USA Today, Sony Online Entertainment chief John Smedley originally confirmed the outage on Sunday, stating that both the PlayStation Network and Sony Entertainment Network online services had been flooded with artificially high traffic as part of a distributed denial of service (DDoS) attack and were temporarily unavailable to users.
Molina added that an outfit calling itself ‘Lizard Squad’ claimed responsibility for the attacks on Sony, as well as those on Blizzard, Riot and other game makers, via Twitter. In addition, the group sent a tweet from its account to American Airlines claiming that an explosive was on a flight from Dallas/Fort Worth to San Diego – a flight on which Smedley was a passenger. The Sony executive later confirmed via Twitter that his flight had been delayed.
“A Twitter user with the handle @LizardSquad claimed responsibility for the attack on Sunday, and said the attack was meant to pressure Sony to spend more of its profits on the network,” Reuters reporters Malathi Nayak and Sophie Knight wrote Monday. The hackers had also threatened to attack Microsoft, and Reuters said that some users reported experiencing issues on Sunday, but Xbox spokesman David Dennis said that “the core Xbox Live services” were “up and running.”
Late Sunday night, Sony Computer Entertainment Europe (SCEE) Blog Manager Fred Dutton said that both the PlayStation Network and Sony Entertainment Network were “back online” and that the company had seen “no evidence of any intrusion to the network and no evidence of any unauthorized access to users’ personal information.” He also said that regularly scheduled maintenance originally scheduled for Monday would be postponed in order to keep the networks online.
This is not the first time that the PlayStation Network has been targeted by DDoS attacks. In 2011, the network suffered a massive security breach when Anonymous claimed responsibility for an attack that compromised thousands of usernames and passwords, explained Forbes contributor Paul Tassi. Tassi suggested that this latest attack might have been an attempt to illustrate that the company had not done enough to upgrade its server security in the wake of that incident.
As for the bomb threat, Sony told BBC News that the US Federal Bureau of Investigation (FBI) is investigating the incident, which required Smedley’s plane to be diverted to Phoenix, Arizona. The FBI told Reuters that it had no comment on the incident, while American Airlines said via Twitter that it was “aware” of the threats made over the social media website and that it had alerted the authorities.
In addition to the PlayStation Network, the Lizard Squad DDoS attacks affected Riot Games’ League of Legends and Grinding Gear Games’ Path of Exile, as well as Blizzard and its Battle.net online gaming service, Tassi reported. As of 9:35pm Sunday night, they had been “harassing random streamers and interfering with [Battle.net’s] connectivity,” he added. Reuters said that officials from Blizzard were “not immediately reachable for comment, though its customer support Twitter account said the company’s servers were stabilizing.”
—–

Regardless Of Nicotine Levels, Smokers Consume Same Amount Of Cigarettes

University of Waterloo

Cigarettes with very low levels of nicotine may reduce addiction without increasing exposure to toxic chemicals, according to a new study from the University of Waterloo.

The study published in the journal Cancer Epidemiology monitored the smoking behaviors of 72 adults as they switched to three types of cigarettes with markedly reduced nicotine levels.

Unlike when smokers switch between conventional cigarette brands—all of which have very similar levels of nicotine content—the study found no change in participants’ puffing behavior, number of cigarettes consumed or levels of toxic chemicals in their systems.

The landmark findings may ease concerns that smokers would increase their consumption of cigarettes or puff harder if governments reduced nicotine levels to negligible amounts.

“One of the primary barriers to reducing nicotine levels is the belief that individuals who continue to smoke will smoke more cigarettes in an effort to extract the same nicotine levels, thereby exposing themselves to greater amounts of toxic chemicals. Our findings suggest this is not the case,” said Professor David Hammond, of the Faculty of Applied Health Sciences at Waterloo, and lead author on the paper. “The smokers were unable or unwilling to compensate when there was markedly less nicotine in the cigarette and when the experience of smoking is far less rewarding.”

The cigarettes used in the study—Quest 1, Quest 2 and Quest 3—had a nicotine content of 8.9, 8.4 and 0.6 mg, respectively, as opposed to an average of 12 mg in a regular cigarette.

“There is ample evidence from inside and outside the tobacco industry that major reductions in the nicotine content of cigarettes would result in a less-addictive product,” said Professor Hammond. “Overall, the impact of a less-addictive cigarette on reducing smoking uptake and cancer prevention is potentially massive.”

At the time of the study, Quest cigarettes were the only commercially available cigarettes in the world with significantly reduced nicotine levels.

Playing Hunger Games: Are Gamified Health Apps Putting Odds In Your Favor?

Brigham Young University

Study breaks down prevalence of apps using game-like rewards to motivate

For many people, finding motivation to exercise is a challenge. Thankfully, there are Zombies chasing you.

At least that’s the approach of Zombies, Run! — one of more than 31,000 health and fitness apps on the market today, and one of the growing number of apps that use games to increase physical activity.

Gamification is currently the popular trend among mobile fitness app makers looking to cash in by helping people get fit. Whether or not it’s the best way to encourage exercise remains to be seen.

“It’s just been assumed that gamified apps will work, but there has been no research to show that they’re effective for people long-term,” said Cameron Lister, lead author of a new BYU study on gamified health apps appearing in the Journal of Medical Internet Research. “Does earning a badge on your screen actually change your health behavior?”

Lister, along with BYU health science professor Josh West, analyzed more than 2,000 health and fitness apps and found that the majority of the most popular and widely used apps feature gamification.

As part of their study, the duo also downloaded and used 132 of the apps personally to see how well they worked. In addition to Zombies, Run! they tried out:

– Pact: An app that pits users against friends to see who sticks to their exercise routine. Those who keep their goals make money at the expense of those who don’t.

– Fitbit: Users can enlist friends to help them reach goals by sharing stats, joining fitness challenges or competing on leaderboards.

– DietBet: Like Pact, users put their money on the line to keep weight loss goals. Those who lose 4 percent of their starting weight in four weeks earn money from those who don’t.

The researchers are concerned that gamification is ignoring key elements of behavior change and could be demotivating in the long run. For example, over time people can view the rewards and badges on these apps as work instead of play. Once the rewards disappear, the motivation drops.

One suggestion is for the apps to also focus on skill development.

“There’s a missed opportunity to influence healthy behavior because most gamified health apps are only aimed at motivation,” West said. “Motivation is important, but people also need to develop skills that make behavior change easy to do.”

According to the study, the most common form of motivation in the apps centered on social or peer pressure (45% of apps), followed by digital rewards (24%), competitions (18%), and leaderboards (14%).

“It’s like people assuming that you hate health and you hate taking care of your body so they offer to give you some stuff in order for you to do what they want you to do,” Lister said. “But really, you should intrinsically want to be healthy and be engaged in healthy activity.”

While they found the health games are fun and engaging, West and Lister aren’t sure they can sustain major changes in healthy behavior. They believe more research needs to be carried out in an industry projected to hit the $2.8 billion mark by 2016.

But funding for this type of research is scarce because the technology is so new and developers either don’t have the money or are conflicted about subjecting their apps to scrutiny.

“I would caution developers and users to not have unrealistic expectations about the potential impact of gamified apps,” West said. “Everybody wants to know if they result in more sustainable behavior change but we just don’t know yet.”


—–


Citizen Scientists Safeguarding Communities Around The ‘Throat Of Fire’ Volcano

University of East Anglia
Citizen scientists are saving the lives of people living in the shadow of deadly volcanoes, according to new research from the University of East Anglia.
A new report published today reveals the success of a volunteer group set up to safeguard communities around the ‘Throat of Fire’ Tungurahua volcano in the Ecuadorian Andes.
More than 600 million people live close to active volcanoes worldwide. The research shows that living safely in these dangerous areas can depend on effective communication and collaboration between volcanologists, risk managers and vulnerable communities.
It is hoped that the research will help inform similar community engagement in volcanic and other disaster risk reduction projects around the world.
The report looks at a 35-strong network of volunteers called ‘vigías’, which was set up 14 years ago in the wake of renewed activity at a historically deadly volcano. The eruptions led to a military evacuation of around 25,000 people from Baños and the surrounding area. But leaving homes, land and livelihoods was hard, and the community rallied together to overrun checkpoints and re-occupy the town.
Lead researcher Jonathan Stone from UEA’s school of Environmental Sciences said: “This pattern of re-occupation is common in volcanic areas and after other natural disasters. The people of Baños wanted to go home even though it wasn’t safe.
“The volcano’s activity has varied from small explosions with ash emissions to violent eruptions with fast-moving pyroclastic flows. Living close by is a real risk, and so the vigía network was set up to help monitor the volcano and protect the community.
“It was initially a compromise between the community and civil protection agencies who were attempting to ensure their safety.”
The Spanish word ‘vigía’ means watchman, guard, sentinel or lookout – but the research shows that the role extends beyond what the name suggests, and beyond that of the typical citizen scientist.
“The vigías are members of the community who help scientists collect data about volcanic activity, are part of a vital early warning system for eruptions, and facilitate evacuations of the community during a crisis.
“The network enables citizens to continue to live and work in a hazardous area by enhancing their capacity to respond quickly to escalating threats. The ideal risk reduction scenario would be to move people out of the way of the volcano permanently, but clearly this is not always practical – people often want to live and work in particular locations for a number of reasons, and anyway – there are few places that you can move in the Ecuadorian Andes that aren’t threatened by one or several volcanoes!
“Community based monitoring has the potential to reduce risk by providing useful data, fostering collaboration between scientists and communities, and providing a way in which citizens are empowered to take actions to preserve lives and livelihoods.”
The report reveals how one particular eruption in August 2006 was pivotal, with many lives saved in the Juive Grande area thanks to the vigía network. And when further eruptions took place in 2013, and as recently as February and April this year, scientists and responding agencies attributed the zero loss of life and injury in part to the quick actions of the volunteers.
The research team interviewed vigías, other people in the community and scientists to discover why the network was such a success.
Jonathan Stone said: “The area is potentially becoming more dangerous with villages and grazing lands around the volcano’s base particularly at risk. One of the reasons why the vigías network really works is because they have a vested interest to be ready for the next eruptive event. They want to work with the authorities to help their communities.
“Scientists are considered friends and colleagues, which also has a big impact on the success of the network. The vigías act as a bridge between the community and the scientists. The communities are able to more rapidly trust and act upon advice from the scientists and authorities, because of the vigías.”
Prof Jenni Barclay from UEA’s school of Environmental Sciences leads the Strengthening Resilience in Volcanic Areas (STREVA) project and contributed to the research. She said: “This kind of research is very important because by examining cases like this, we can learn lessons about the potential of community-based disaster risk reduction in other contexts. It provides valuable evidence for how to reduce volcanic risk in practice, which is a critical step in finding ways of increasing society’s resilience to events of this nature.”

Using Cognitive Therapy With Medications Could Help Some Depression Patients

redOrbit Staff & Wire Reports – Your Universe Online

The chances that a person suffering from severe, nonchronic depression will make a full recovery increase by up to 30 percent if the individual is treated using a combination of cognitive therapy and antidepressant medication, a team of experts led by Vanderbilt University professor Steven Hollon claims in a new study.

Hollon and his colleagues conducted a randomized clinical trial involving 452 adult outpatients dealing with chronic or recurrent major depressive disorder (MDD). The patients were treated at university medical centers in Philadelphia, Chicago and Nashville, and either received antidepressants alone or in combination with cognitive therapy.

Those treatments lasted for up to 42 months until the patient had recovered. For the purposes of the study, the researchers declared that a patient was in remission if they experienced four consecutive weeks of minimal symptoms, and that they had recovered if they lasted another 26 consecutive weeks without suffering a relapse.
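
The study’s remission and recovery criteria are essentially a rule over weekly symptom records, which a minimal sketch can make concrete. The status() helper and its week-by-week boolean input below are assumptions for illustration only, not the authors’ actual analysis code:

    # Illustrative only: classify a patient from weekly records, where each
    # entry is True if that week showed minimal symptoms. Remission = 4
    # consecutive minimal-symptom weeks; recovery = remission followed by
    # 26 further consecutive weeks without relapse (30 weeks in a row).
    def status(weeks_minimal):
        run = 0
        for week_ok in weeks_minimal:
            run = run + 1 if week_ok else 0
            if run >= 4 + 26:
                return "recovered"
        return "in remission" if run >= 4 else "neither"

    print(status([True] * 30))         # recovered
    print(status([True] * 5))          # in remission
    print(status([True, False] * 10))  # neither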

The results of the study, which were published online in the journal JAMA Psychiatry, showed that combined treatment involving both cognitive therapy and antidepressant medication improved recovery rates when compared to drugs alone (72.6 percent vs. 62.5 percent).

However, the main effects of treatment on recovery were impacted by the severity and the chronic nature of the condition, the researchers explained. The advantage for combined treatment was limited to the 146 patients suffering from severe, nonchronic depression (81.3 percent vs. 51.7 percent). Recovery rates were also found to be similar in the two groups for patients suffering from less severe MDD or chronic MDD, they added.

“Our results indicate that combining cognitive therapy with antidepressant medicine can make a much bigger difference than we had thought to about one-third of patients suffering from major depressive disorder,” Hollon explained. “On the other hand, it does not appear to provide any additional benefit for the other two-thirds.”

“Now, we have to reconsider our general rule of thumb that combining the two treatments keeps the benefits of both,” the lead investigator and Vanderbilt psychology professor added in a statement Wednesday. “This provided us with enough data so that we could drill down and see how the combined treatment was working for patients with different types and severity of depression: chronic, recurrent, severe and moderate.”

According to Hollon, the results could have a significant impact on the way that MDD is treated – especially in the UK, which he said is a decade ahead of the US in terms of treatment for the condition. The use of combined cognitive therapy and antidepressive medicine is already standard for severe cases in the UK, and health officials there are training therapists in cognitive therapy and other empirically-supported forms of psychotherapy.

Other scientists involved in the study included Robert DeRubeis and Jay Amsterdam from the University of Pennsylvania; Jan Fawcett from the University of New Mexico; Richard Shelton from the University of Alabama-Birmingham; John Zajecka and Paula Young from Rush University; and Robert Gallop from West Chester University. The research was funded in part by grants from the National Institute of Mental Health.

—–


Discovery Of Genetic Secret "Recipe" For How Lizards Regrow Their Tails Could Help Human Recovery

John Hopton for redOrbit.com – Your Universe Online
Researchers at Arizona State University have taken a giant step towards uncovering the genetic secrets behind lizards’ ability to regrow their own tails, and believe the knowledge could be used to stimulate regrowth in humans.
The researchers describe the process by which lizards regenerate tails lost in attacks from predators as a genetic “recipe,” until now a secret one, because, as in good cooking, exactly the right mixture and amounts of genetic ingredients are thought to be required.
According to lead author Kenro Kusumi, professor in Arizona State’s School of Life Sciences and associate dean in the College of Liberal Arts and Sciences, “Lizards basically share the same toolbox of genes as humans… we discovered that they turn on at least 326 genes in specific regions of the regenerating tail, including genes involved in embryonic development, response to hormonal signals and wound healing.”
Other creatures like salamanders, frog tadpoles and fish are also able to regenerate their tails, and all of these animals “turn on” genes in what is known as the ‘Wnt pathway,’ a process required to control stem cells in organs such as the brain, blood vessels and hair follicles. But whereas regrowth in those animals occurs mostly at the tip, lizards show a pattern of tissue growth that is distributed throughout the tail.
Elizabeth Hutchins, a graduate student in ASU’s molecular and cellular biology program and co-author of the paper, says that the process, which takes around 60 days, involves “a complex regenerating structure with cells growing into tissues at a number of sites along the tail.”
Lizards are the animals most closely related to humans that are able to regenerate in such a remarkable way. It is hoped the findings could contribute significantly to treating spinal cord injuries, repairing birth defects and treating diseases like arthritis.
Kusumi tells us that now they have “unlocked the mystery of what genes are needed to regrow the lizard tail,” by applying “the genetic recipe for regeneration that is found in lizards, and then harnessing those same genes in human cells, it may be possible to regrow new cartilage, muscle or even spinal cord in the future.”
An interdisciplinary team of scientists utilized “next-generation molecular and computer analysis tools” to observe the genes that are at work during tail regeneration. The lucky lizard they used to unlock the genetic secrets of regenerating body parts was the green anole lizard – also known as Carolina anole, American anole or Anolis carolinensis.
“We have identified one type of cell that is important for tissue regeneration,” said Jeanne Wilson-Rawls, co-author and associate professor with ASU’s School of Life Sciences. “Just like in mice and humans, lizards have satellite cells that can grow and develop into skeletal muscle and other tissues.” It is the connection with human anatomical processes, and the potential medical benefits, that make an already fascinating discovery even more exciting.
The researchers have published their findings in the journal PLOS ONE.
—–

Majority Of Smartphone Owners Download No Apps During The Average Month

redOrbit Staff & Wire Reports – Your Universe Online
If you haven’t downloaded a smartphone app in a while, you’re not alone: new research from internet analytics firm comScore indicates nearly two-thirds of mobile device users download no apps during the average month.
According to Quartz reporter Dan Frommer, 65.5 percent of smartphone owners said they downloaded zero apps during the average month, while 8.4 percent said that they downloaded one app per month and 8.9 percent answered they downloaded two.
Only 2.4 percent of those who responded to the survey said that they downloaded eight apps per month or more – surprising, considering that comScore also reported that apps now represent 52 percent of all time spent with digital media in the US (up from 40 percent in 2013). In addition, Frommer noted that Apple reported July was the best month ever for their App Store, both in terms of revenue and the number of people downloading apps.
Despite the download statistics, the comScore survey also found that over half of all smartphone owners use apps on their devices on a daily basis, explained Kevin C. Tofel of GigaOm. The reason for that is that nearly half of all download activity is done by just seven percent of all American mobile device owners, the study discovered.
“These are the app addicts, if you will. The rest of us just use what we already have on our phones, for the most part,” Tofel said. “At first, this scenario sounds odd. But when you think about it, it makes some sense. This is 2014, not 2008 when the mobile app economy was just starting to take off. When you start with a blank slate, there’s plenty of room for new apps to fill up your smartphone. These days, with more than a billion apps available from various mobile app stores, chances are that most of the new apps you see will give you a feeling of ‘been there, done that.’”
The comScore study seems to support that theory, as it claims that 42 percent of all time spent by smartphone users occurs with the individual’s single most used app, whatever that might be. New apps and games are always being released, Frommer explained, but those that truly connect with users are becoming exceedingly rare, and many of the apps topping the most-used list are from venerable companies such as Google, Pandora and Facebook.
Speaking of those most popular apps, Mark Zuckerberg’s social media network topped comScore’s list of the most frequently used apps, drawing 115.4 million unique American users in June, according to Quartz. Google apps YouTube (83.4 million), Google Play (72.2 million) and Google Search (70.2 million) take the next three spots, followed by Pandora (69 million) and two more Google apps, Google Maps (64.5 million) and Gmail (60.3 million).
“Another likely reason is that it’s still not easy enough to find and download new apps,” he added. “Apple’s App Store has received (deserved) criticism for its lousy discovery features, with users relying heavily on top-25 lists, a bad search engine, and few editorial features. This mostly helps the rich get richer, and makes it harder for clever new apps to get noticed. Apple is right to be proud of its app program – it has been one of the most important inventions in the history of software, and Google has done a decent job copying. But there is certainly room for improvement.”
—–

NASA Releases New Map Of Neptune’s Moon Triton To Mark 25th Anniversary Of Voyager 2’s Visit

redOrbit Staff & Wire Reports – Your Universe Online
In honor of the 25th anniversary of Voyager 2’s first up-close look at Neptune and Triton, NASA has “restored” footage obtained by the probe and used it to construct the highest-quality global color map of the moon to date.
According to the US space agency, the map was produced by Dr. Paul Schenk, a scientist at the Houston-based Lunar and Planetary Institute (LPI; http://www.lpi.usra.edu/), and has also been used to develop a movie recreating the historic Voyager 2 encounter that took place on August 25, 1989.
[ Watch the Video: Sailing Past Neptune’s Moon Triton ]
The new map of Triton has a resolution of 1,970 feet (600 meters) per pixel, NASA explained, and the colors have been enhanced to emphasize contrast while still attempting to closely approximate the natural colors of the moon. Originally, the map was produced using orange, green and blue filter images, they noted.
Unfortunately, at the time of Voyager’s arrival at Triton, most of the moon’s northern hemisphere was in darkness and could not be observed by the spacecraft. Due to the speed of the visit and the slow rotation of the moon, only one of Triton’s hemispheres could be seen at close distance, while the rest of the surface was either dark or blurry.
Among the improvements to the map are more accurate feature locations, the enhancement of feature details, the removal of some of the camera’s blurring effects, and improved color processing.
In addition to the 25th anniversary of the spacecraft’s arrival at Neptune, NASA officials said the new map of Triton was inspired in part by New Horizons’ upcoming encounter with Pluto, which is expected to occur on July 14, 2015. NASA said the flyby “will not be a replay of Voyager but more of a sequel and a reboot, with a new and more technologically advanced spacecraft and, more importantly, a new cast of characters.”
Even though Triton is a moon and Pluto is a dwarf planet, NASA said that Neptune’s moon “serves as a preview of sorts” for next year’s encounter with Pluto and its five known moons, which will be observed for the first time next summer. The agency added that Triton “may not be a perfect preview of coming attractions, but it serves as a prequel to the cosmic blockbuster expected when New Horizons arrives at Pluto next year.”
While both bodies were formed in the outer solar system, Triton was captured by Neptune, and as a result its thermal history has been radically different from Pluto’s, the space agency explained. The moon’s interior was likely melted by tidal heating, which resulted in the volcanoes, fractures and other features observed by Voyager on the otherwise bitterly cold and ice-covered surface. Pluto is expected to possess some of the same features.
“Triton is slightly larger than Pluto, has a very similar internal density and bulk composition, and has the same low-temperature volatiles frozen on its surface,” NASA explained. “The surface composition of both bodies includes carbon monoxide, carbon dioxide, methane and nitrogen ices.”
“Voyager also discovered atmospheric plumes on Triton, making it one of the known active bodies in the outer solar system, along with objects such as Jupiter’s moon Io and Saturn’s moon Enceladus,” the agency added. “Scientists will be looking at Pluto next year to see if it will join this list. They will also be looking to see how Pluto and Triton compare and contrast, and how their different histories have shaped the surfaces we see.”
August 25 also marks the second anniversary of Voyager 1 departing the heliosphere and passing into interstellar space, although recent research has called into question whether it actually passed that threshold. Last month, Voyager scientists George Gloeckler and Len Fisk devised a test which they claimed would definitively prove whether or not the probe had exited the magnetic bubble surrounding the sun and planets and reached the space between the stars.
—–

The Relationship Between Hypothyroidism and Fibromyalgia

Hypothyroidism is a condition affecting the thyroid gland. The thyroid gland is located in the front of the neck, and is best characterized as a ‘small, butterfly-shaped gland.’ It’s responsible for releasing triiodothyronine and thyroxine hormones, which help the body regulate its growth, metabolism and cellular development.

When the thyroid gland is impaired, it fails to produce enough of the aforementioned hormones needed to regulate the body. Once that happens, people start developing symptoms from hypothyroidism.

People with fibromyalgia may also have hypothyroidism, especially if they experience symptoms of that condition; in fact, many people with fibromyalgia around the world experience hypothyroidism symptoms.

Since the symptoms of the two conditions are similar, it’s difficult for people to know whether they’re suffering from one or the other. Sometimes, people see symptoms of both because they’re suffering from both conditions at the same time.

What is the ‘true’ relationship between hypothyroidism and fibromyalgia? In this article, we’re going to examine the relationship between both conditions and learn why they manifest in similar ways.

The Relationship Between Hypothyroidism And Fibromyalgia

In order to understand the relationship between hypothyroidism and fibromyalgia, it’s important to understand more about each condition first. Let’s take a brief look at both in this section.

Hypothyroidism And Its Connection With Fibromyalgia

As mentioned, hypothyroidism is a disorder that affects the thyroid gland in the neck. It affects as many as 20 million people in the United States alone. Both men and women can develop hypothyroidism, but women over age 40 are at greater risk than men of the same age; at least 17 percent of women have been found to have hypothyroidism. Other factors that increase the risk of developing hypothyroidism include genetics, diabetes, thyroid surgery and radiation therapy.

Fibromyalgia And Hypothyroidism

Hypothyroidism is characterized by an underactive thyroid gland. The condition can be caused by an autoimmune disease, thyroid surgery, pituitary disorders and exposure to radiation therapy. While its causes are simple enough to list, the symptoms of hypothyroidism closely overlap with those of fibromyalgia. The two conditions share symptoms such as depression, muscle stiffness, widespread pain in the muscles and problems with sleep.

As many as 15 percent of people with hypothyroidism suffer from fibromyalgia, while many people with fibromyalgia have low-functioning thyroid glands. Due to this, some researchers now suspect that fibromyalgia and hypothyroidism develop from the same causes. Both conditions have been linked to chemical exposure, infection, illnesses and disorders of the nervous system.

Some people with fibromyalgia never get a proper diagnosis of hypothyroidism, even when they actually have the condition. As mentioned, the symptoms of one disorder (often fibromyalgia) typically overlap with the symptoms of the other (hypothyroidism), making the diagnosis of either condition difficult.

Behind The Relationship Of Fibromyalgia And Hypothyroidism

Fibromyalgia and hypothyroidism have a lot to do with each other for one reason: the function of the thyroid gland. Back in 1997, Dr. John Lowe, a pioneer of fibromyalgia research, published his first report concerning the relationship between fibromyalgia and the thyroid’s intrinsic functions.

The report, which was published in the Clinical Bulletin of Myofascial Therapy, presented Dr. Lowe’s finding that ‘clear relationships between fibromyalgia and thyroid function existed in some form.’ Due to that, he believed that ‘some form of hypometabolism, including the dysfunction of the thyroid, provided some explanation behind the disorder.’

To obtain those results, Dr. Lowe conducted thyroid tests on a group of patients diagnosed with fibromyalgia. Patients with elevated levels of thyroid-stimulating hormone (TSH) weren’t tested beyond the first test. Patients with normal TSH levels, however, were given a thyrotropin-releasing hormone (TRH) stimulation test.

The results from the tests found that:

  • At least 10.5 percent of fibromyalgia patients had some form of primary hypothyroidism.
  • At least 36.8 percent of those tested were in a ‘normal thyroid state.’
  • At least 52.6 percent of those given the TRH test had results similar to those seen in central hypothyroidism.
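
As a quick, purely illustrative check of how those figures fit together (the grouping below is our reading of the report, not something stated in it):

    # Illustrative arithmetic only: how the reported percentages combine.
    primary = 10.5    # % with primary hypothyroidism
    normal = 36.8     # % in a 'normal thyroid state'
    central = 52.6    # % with TRH results resembling central hypothyroidism

    print(primary + central)           # 63.1, close to the 'as much as 64 percent' cited below
    print(primary + normal + central)  # ~99.9, i.e. the full patient sample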

The complete results of the tests revealed that as much as 64 percent of patients with fibromyalgia had some type of deficiency in their thyroid hormones. In other words, the tests showed that the proportion of patients with both fibromyalgia and hypothyroidism exceeds the rate of hypothyroidism found in the general population. On an interesting note, Dr. Lowe also provided suggestions for potentially treating fibromyalgia with thyroid hormone.

He suggested treating fibromyalgia using the T3 thyroid hormone. By comparison, other doctors prescribe ‘thyroid replacement’ medications like levothyroxine sodium and Thyrosine Complex, in addition to other fibromyalgia treatment alternatives like grape seed extract, magnesium and vitamin C.

Fibromyalgia And Hypothyroidism: What’s The Link?

Having seen evidence that there is some relationship between fibromyalgia and hypothyroidism, specifically through impairment of the thyroid, it’s time to see how it all links together.

People with fibromyalgia know two things about how thyroid hormones affect them:

  • They play a role in how people sleep.
  • They play a role in determining the body’s sensitivity to other hormones.

Fibromyalgia is thought to originate from an increase in pain sensitivity caused by neurochemical imbalances in the brain and spine, and the link to hypothyroidism has some basis in that. Since thyroid hormones set the body’s sensitivity thresholds to other hormones, it’s believed that alterations in the brain’s and spine’s neurochemicals may disturb how those thresholds are set.

Interestingly enough, thyroid hormones also affect serotonin levels in the brain, and serotonin is one of the brain chemicals affected by fibromyalgia. Thyroid hormones likewise play a role in maintaining people’s regular sleep cycles, so researchers speculate that this sleep regulation might play a part in fibromyalgia development, too.

Medical researchers and doctors are still working to uncover the true links between fibromyalgia and hypothyroidism. With the information we do have, however, it is reasonable to conclude that the two conditions are related after all.

Researchers Discover Biological Mechanism Behind The Hummingbird’s Sweet Tooth

redOrbit Staff & Wire Reports – Your Universe Online
Unlike many types of birds, hummingbirds not only have the ability to detect sweetness, they also have a craving for sugary substances, and now the authors of a new Science study have discovered the biological reason why these tiny flying creatures differ from their avian counterparts.
According to experts from the Harvard University Department of Organismic and Evolutionary Biology, the University of Tokyo Graduate School of Agricultural and Life Sciences, the Dublin City University Bioinformatics and Molecular Evolution Group, the University of California Department of Animal Science and the Harvard Medical School Department of Cell Biology, the reason resides in a taste receptor typically used to detect savory-type flavors.
As Live Science News Editor Megan Gannon explains, in many creatures the sweet-taste receptor that responds to sugars in plant-based carbohydrates is comprised of the T1R2 and T1R3 proteins, while the taste receptor that detects savory or umami flavors in meat and mushrooms is made up of the T1R1 and T1R3 proteins.
In 2004, researchers sequenced the genome of the chicken and its DNA revealed it was missing the T1R2 gene. The discovery suggested that chickens could not taste sweet flavors, and since it was the first bird to have its genome fully sequenced, Harvard University’s Maude Baldwin and Yasuka Toda of the University of Tokyo wondered if the same was true of other birds. They went on to sequence the genomes of 10 other birds, none of which had T1R2.
“Alligators do, and they’re some of the closest living relatives of birds. So at some point, as birds evolved from small dinosaurs, they lost their sweet tooth,” explained National Geographic’s Ed Yong. “What about hummingbirds? Hummingbirds feed largely on nectar, the sweet liquid that flowers produce… the sweeter the better; they’ll actually reject flowers whose nectar isn’t sweet enough. They lack the T1R2 gene, but they can clearly taste sugar.”
Baldwin, Toda and their colleagues have now discovered that the savory sensors on a hummingbird’s tongue also double as sugar sensors. The researchers figured that if hummingbirds had lost T1R2, it was possible that the T1R1-T1R3 savory sensor could detect sugars instead, Yong said. They tested their hypothesis on the Anna’s hummingbird, a medium-sized hummingbird native to western North America, and discovered they were right.
Anna’s hummingbirds, as it turns out, can detect simple sugars like glucose and fructose, as well as sweeteners like sorbitol and erythritol. The birds can still detect amino acids too, according to Yong; the receptor simply gained a new ability sometime during the last 42 to 72 million years, thanks to dramatic changes to the T1R1 and T1R3 proteins.
In order to figure out just how drastic those changes were, and which ones mattered, the study authors spliced together the chicken and hummingbird versions of the proteins in different combinations, then tested their responsiveness to sugars. They identified 19 amino acids in one region of T1R3 that had changed during hummingbird evolution, warping the shape of the proteins and allowing them to stick to sugars, Yong noted.
In a statement, Baldwin said that this marks “the first time that this umami receptor has ever been shown to respond to carbohydrates.” She went on to tell National Geographic that she and her colleagues are investigating to see whether the 19 mutations happened all at once or in batches, and whether or not they are all directly involved in detecting sugars. “This has the possibility of answering bigger questions in evolutionary biology,” she added.

Fluorine Used In Toothpaste Formed In Dying Stars

John Hopton for redOrbit.com – Your Universe Online

The fluorine in our toothpaste was formed billions of years ago in dying stars of the same type as our sun, scientists from Lund University in Sweden believe.

The formation happens toward the end of a star’s life, when it has expanded into what is known as a red giant. Various chemical elements are created under the extreme pressure and temperature conditions inside a star. During the red giant stage, the star sheds its outer layers and forms a planetary nebula. The fluorine that is thrown out mixes with the gas surrounding the stars, known as the interstellar medium. New stars and planets form from the interstellar medium, which is enriched further each time a new generation of stars dies. The planets in our own solar system, as well as the sun itself, were formed out of material from these dead stars.

The theory that fluorine, whose origin has until now been something of a mystery, is created in this way is one of three that have been put forward in the past. According to Nils Ryde, a reader in astronomy at Lund, the new research indicates that “fluorine in our toothpaste originates from the sun’s dead ancestors.” Working with doctoral student Henrik Jönsson and colleagues from Ireland and the US, Ryde examined stars formed at various points in the history of the universe to see whether the amount of fluorine they contain fits this theory.

Remarkably, the amounts of different elements a star contains can be calculated by analyzing the light it emits, because light of a certain wavelength is indicative of a certain element. In this case the important signal lies in the middle of the infrared spectrum, which the researchers studied using a telescope in Hawaii and a new kind of instrument sensitive to that part of the spectrum.

“Constructing instruments that can measure infrared light with high resolution is very complicated and they have only recently become available,” Ryde explained.
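The principle can be sketched with a toy example. The snippet below matches an observed absorption line against a small table of known line wavelengths; the wavelengths, tolerance and line labels are invented for illustration and are not the actual lines measured in the Lund study.

# Toy illustration of spectral-line identification: each element absorbs
# and emits light at characteristic wavelengths, so a line observed at a
# given wavelength points to a specific element. All values here are
# invented for illustration, not the lines used in the Lund study.

# Hypothetical line list, wavelengths in micrometers (infrared).
KNOWN_LINES = {
    2.335: "HF (fluorine)",
    2.293: "CO (carbon)",
    1.589: "Fe (iron)",
}

def identify(observed_um, tolerance=0.002):
    # Return the species whose known line lies within the tolerance.
    for wavelength, species in KNOWN_LINES.items():
        if abs(observed_um - wavelength) <= tolerance:
            return species
    return "unidentified"

print(identify(2.3348))  # -> HF (fluorine)

In real spectra it is the depth of such a line, not just its position, that lets astronomers estimate how much of an element a star contains.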

The team now intends to look at other types of stars as well. One question is whether fluorine could have been produced in the early universe, before the first red giants had formed. The researchers will also study environments in the universe that differ from the sun’s surroundings, such as the region close to the supermassive black hole at the center of the Milky Way, where the cycle of stars dying and new ones being born runs much faster than it does around the sun.

“By looking at the level of fluorine in the stars there, we can say whether the processes that form it are different,” Ryde said.

Treating Pain By Blocking The ‘Chili-Pepper Receptor’

American Chemical Society

As anyone who has bitten into a chili pepper knows, its burning spiciness — though irresistible to some — is intolerable to others. Scientists exploring the chili pepper’s effect are using their findings to develop a new drug candidate for many kinds of pain, which can be caused by inflammation or other problems. They reported their progress on the compound, which is being tested in clinical trials, in ACS’ Journal of Medicinal Chemistry.

Laykea Tafesse and colleagues explain that decades ago, scientists pegged a compound called capsaicin as the active ingredient in chili peppers that causes the fiery pain. In the 1990s, researchers determined the genetic sequence of the protein “receptor” that capsaicin attaches to in the body. The receptor is a protein on cells that acts as a gate, allowing only certain substances into a cell. The advance launched a hunt for compounds that can block this gate, cut off the pain signal and potentially treat pain that current drugs are no match for. Some of the molecules resulting from this search have been tested in people, but they cause unwanted side effects or would not work well as oral medication. Tafesse’s team wanted to explore variations on this theme to find a better drug candidate.

They produced more than two dozen similar compounds, each with its own unique molecular tweak. They tested them in the lab and in animals for the traits they were looking for, such as potency, safety, the ability to dissolve in water and whether they can be taken orally. One prospect showed the most promise, and it has advanced into clinical trials.

Reference: “Structure-Activity Relationship Studies and Discovery of a Potent Transient Receptor Potential Vanilloid (TRPV1) Antagonist 4-[3-Chloro-5-[(1S)-1,2-dihydroxyethyl]-2-pyridyl]-N-[5-(trifluoromethyl)-2-pyridyl]-3,6-dihydro-2H-pyridine-1-carboxamide (V116517) as a Clinical Candidate for Pain Management” – Journal of Medicinal Chemistry

Emergency Department Nurses Aren’t Like The Rest Of Us

Kobi Print, University of Sydney
Emergency department nurses aren’t like the rest of us – they are more extroverted, agreeable and open – attributes that make them successful in the demanding, fast-paced and often stressful environment of an emergency department, according to a new study by the University of Sydney.
“Emergency nurses are a special breed,” says Belinda Kennedy from Sydney Nursing School, a 15-year critical care veteran who led the study.
“Despite numerous studies about personalities of nurses in general, there has been little research done on the personalities of nurses in clinical specialty areas.
“My years working as a critical care nurse have made me aware of the difficulty in retaining emergency nurses, and I have observed apparent differences in personality among these specialty groups. This prompted me to undertake this research, which is the first on this topic in more than 20 years.
“We found that emergency nurses demonstrated significantly higher levels of openness to experience, agreeableness, and extroversion personality domains compared to the normal population.
“Emergency departments (ED) are a highly stressful environment – busy, noisy, and with high patient turnover. The ED is the entry point for approximately 40 per cent of all hospital admissions, and the frequency and type of presentations are unpredictable.
“Emergency nurses must have the capacity to care for the full spectrum of physical, psychological and social health problems within their community.
“They must also be able to develop a rapport with individuals from all age groups and socioeconomic and cultural backgrounds, in time-critical situations and often at a time when these individuals are at their most vulnerable.
“For these reasons, ED staff experience high levels of stress and emotional exhaustion, so it’s understandable that it takes a certain personality type to function in this working environment.
“Our research findings have potential implications for workforce recruitment and retention in emergency nursing.
“With ever-increasing demands on emergency services it is necessary to consider how to enhance the recruitment and retention of emergency nurses in public hospitals. Assessment of personality and knowledge of its influence on specialty selection may assist in improving this.
“The retention of emergency nurses not only has potential economic advantages, but also a likely positive impact on patient care and outcomes, as well as improved morale among the nursing workforce,” she said.
The research team consisted of Associate Professor Kate Curtis and Associate Professor Donna Waters from Sydney Nursing School, University of Sydney.
The research was published in the Australasian Emergency Nursing Journal.

Does Holistic Medicine Beat Fibromyalgia Pain?

About Holistic Treatments

The medical community generally advises patients not to abandon conventional medical regimens in favor of unconventional methods of treatment. That said, an unconventional, holistic treatment may accompany conventional treatment as an additional measure of symptom relief.

There are many treatment options available in the holistic health community, including acupuncture, acupressure, massage, meditation and chiropractic treatments. This article examines how these treatments are applied, with particular attention to acupuncture, which predates Western medicine and has long played a preeminent role in Eastern practice.

Acupuncture, chiropractic, massage, meditation and other treatments have long been used alongside Western medical treatments to mitigate the symptoms of fibromyalgia. Demand for such alternative therapies is steadily increasing, as they aim to treat body, mind and soul together. Of course, before embarking on an alternative treatment, one should consult a doctor to ensure that conventional and alternative treatments do not interfere with each other. A doctor can help a patient devise an effective blend of holistic and conventional medicine.

Acupuncture

Acupuncture entails inserting fine needles through the epidermis and into the dermis layer of the skin at designated points, with the aim of stimulating specific energy pathways. The practitioner generally manipulates the needles very gently, which induces a pain-suppressing effect: needling at specific points triggers a release of endorphins throughout the body, and in traditional terms is believed to remove energy blockages.

Studies suggest that acupuncture can modify pain pathways in the nervous system. Specifically, it has been reported to alter the brain’s chemistry by changing which neurotransmitters are released, especially those involved in pain perception and management. Even a single session of acupuncture has been reported to significantly reduce pain in some patients. Of course, these observed effects will require further study in the context of pain management and fibromyalgia.


Chiropractic

Chiropractic treatments are frequently sought out by fibromyalgia patients, many of whom rely on this alternative form of medicine to address aches and pain in various regions of the body. These treatments may also effectively increase lumbar and cervical range of motion.

Chiropractic treatments are predicated on the notion that the body engages in its own concerted healing efforts. By aligning the spine properly and improving the flow of nerve impulses throughout the body, the chiropractor makes it easier for the body to correct problems.

The objective of the practice is to improve the alignment and mobility of the spine. Chiropractors accomplish this by making small manual adjustments, modifying stretching, pressure and movement as needed to restore balance to the skeletal structure. This is believed to improve an individual’s overall health.

Deep Tissue Massage

Many individuals who suffer from fibromyalgia opt for deep tissue massage to increase circulation to the muscles, and to alleviate tension, stiffness and pain. Deep tissue massage was designed to establish balance and proper alignment of connective tissue and muscles.

This form of massage is similar to other types of massage therapy, but integrates deeper, more intensive strokes designed to dissolve tension and pain.  Typically, those who suffer from chronic muscle tension have a preexisting condition or injury. When adhesions develop in the muscle and connective tissue, those adhesions inhibit circulation. Adhesions are commonly referred to as “knots” in the muscles, because that is what they often feel like.

When a professional conducts a deep tissue massage, he or she breaks down these adhesions in order to promote pain reduction and a better flow of movement. For these massages to be effective, it is critical that the client relax the muscles to permit the therapist to dig deeply into the muscle tissue.

These massages are sometimes uncomfortable, and individuals occasionally report some degree of pain. Of course, if the pain becomes unbearable, it is important to notify the massage therapist. While stiffness and soreness typically do ensue after such a massage, they normally disappear within one to two days.

In one poll of some 34,000 chronic pain sufferers, respondents rated deep tissue massage as effective as acupuncture, medication, exercise and diet for reducing pain. Patients suffering from persistent fibromyalgia pain likewise ranked it among the best pain reduction methods.

Acupressure

While acupressure shares some similarities with acupuncture, it differs in many respects. Essentially, rather than inserting needles, it involves applying finger pressure to the same points in order to inhibit the transmission of pain signals. It has been found to provide temporary pain relief.

Biofeedback

This therapy is used extensively to treat pain. Generally speaking, the body harbors a number of involuntary responses and mechanisms over which the mind usually lacks control; these functions, such as heart rate and blood pressure, are governed by the autonomic nervous system. With the aid of biofeedback, however, individuals with fibromyalgia can gain more control over these involuntary bodily mechanisms.

When a person undergoes biofeedback, they are monitored with electrodes attached to the surface of the skin; in some cases a finger-worn device is used as well. The sensors transmit readings that are displayed as an image, a flash of light, or a sound, and these cues correspond to the patient’s heart rate, blood pressure and other functions. When the body experiences stress or anxiety, these physiological readings change.

By watching the biofeedback cues while performing a relaxation exercise, one can learn exactly what it feels like when the exercise is performed correctly, and can therefore master it much more quickly. For example, a biofeedback session may teach a patient to regulate their brain waves in order to induce relaxation and relieve a headache. As individuals complete more biofeedback sessions, they gain increasing control over their bodies.
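To make the loop concrete, here is a minimal sketch that simulates one cycle of biofeedback in code: a stand-in “sensor” is sampled once per second and converted into a simple cue the patient can watch. The simulated signal, the 70 bpm target and the session length are all invented for illustration; real systems read electrodes or finger sensors and personalize their thresholds.

# Minimal sketch of a biofeedback loop: sample a physiological signal,
# translate it into an immediate sensory cue, and let the user adjust.
# The "sensor" below is simulated; the target and timing are illustrative.
import random
import time

RELAXED_BPM = 70  # illustrative target, not a clinical threshold

def read_heart_rate():
    # Stand-in for a real sensor: returns a plausible beats-per-minute.
    return random.gauss(75, 8)

def feedback_session(seconds=10):
    for _ in range(seconds):
        bpm = read_heart_rate()
        cue = "calm" if bpm <= RELAXED_BPM else "elevated"  # the cue shown
        print(f"{bpm:5.1f} bpm -> {cue}")
        time.sleep(1)  # one update per second

feedback_session()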

US Officials Rule That Monkey Selfies Cannot Be Copyrighted

redOrbit Staff & Wire Reports – Your Universe Online
Bad news for the British photographer who allowed a curious monkey to take a selfie using his equipment – or, indeed, for any of us looking to let wild animals borrow our smartphones or cameras to take self-portraits: US officials have issued new rules declaring that such images cannot be copyrighted.
In a recent update to its regulations, the US Copyright Office has ruled that it “will not register works produced by nature, animals, or plants,” and that it “cannot register a work purportedly created by divine or supernatural beings.” It even goes on to list a series of examples, the first of which specifically states that “a photograph taken by a monkey” could not be copyrighted by the agency.
The ruling effectively closes the book on an intellectual property dispute between David Slater, the UK photographer whose camera was recently used by a curious crested black macaque on the Indonesian island of Sulawesi to snap a picture of herself, and Wikimedia Commons, which argued that the image should be considered public domain rather than Slater’s property.
“Slater insists he owns the rights to the image of a monkey staring curiously into the camera as he snapped a selfie, and the photographer claims he’s suffered considerable expense to secure the photos,” explained Abby Phillip of the Washington Post. “The fact that they have been, essentially, distributed for free on the Internet through the Wikimedia Commons Web site has cost him untold amounts of money.”
Slater told Phillip on Wednesday that the free distribution of the photo was “ruining my business,” and that if it were “a normal photograph and I had claimed I had taken it, I would potentially be a lot richer than I am.” When the photograph first appeared on Wikimedia Commons in 2012, he requested that it be taken down, and the website complied. However, it was later re-added by another user, and this time it remained online.
Slater had reportedly been planning to take Wikimedia to court for refusing to comply with his request to remove the photos, claiming it had cost him royalties, said Engadget’s Mariella Moon. The Copyright Office’s ruling that animal-created content cannot be registered as the intellectual property of a human is a clear victory for Wikipedia, added Mashable’s Jason Abbruzzese.
Hundreds of photos were taken by the critically endangered macaque, which had swiped Slater’s camera when the wildlife photographer visited Indonesia in 2011, said Los Angeles Times reporter Lauren Raab. The species is well known for its pinkish rear end and punkish tuft of head hair; this particular animal ended up pushing the shutter button repeatedly.
In claiming ownership of the photos, Slater had argued that the monkey should be viewed as his assistant, Raab added. Wikimedia Foundation’s Chief Communications Officer Katherine Maher countered that the picture’s copyright belonged to the person who took the picture – which, as mentioned earlier, was not a person at all in this instance.
“Monkeys don’t own copyrights. What we found is that US copyright law says that works that originate from a non-human source can’t claim copyright,” Maher told Phillip, explaining that Slater would have had to make “substantial changes” to the photos and not just cosmetic ones in order to have a valid copyright claim on the finished product.
“So what we found was that if the photographer doesn’t have copyright and the monkey doesn’t have copyright then there’s no one to bestow the copyright upon,” she added. The Copyright Office apparently agrees, placing the macaque selfies in the same category as a mural painted by an elephant.

Novel Method Can Hack Popular Apps With Up To A 92 Percent Success Rate

redOrbit Staff & Wire Reports – Your Universe Online

A new security vulnerability in mobile operating systems could allow hackers to gain access to a user’s personal information with a surprising success rate, researchers from the University of California, Riverside and the University of Michigan claim in a new report.

According to Sean Nealon of Phys.org, study authors Qi Alfred Chen, Zhiyun Qian and Z. Morley Mao reported that they believe the flaw exists in the Android, Windows and iOS platforms, though they demonstrated it only on an Android device.

The method was found to be successful between 82 percent and 92 percent of the time on six of seven popular apps tested, including those of Gmail, Chase Bank, WebMD and H&R Block, according to reports. Only the Amazon app, with a success rate of 48 percent, proved to be somewhat more difficult to crack.

The authors, who will present their findings Friday at the USENIX Security Symposium, explained that the technique is known as a user-interface (UI) state inference attack, said CBS News reporter Michael Roppolo. This type of attack allows hackers to run malicious software in the background without the user being alerted to the activity.

“The researchers say it could allow a hacker to steal a user’s password and social security number, peek at a photo of a check on a banking app, or swipe credit card numbers and other sensitive data,” Roppolo explained. “In Android, an entry point that the researchers call a ‘shared-memory side channel’ could allow hackers to detect what’s going on in a user’s app,” and iOS device and Windows phone users could also be affected by the issue.

“A user would be vulnerable if they downloaded an app that appeared to be benign but in reality was malware; hackers could then exploit this vulnerability to observe whatever personal data the user entered,” the CBS reporter added. “One example might be when a user opens a banking app and logs in. The hacker would be notified and could begin an ‘activity hijacking attack,’ allowing them to get a user’s personal information.”
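To make the “shared-memory side channel” more concrete, here is a minimal sketch of the general idea, written against the Linux-style /proc interface that Android exposed at the time of the study. The target pid, polling interval and threshold are illustrative assumptions rather than values from the paper, and newer Android releases restrict the /proc access this relies on.

# Minimal sketch of a shared-memory side channel: a background process
# polls another app's shared-memory counter and flags sudden jumps,
# which the researchers found correlate with UI activity transitions.
# Illustrative only; pid, threshold and polling rate are assumptions.
import time

def read_shared_pages(pid):
    # /proc/<pid>/statm fields: size resident shared text lib data dt
    with open(f"/proc/{pid}/statm") as f:
        return int(f.read().split()[2])

def watch(pid, threshold=100, interval=0.03):
    prev = read_shared_pages(pid)
    while True:
        time.sleep(interval)
        cur = read_shared_pages(pid)
        if abs(cur - prev) > threshold:
            # A large jump in shared pages hints that the target app just
            # redrew its UI -- the moment an activity hijack would be timed.
            print(f"possible UI transition: {prev} -> {cur} pages")
        prev = cur

watch(1234)  # hypothetical target process id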

While the authors report they have yet to test their method on other mobile platforms, they believe that it will work because the operating systems share one of the features exploited during the Android system test. The researchers began investigating the method because they believed there were security concerns associated with so many different apps being created by the same developers and running on largely the same shared infrastructure.

“The assumption has always been that these apps can’t interfere with each other easily. We show that assumption is not correct and one app can in fact significantly impact another and result in harmful consequences for the user,” Qian said. “By design, Android allows apps to be preempted or hijacked, but the thing is you have to do it at the right time so the user doesn’t notice. We do that and that’s what makes our attack unique.”

Unique, and effective, according to the University of California, Riverside. Qian, an assistant professor in the university’s Computer Science and Engineering department, and his colleagues reported a 92 percent success rate in attacking both Gmail and H&R Block using this new method.

Their technique was 86 percent successful against Newegg, 85 percent successful against WebMD, and 83 percent successful against Chase Bank and Hotels.com. Only the Amazon app had a success rate under 80 percent, and the authors explained that it was “more difficult to attack because its app allows one activity to transition to almost any other activity, increasing the difficulty of guessing which activity it is currently in.”


The Best Argument EVER In Support Of Sunscreen: VIDEO

Rayshell Clapper for redOrbit.com – Your Universe Online

Most people know that the sun damages skin, and that this damage can lead to freckles, wrinkles, sun spots, or even cancer. Of all the skin cancers, melanoma is the most deadly. Yet many people do not wear sunscreen on a daily basis: some have simply never made it part of their morning routine, whether out of laziness or forgetfulness, while others actively choose not to wear it out of skepticism or a lack of information. An NBC News article shows how one photographer and artist by the name of Thomas Leveritt has created a visual argument that should convince everyone to wear sunscreen.

Leveritt set up one camera that showed people as they see themselves in a mirror, alongside an ultraviolet camera that showed the same view through a UV lens. The difference was astonishing: where skin looked flawless and clear in the first camera, in the other the person could see all the sun damage, from major to minor. Someone with no freckles in normal light might have hundreds under the UV camera. It really was enlightening.

Then he asked each person to put on sunscreen, and wham! They each saw an incredible difference. A Slate article about the video explains that “The camera detects UV from the Sun that’s reflected off people’s skin; the point of sunscreen is to absorb that UV so it doesn’t even reach the skin. Since no UV is reflected from sunscreen, it appears black in the video, even though in visible light it looks white. It looks like people are smearing crude oil on their faces.”
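The optics Slate describes reduce to simple arithmetic: the camera sees only the UV that bounces back, so a strong UV absorber shows up dark. A toy calculation, with reflectance numbers invented purely for illustration, makes the point.

# Toy arithmetic behind the video's effect: a UV camera records only the
# UV reflected back toward it, so a material that absorbs nearly all UV
# looks black. The reflectance values below are invented for illustration.

incident_uv = 100.0  # arbitrary units of incoming ultraviolet light

bare_skin_reflectance = 0.30  # bare skin bounces back a fair share
sunscreen_reflectance = 0.01  # sunscreen absorbs almost everything

print(incident_uv * bare_skin_reflectance)  # 30.0 -> reads as bright skin
print(incident_uv * sunscreen_reflectance)  # 1.0  -> reads as near-black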

By watching this video, viewers can see just how dramatic the protection provided by sunscreen is. It is obvious that the skin is shielded by even just a little smudge.

Slate writer Phil Plait identifies the three varieties of UV: UVA, UVB and UVC. UVC is the worst, carrying enough energy to kill cells. The good news is that the sun does not emit much UVC, and what it does emit is absorbed by the Earth’s atmosphere and the ozone layer. UVB is the next most dangerous, and most of it is also absorbed by the ozone layer. Too much UVB exposure, though, damages skin, causes sunburn, destroys vitamin A and causes cancer. In small amounts, UVB is useful for producing vitamin D in the skin, but too much can be deadly. UVA is the least dangerous but still not completely safe: serious long-term exposure can lead to cancer, and it also destroys collagen in the skin, causing signs of aging such as wrinkles, freckles and sun spots. All three can lead to melanoma, which kills roughly 9,000 people a year.

What is of particular interest here is that Leveritt simply wanted to shoot the video for the art of it as well as to showcase the cool technology. As he told Fast Company reporter Jeff Beer, he primarily used three Canon cameras: a modified Canon 7D, a regular Canon 7D, and a Powershot.

“I figured out how to get enough light into a DSLR to let it record at sub 380nm wavelengths–it’s a pretty interesting problem–and then had this almost magical camera lying around, which takes great pictures. (I) found that people reacted so strongly and interestingly to seeing themselves in UV, I decided I should capture that. I messed around with various beamsplitter/teleprompter/coaxial rigs to get the right shot, before getting something which looked about right,” Leveritt told Beer.

Although informing others of the impressive benefits of sunscreen was not Leveritt’s sole purpose, his project does just that. Through the visuals, we see the argument for sunscreen made plain: when a face goes from freckled and speckled to completely black and obviously protected, even the staunchest skeptic cannot deny that sunscreen blocks dangerous UVA and UVB rays. It is also quite moving to see the reactions of the participants as they see their faces first in the regular camera and then in the UV one.
