Home-Cooked Meals Not Necessarily Better For Your Health

Chuck Bednar for redOrbit.com – Your Universe Online
While eating home-cooked meals has long been viewed as a healthier alternative to eating out, especially at fast food restaurants, the same cannot necessarily be said about preparing those dinners, researchers from Rush University Medical Center in Chicago have discovered.
In fact, while experts from the Johns Hopkins Bloomberg School of Public Health recently reported that those who eat home-cooked meals typically consume fewer calories, fewer carbohydrates, less sugar and less fat than those who cook less or not at all, the new study indicates that spending a lot of time cooking can also be unhealthy.
According to Roger Dobson of The Telegraph, lead investigator Dr. Brad Appelhans and his colleagues found a link between the amount of time people spend preparing dinners and an increased risk of high blood pressure, high cholesterol and other problems associated with heart disease.
On the other hand, those who spent less time cooking meals at home reduced their risk of developing these conditions by more than one-third, according to the results of research involving more than 2,700 women. The reason may be that people who cook for themselves tend to eat larger portions than those buying ready-made meals or snack foods, or that easier-to-prepare foods may have become healthier, Dobson said.
“While the reasons underlying this association are still unclear, we think these findings indicate the need to revise our public health messaging, including the need to emphasize healthy cooking methods and to consider the potential benefits of healthy convenience meals,” said Dr. Appelhans. Little research has been done on the impact of home cooking on a person’s wellbeing, he said, suggesting that doctors take another look at the practice.
He and his co-authors examined 14 years of data provided by 2,755 women ranging in age from 40 to 60. They measured five markers of metabolic syndrome, a condition in which a person has at least three of the five factors that put them at greater risk for heart disease, and published their findings in the latest edition of the journal Preventive Medicine.
The women who spent the longest time cooking and cleaning up after meals were more likely to be at risk for developing those symptoms, which include obesity, elevated blood fat levels, abnormal cholesterol, hypertension and high blood glucose, said Fiona Macrae of the Daily Mail. The risk of developing metabolic syndrome increased over time, especially in those who spent the most time preparing food, while it fell in those who reduced the time they spent cooking.
“In the past three or four decades, the proportion of our food that we prepare at home has decreased, and the prevalence of obesity has increased. Noting this, public health experts frequently promote home cooking as a way to curb the obesity epidemic and reduce risk factors for heart disease and diabetes,” Dr. Appelhans told Macrae.
“However, our research… [found] that greater time spent preparing food each week is actually linked to increasing odds of having risk factors for heart disease and diabetes,” he added.
This is not the first study to find that preparing home-cooked meals can be unhealthy, as researchers from North Carolina State University's sociology and anthropology departments reported in September that making dinners for the whole family to enjoy can be a source of stress and conflict for many mothers.
—–
Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

The Brains Of Obese Children May Be Wired Differently

Chuck Bednar for redOrbit.com – Your Universe Online

Obese children may crave more sugar than other children, according to new research that demonstrates that the reward centers of their brains are stimulated more intensely when exposed to sweet foods.

According to Sarah Knapton, Science Editor for The Telegraph, the study authors are not sure if the change takes place over time, or if some kids are simply born with a biological impulse to crave sugary foods. Their research appears in the International Journal of Obesity.

“The study is a wake-up call that prevention has to start very early because some children may be born with a hypersensitivity to food rewards or they may be able to learn a relationship between food and feeling better faster than other children,” first author Kerri Boutelle of the University of California, San Diego (UCSD) told Knapton.

“The take-home message is that obese children, compared to healthy weight children, have enhanced responses in their brain to sugar,” added Boutelle, a professor in the Department of Psychiatry and founder of the university’s Center for Healthy Eating and Activity Research (CHEAR). “That we can detect these brain differences in children as young as eight years old is the most remarkable and clinically significant part of the study.”

The UCSD researchers scanned the brains of 23 children between the ages of eight and 12 as they tasted one-fifth of a teaspoon of water mixed with table sugar (sucrose). They were told to swirl the sugar-water mixture in their mouths with their eyes closed, mentally focusing on the taste, the university said in a statement.

Ten of the youngsters participating were obese, while the remaining 13 had healthy weights, as classified by their body mass index (BMI) readings. They were all pre-screened for potential confounding factors, including psychiatric disorders such as ADHD, and all of them liked the taste of sucrose.

“The brain images showed that obese children had heightened activity in the insular cortex and amygdala, regions of the brain involved in perception, emotion, awareness, taste, motivation and reward,” the university said. “Notably, the obese children did not show any heightened neuronal activity in a third area of the brain – the striatum – that is also part of the response-reward circuitry and whose activity has… been associated with obesity in adults.”

Typically, this part of the brain does not fully develop until adolescence, the researchers said. One of the study's more intriguing implications, they added, is that the brain scans may be documenting, for the first time, the early development of the food reward circuitry in pre-adolescents.

Sumit Passary of Tech Times noted that the study does not show a direct relationship between overeating and sugar hypersensitivity, but it does support the belief that obese kids likely have an increased psychological reward response to food. That means they could be attracted to sugary foods throughout their lifetimes.

The Centers for Disease Control and Prevention (CDC) reports that more than one-third of American adults are obese, Passary said. In addition, research suggests that children who are obese are between 80 percent and 90 percent likely to become obese adults, added Hannah Osborne of International Business Times.


Could A Substance Produced By Bees Be The Cure To Hair Loss?

Chuck Bednar for redOrbit.com – Your Universe Online
A substance produced by bees to coat and seal their hives could help balding people reverse their hair loss, researchers from Hokkaido University in Japan report in a recent edition of the Journal of Agricultural and Food Chemistry.
Propolis, a natural compound produced by honeybees that Medical Daily’s Susan Scutti explains is used in much the same way people use caulk in their own homes, was found to encourage hair growth in mice. While more research is required, the study authors believe it could also work on humans.
Lead author Ken Kobayashi and his colleagues said that the resin-like substance, composed primarily of tree sap, is used by bees to seal small gaps in their hives. Not only does propolis serve as a physical barrier, but it also contains active compounds that can help prevent fungi and bacteria from infiltrating the bees’ home, the American Chemical Society (ACS) said in a statement.
“People from ancient times had noticed propolis’ special properties and used it to treat tumors, inflammation and wounds. More recently, research has shown that the substance promotes the growth of certain cells involved in hair growth though no one had yet tested whether that in turn would result in new locks,” the ACS said.
Kobayashi and his colleagues set out to do just that, and when they tested the substance on mice that had been waxed or shaved, they found that those that had been treated with it regrew their fur faster than those that had not. Furthermore, they found that applying the topical solution “stimulated migration of hair matrix keratinocytes into the hair shaft” – in other words, it increased the number of cells involved in the process of growing hair.
“Although they tried the material on mice that could grow fur rather than balding mice, the researchers note that hair loss conditions often result from abnormal inflammation,” the Chemical Society noted. “Propolis contains anti-inflammatory compounds, so they expect it could help treat balding conditions. They add that further testing is needed to see if the beehive material affects human hair follicles.”
Scutti said that propolis contains biologically active compounds called flavonoids, and that people began using it for a variety of medical reasons, including tumor treatment and fighting infection in open wounds, over 2,000 years ago. These days, the substance is available at pharmacies and health food markets in several forms, including creams, ointments, powders, tablets, capsules and extracts.
“In most cases, it is recommended you apply propolis directly to the area being treated (with the exception of the eyes), and generally it is considered nonirritating to the skin,” she said. “Oral uses, though less common, also exist and safety studies suggest it is nontoxic. However, it is not uncommon for people to have an allergic reaction.”
“Tested by scientists, propolis is active against bacteria, viruses, and protozoans, among other microorganisms,” Scutti added. “Most often, people use it to treat wounds and also to speed the healing of canker sores and outbreaks of genital herpes. Taken in the form of a mouthwash, propolis helps soothe following oral surgery. And a study… found the extract worked about as well as the drug against the parasite giardiasis.”

Modest Gains Made In Latest UN Climate Summit Agreement

Chuck Bednar for redOrbit.com – Your Universe Online
UN climate negotiators had to go into overtime, ending their extended session with an agreement that for the first time requires action from developing countries as well as industrialized nations.
The agreement was announced early Sunday morning following 11 days of negotiations involving representatives from over 200 countries, according to USA Today and Washington Post reports. The talks, which were described as “often rancorous,” went more than 24 hours past their deadline as the delegates worked towards a comprehensive climate treaty.
Joby Warrick of the Washington Post described the gains as “modest,” and said that many of the requirements were “repeatedly watered down” in order to convince over 190 countries to sign off on the agreement. However, it also was said to bring the world one step closer to a global treaty, which will be finalized during next year’s meeting in Paris.

Secretary-General Ban Ki-moon (center right) speaks with Ollanta Humala (center left), President of Peru, at the opening of the Lima Climate Action High-level Dialogue. This year's Climate Change Conference in Lima is the twentieth session of the UN Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP20). Credit: UN Photo/Mark Garten


“Under the agreement, each country will have to submit early next year a detailed plan for addressing carbon emissions,” Warrick said. “But a series of compromises Friday and Saturday stripped away specific requirements for cutting pollution and left no provisions for outside verification to ensure that the plans are carried out. The softened language was denounced by environmental groups as unacceptably weak.”
According to BBC News, the final draft of the document restored a promise to poorer nations that a “loss and damage” program would be established to help them cope with the financial ramifications of rising global temperatures. However, instead of saying that countries “shall” demonstrate how they intend to meet their designated emissions targets, it said that those nations “may” do so.
In addition, the document calls for an “ambitious agreement” in 2015 that reflects “differentiated responsibilities and respective capabilities” of each nation; for developed countries to provide financial support to “vulnerable” developing nations; for national pledges to be submitted by the first quarter of 2015 by those states “ready to do so”; and for countries to set targets that go beyond their “current undertaking,” the British news agency noted.
Sam Smith, chief of climate policy for the environmental group WWF, told BBC that the agreement “went from weak to weaker to weakest and it’s very weak indeed,” while Friends of the Earth International chairperson Jagoda Munic said concerns over the lack of “a fair and ambitious outcome” had been proven “tragically accurate.”
However, according to USA Today, Manuel Pulgar-Vidal, Peru’s environment minister and the conference president, and Christiana Figueres, the UN’s top climate official, proclaimed the talks a success. French foreign minister Laurent Fabius added that “a lot of good work was done in Lima, but it left at least a little work to be done in Paris.”
Among the issues still to be resolved are demands from poorer nations for more financial assistance from wealthier countries to help them reduce emissions, according to Emily Gosden of The Telegraph. Rich countries had previously promised a vague goal of mobilizing $100 billion in funds for poor nations starting in 2020, but Gosden said the program has not been well defined, which led poorer nations to accuse their wealthier counterparts of not pulling their weight.
“The biggest thing that is really, really unresolved is the money,” Michael Jacobs, visiting professor at the LSE’s Grantham climate research institute, told The Telegraph. “The developed countries have got to find some way of showing they can provide the $100bn they promised, and at least some financial contribution post-2020. This is hard: this is a core demand of the developing countries but the hardest things for the developed countries, both because they don’t feel they have got so much money but also because it’s hard to budget ahead.”

Like It Or Not, Facebook Has No Plans To Add A Dislike Button

Chuck Bednar for redOrbit.com – Your Universe Online
Mark Zuckerberg has officially given the thumbs-down to adding a dislike button to Facebook, effectively putting an end to speculation that users of the social network would be able to express their dislike for their friends’ posts.
According to Yahoo News tech columnist Alyssa Bereznak, the rumors that Facebook was considering adding the long sought-after dislike button came during a video Q&A session involving Zuckerberg. During the town hall-style event, a University of California-Davis law student asked if the company was considering adding the feature.
Zuckerberg said that they were “thinking about it,” before clarifying that he really was not in favor of the idea. “Some people have asked for a dislike button because they want to be able to say, ‘That thing isn’t good,’ ” he said. “And that’s not something that we think is good for the world. So we’re not going to build that.”
“I don’t think there needs to be a voting mechanism on Facebook about whether posts are good or bad. I don’t think that’s socially very valuable or good for the community to help people share the important moments in their lives,” he added. However, he did state that he was open to a system for expressing emotions other than positivity, including surprise, sadness or empathy, said Deborah Hastings of the New York Daily News.
“One of the things we’ve had some dialogue about internally…is, what’s the right way for people to quickly be able to express a broader range of emotions?” Zuckerberg explained. “I think giving people the power to do that in more ways with more emotions would be powerful. But we need to figure out the right way to do it, so it ends up being a force for good and not a force for bad. We don’t have anything that’s coming soon.”
One of the potential drawbacks to adding an actual dislike button, analysts told the AFP news agency, is the possibility that members could use it on marketing messages, meaning that the feature could cause Facebook to alienate its advertisers. Zuckerberg also noted that people were free to express themselves by commenting.
Richi Jennings of Computerworld’s IT Blogwatch column said that Zuckerberg’s comments indicate that he wanted “to brush off user suggestions and complaints,” and that it was becoming “increasingly clear that Facebook is just about making money out of you (not that we’re surprised)… With ‘friends’ like Facebook, who the heck needs enemies?”
Facebook did roll out a new feature last week that allows users to go to a business’s profile page and perform a variety of tasks – including using an app, visiting its external website, booking reservations or signing up for a subscription service – by clicking a single button, according to the Wall Street Journal.
The social media service is calling them “call-to-action” buttons, and says that they will be located to the left of a page’s Like button. The buttons provide “new ways for people to interact with businesses,” Facebook added, and include book now, contact us, use app, play game, shop now, sign up and watch video. They will begin appearing on pages in the US over the next few weeks, according to the Journal, and will launch globally next year.

NASA Interns Go Viral Thanks To Orion-Inspired Parody Song

Chuck Bednar for redOrbit.com – Your Universe Online
A group of NASA interns are apparently reaching for the stars, and perhaps even looking to launch themselves into a new career, thanks to their new parody of Meghan Trainor’s hit song “All About That Bass.”

The song, fittingly titled “All About That Space,” was made by the Pathways interns at the Johnson Space Center as part of an effort to publicize the Orion spacecraft’s first test flight – at least, that’s what they claim, according to Chris Mills of Gizmodo.
“But let’s be honest, this is probably just the result of budget cuts and bored interns. Either way, it’s everything a NASA pop parody video should be – bad puns, gratuitous use of rocket backdrops, and surprisingly OK vocals,” Mills said, quipping that most interns “get coffee and do endless photocopying,” but not those working at NASA!
Space.com Staff Writer Miriam Kramer said that the lyrics to the song were written by NASA intern Sarah Schlieder. Those lyrics include: “If you got boosters boosters, just raise ’em up / ‘Cause every spacecraft needs propulsion / From the bottom to the top / Hey, they’re working so hard, don’t you love these NASA guys? / They will take us so far the first time that Orion flies.”
CNET pointed out that this is far from the only performance to riff on Trainor’s hit song, including the Thanksgiving-themed “All About That Baste” and a Star Wars-inspired cover called “All About That Base.” While the song itself “might not be as catchy as the original,” the website said it was “easy to look past that because the… interns are just so goofily endearing.”
As of Saturday, the NASA video was already nearing the one-million views milestone – just one day after being uploaded to YouTube, according to RT.com. While that may be a far cry from the more than 375 million views Trainor’s video has received, it has nonetheless made the interns somewhat of a viral video sensation.
Orion, which successfully completed its Exploration Flight Test-1 (EFT-1) earlier this month, is expected to eventually carry US astronauts on a manned mission to Mars. It is the first American spacecraft built for astronauts destined for deep space since the Apollo program, according to NASA, and is designed to travel farther than any previous manned mission.
“With lessons learned from Orion’s flight test, NASA can improve the spacecraft’s design while building the first Space Launch System rocket, a heavy booster with enough power to send the next Orion around the moon for Exploration Mission-1,” the US space agency explained. “Following that, astronauts are gearing up to fly Orion on the second SLS rocket on a mission that will return humans to deep space for the first time in more than 40 years.”
Shortly after the successful completion of the test flight, NASA confirmed that it was already working on its next capsule, which would be used to fly Exploration Mission-1 (EM-1). While the next Orion mission will also be unmanned, it will involve a far longer flight, traveling around the moon while carrying an operational service module to produce power.

Many Workers Sacrifice Sleep For Work Hours And Long Commutes

Provided by the American Academy of Sleep Medicine

A new study shows that paid work time is the primary waking activity exchanged for sleep and suggests that chronic sleep loss potentially could be prevented by strategies that make work start times more flexible.

Results show that work is the dominant activity exchanged for less sleep across practically all sociodemographic categories. Compared to normal sleepers, short sleepers who reported sleeping 6 hours or less worked 1.55 more hours on weekdays and 1.86 more hours on weekends or holidays, and they started working earlier in the morning and stopped working later at night. The highest odds of being a short sleeper were found among adults working multiple jobs, who were 61 percent more likely than others to report sleeping 6 hours or less on weekdays. Respondents who were unemployed, retired or absent from the labor force also obtained significantly more sleep and were less likely to be short sleepers.

“The evidence that time spent working was the most prominent sleep thief was overwhelming,” said lead author Dr. Mathias Basner, assistant professor of sleep and chronobiology in psychiatry at the University of Pennsylvania Perelman School of Medicine in Philadelphia.

Short sleepers also traveled more, started traveling earlier in the morning, and stopped later in the evening than normal sleepers. The travel pattern, with peaks at 7 a.m. and 5 p.m., strongly suggests that the majority of travel time is associated with commuting.

According to Basner, the results point to several possible solutions for workers’ lack of sleep.

“Potential intervention strategies to decrease the prevalence of chronic sleep loss in the population include greater flexibility in morning work and class start times, reducing the prevalence of multiple jobs, and shortening morning and evening commute times,” he said.

Results show that with every hour that work or educational training started later in the morning, sleep time increased by approximately 20 minutes. Respondents slept an average of only 6 hours when starting work before or at 6 a.m. and 7.29 hours when starting work between 9 a.m. and 10 a.m. Self-employed respondents with more flexible work times also obtained significantly more sleep than private sector employees and were 17 percent less likely to be a short sleeper.
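As a rough consistency check on the figures above, the roughly 20 extra minutes of sleep per hour of later start time lines up with the two reported averages (6 hours at a pre-6 a.m. start vs. 7.29 hours at a 9-10 a.m. start). This sketch is illustrative only: the start-time midpoints of 6:00 a.m. and 9:30 a.m. are our own reading of the reported brackets, not figures from the paper.

```python
# Illustrative sanity check: does ~20 min of extra sleep per hour of later
# work start roughly explain the gap between the two reported averages?
# The 6:00 a.m. and 9:30 a.m. midpoints are assumptions, not study figures.

minutes_per_hour_later = 20
start_gap_hours = 9.5 - 6.0  # 3.5 hours between the assumed midpoints

predicted_extra_sleep_h = minutes_per_hour_later * start_gap_hours / 60
observed_extra_sleep_h = 7.29 - 6.00  # reported averages from the study

print(round(predicted_extra_sleep_h, 2))  # predicted gap in hours
print(round(observed_extra_sleep_h, 2))   # observed gap in hours
```

The predicted gap of about 1.17 hours is close to the observed 1.29 hours, so the per-hour figure and the bracket averages tell a consistent story.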

Study results are published in the December issue of the journal Sleep.

“Getting at least seven hours of nightly sleep is essential to be at your mental, emotional and physical best for whatever you will pour yourself into, either at work or at home,” said American Academy of Sleep Medicine President Dr. Timothy Morgenthaler, who was not involved in the study.

Basner and colleagues Andrea M. Spaeth, PhD, and David F. Dinges, PhD, analyzed responses from 124,517 Americans 15 years and older who completed the American Time Use Survey (ATUS) between 2003 and 2011. The computer-assisted telephone interview, which is sponsored by the U.S. Bureau of Labor Statistics and conducted annually by the U.S. Census Bureau, asks participants how they spent their time between 4 a.m. on the previous day and 4 a.m. on the interview day. Responses were combined into 40 distinct activities that captured 99.1 percent of the 24-hour day. Responses combined into the “sleeping” category included napping, waking up and dreaming.

According to the Centers for Disease Control and Prevention (CDC), 30 percent of employed U.S. adults typically sleep 6 hours or less in a 24-hour period, which represents approximately 40.6 million workers. The American Academy of Sleep Medicine recommends that adults get about 7 to 9 hours of nightly sleep for optimal health, productivity and daytime alertness.

The study was supported by funding from the National Institute of Nursing Research (NINR) of the National Institutes of Health (NIH) and by the National Space Biomedical Research Institute (NSBRI) through NASA. The work was performed at the Division of Sleep and Chronobiology, Department of Psychiatry, at the University of Pennsylvania in Philadelphia.



Twitter May Help Shine New Light On Mental Illness Trends

Provided by Phil Sneiderman, Johns Hopkins University

Johns Hopkins computer scientists, who have already used Twitter posts to track flu cases, say their techniques also show promise as a tool to gather important information about some common mental illnesses.

By reviewing tweets from users who publicly mentioned their diagnosis and by looking for language cues linked to certain disorders, the researchers say, they’ve been able to quickly and inexpensively collect new data on post-traumatic stress disorder, depression, bipolar disorder and seasonal affective disorder.

In research presented at three scientific conferences this year, the scholars described how their techniques of mining public data have yielded fresh numbers on cases of these illnesses, allowing for analyses that were previously difficult or expensive to obtain. The scholars emphasized, however, that their findings did not disclose the names of people who publicly tweeted about their disorders.

The researchers said their goal is to share with treatment providers and public health officials some timely additional information about the prevalence of certain mental illnesses. Using computer technology to sift through tweets, they said, can help address the slow pace and high costs associated with collecting mental health data through surveys and other traditional methods.

“With many physical illnesses, including the flu, there are lots of quantifiable facts and figures that can be used to study things like how often and where the disease is occurring, which people are most vulnerable and what treatments are most successful,” said Glen Coppersmith, a Johns Hopkins senior research scientist who has played a key role in the project. “But it’s much tougher and more time-consuming to collect this kind of data about mental illnesses because the underlying causes are so complex and because there is a long-standing stigma that makes even talking about the subject all but taboo.”

Coppersmith, who is affiliated with the university’s Center for Language and Speech Processing and its Department of Applied Mathematics and Statistics, added, “We’re not aiming to replace the long-standing survey methods of tracking mental illness trends. We believe our new techniques could complement that process. We’re trying to show that analyzing tweets could uncover similar results, but could do so more quickly and at a much lower cost.”

Earlier this year, Coppersmith, with Johns Hopkins colleagues Mark Dredze and Craig Harman, presented two papers describing their methods at two professional conferences in Baltimore and Ann Arbor, Mich.

Also, in August, at the Joint Statistical Meetings in Boston, Coppersmith and colleagues from the U.S. Naval Surface Warfare Center spoke about their promising early results in an ongoing study that uses Twitter posts to study mental illness in particular geographic areas.

Their analyses indicated that PTSD was more prevalent at military installations that frequently deployed during the recent Iraq and Afghanistan conflicts, and that signs of depression were more evident in locations with higher unemployment rates. While neither of these findings is surprising, they demonstrate that analyzing Twitter posts could become a useful yardstick in quickly measuring mental health trends, particularly after dramatic events such as natural disasters and military conflicts.

The computer algorithms used to discover mental health data from tweets look for words and language patterns associated with these ailments, including word cues linked to anxiety and insomnia, and phrases such as “I just don’t want to get out of bed.” The formula for zeroing in on mental health cases was based on a review of more than 8 billion tweets. The technique is built upon earlier Johns Hopkins work led by Dredze that successfully used Twitter posts to track outbreaks of the flu.
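The keyword-and-phrase screening described above can be illustrated with a minimal sketch. To be clear, this is not the researchers' actual model, which was a statistical system trained on billions of tweets; the cue lists, function names and threshold below are invented purely for demonstration.

```python
# Minimal illustration of cue-based screening of tweets for language
# patterns associated with mental health concerns. The word and phrase
# lists here are hypothetical examples, not the real research lexicon.

WORD_CUES = {"anxious", "insomnia", "hopeless", "exhausted"}
PHRASE_CUES = [
    "don't want to get out of bed",
    "can't sleep",
]

def cue_score(tweet: str) -> int:
    """Count how many cues appear in a tweet (case-insensitive)."""
    text = tweet.lower()
    # Whole-word matches for single-word cues, substring matches for phrases.
    score = sum(1 for w in WORD_CUES if w in text.split())
    score += sum(1 for p in PHRASE_CUES if p in text)
    return score

def flag_tweets(tweets, threshold=1):
    """Return tweets whose cue count meets the threshold."""
    return [t for t in tweets if cue_score(t) >= threshold]

sample = [
    "I just don't want to get out of bed today",
    "Great game last night!",
    "So anxious about tomorrow, insomnia again",
]
print(flag_tweets(sample))
```

A production system would replace the hand-written lists with learned weights over millions of features, but the basic idea (scoring each post against linguistic cues and aggregating the flags by time and place) is the same.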

“Using Twitter to get a fix on mental health cases could be very helpful to health practitioners and governmental officials who need to decide where counseling and other care is needed most,” said Dredze, an assistant research professor in the Whiting School of Engineering’s Department of Computer Science. “It could point to places where many veterans may be experiencing PTSD, for example, or to towns where people have been traumatized by a shooting spree or widespread tornado damage.”

The idea has begun to generate some positive attention. After a recent conference presentation on the team’s social media research, an editorial in the Boston Globe stated, “Twitter is, apparently, the quiet therapist to whom we reveal much more than we realize. As such, it could be a valuable public-health tool. More work needs to be done in considering how such information could be used while still preserving privacy, but it’s an inquiry worth pursuing.”

A recent Newsweek article on new high-tech methods of tracking mental health trends also quoted Coppersmith as saying, “Mental health is something that has touched every single one of us at some point in our lives, whether it’s a personal experience or watching family or friends go through it. I don’t know how you can’t attack this problem. This is the one everyone should care about.”

In November, the Johns Hopkins team hosted a 48-hour hack-a-thon, bringing together scholars from six universities and a handful of industry partners to foster collaboration and innovation on these topics. The hack-a-thon was organized to prepare for the second Computational Linguistics and Clinical Psychology Workshop, which will be held in Denver next spring in conjunction with the 2015 Conference of the North American Chapter of the Association for Computational Linguistics. Coppersmith and Dredze will serve on the program and organization committee, headed by a former Johns Hopkins postdoctoral researcher, Margaret Mitchell, now at Microsoft Research.



Facebook Drops Bing As Its Search Results Provider

Chuck Bednar for redOrbit.com – Your Universe Online
Facebook has replaced Microsoft’s Bing search engine with a new tool developed in house at the popular social media website, according to various media reports.
Reuters reporter Alexei Oreskovic said on Friday that the decision to stop including results from Bing was confirmed by a company spokesperson, and came as Facebook has “revamped its own search offerings, introducing a tool on Monday that allows users to quickly find past comments and other information posted by their friends.”
“The decision may reflect the increasing importance that Facebook sees in Web search technology, a market dominated by rival Google,” Oreskovic added. “Searches on Facebook have long been geared toward helping users connect with friends and to find other information that exists within the walls of the… social networking service. But for years, Facebook’s search results also included links to standalone websites that were provided by Bing.”
A company spokesperson told Reuters that the social network was not currently showing web search results because it was focusing on internal content posted by the over 1.3 billion members of the website. The spokesperson said that Facebook continued to work with the Redmond, Washington-based tech company in other areas.
In comments made to VentureBeat’s Daniel Terdiman, a Microsoft spokesperson said that Facebook has “recently changed its search experience to focus on helping people tap into information that’s been shared with them on Facebook versus a broader set of web results.” The individual confirmed to Terdiman that the move had actually happened “a while ago.”
Microsoft and Facebook first started working together in 2007, when the Windows developer invested $240 million for a 1.6 percent stake in what was then a fledgling startup, according to Nicole Arce of Tech Times. However, Facebook founder and CEO Mark Zuckerberg told analysts during a July conference call that search was one of the more important areas of growth for the company, suggesting that Internet surfers could eventually turn to Facebook before Google or Bing when it came to looking something up online.
Facebook had been using Bing as its in-house search engine, allowing users to look for more information about their friends while keeping longtime rival Google from becoming directly involved in the social network, Mashable writer Samantha Murphy Kelly said. The announcement comes as the website has taken other steps to improve both its search and trending news sections, including the addition of a Twitter-style live feed that includes user mentions, she added.
Earlier this week, Facebook vice president of search Tom Stocky announced that the company had revamped Graph Search so that users could directly look for older posts written by themselves or other users, Arce explained. However, she also noted “the fact that pretty much nobody immediately noticed that Facebook search dropped Bing almost a week ago indicates that not too many people care about what search on Facebook offers them.”
Bing is the second-ranked Web search provider in the US, with a nearly 20 percent market share, according to Oreskovic. As part of its original deal with Microsoft, Facebook started carrying Microsoft-provided banner ads in international markets in October 2007, but ceased doing so three years later in order to assume greater control of its advertising business. During that same time, Facebook expanded its use of Bing search results, he added.

FBI Warns US Businesses Of Potential Iranian Hacker Activity

Chuck Bednar for redOrbit.com – Your Universe Online
A confidential US Federal Bureau of Investigation (FBI) report is warning American businesses about a sophisticated Iranian hacking operation targeting airlines, defense contractors, energy firms and educational institutions.
According to Jim Finkle of Reuters, the activity was first discovered earlier this month by cyber security firm Cylance, which said that those involved were targeting critical infrastructure organizations all over the globe. The company said that it had already discovered over 50 victims in 16 countries, including the US, in what is known as “Operation Cleaver.”
The new FBI “flash” report, which Reuters said it had seen on Friday, provides technical details about malware and various other techniques used in the attacks, as well as advice on how to best deal with the hackers. It also asked organizations to contact the bureau if they believed that they were victims of the Iranian hacking campaign.
While the FBI did not offer additional details, Cylance chief executive Stuart McClure told reporters that the agency’s warning suggested that Operation Cleaver might have been a larger-scale endeavor than its own research had previously indicated. He added that it “underscores Iran’s determination and fixation on large-scale compromise of critical infrastructure.”
Finkle said that the FBI technical documents indicate that the hackers typically launch attacks from two IP addresses located in Iran, but did not specifically claim that the country’s government was behind the activity. Cylance said that it believes that the Tehran regime is behind Operation Cleaver, but officials there deny those accusations.
According to the Daily Mail, experts state that Iran has been investing heavily in its cyber capabilities since its nuclear program was hit by the Stuxnet computer virus in 2010. Dave Kennedy, CEO of TrustedSEC, said that those efforts have turned the country into “a serious threat” with “a lot of talent” in the field, and according to Menchie Mendoza of Tech Times, Iran is believed to have already been responsible for several attacks.
“In February, the group was believed to be responsible for the devastating attack on Las Vegas Sands Corp, a casino operating business,” Mendoza said. “The attack shut down thousands of servers, which had been wiped with destructive malware. The hackers later admitted that the attack was meant to punish Sheldon Adelson, Sands CEO, after he made comments about a plan to detonate a nuclear bomb in Iran.”
While no one has officially claimed responsibility for developing the Stuxnet trojan virus, it has frequently been reported that the US and Israel commissioned its development in order to attack Iran. Evidence has suggested that whoever is behind the malware program may have started work on it as early as 2005, five years before it was first deployed, and that it was likely developed by people hired by an outside organization and not by a vengeance-seeking band of hackers.
In November, the same researchers at security firm Symantec that first discovered Stuxnet reported on a new, similar type of trojan that they believe has been used to spy on governments, companies and researchers for more than six years. The origins of the program, which has been identified as Regin, are unknown, but Symantec said that nearly 100 infections involving the cyber-espionage tool had been discovered as of November 24, 2014.

Losing Weight Could Drastically Reduce Breast Cancer Death Risk

Modest weight loss could dramatically improve a breast cancer patient’s chances of survival, according to new research published earlier this month at the San Antonio Breast Cancer Symposium.
In fact, trials involving 2,400 women who were being treated for the disease found that death rates a decade later were nearly 70 percent lower among those who had the deadliest form of cancer and who had lost weight, according to Laura Donnelly, health editor for the Telegraph.
Lead investigator Dr. Rowan Chlebowski of the Harbor-UCLA Medical Center, and his colleagues divided the study participants into two groups, with half of them being placed on a low-fat diet, Donnelly said. Those who lost around five to six pounds and kept the weight off for at least five years had lower death rates over the next two decades.
The most significant results were seen in the 20 percent of women who had types of cancer not linked to hormones, added Rose Troup Buchanan of The Independent. This group includes triple negative cancers and those stemming from faulty genes, such as BRCA1, which have the fewest treatment options and the worst prognosis unless they are caught early enough.
One possible explanation for the results is that the weight loss also reduced the levels of glucose and insulin that would otherwise help feed cancer cells, Buchanan added. Previous studies have found that overweight or obese women are more likely to develop breast cancer and more resistant to treatments for the disease. Scientists believe that this resistance could be the result of reduced estrogen levels, which themselves have been linked to cancer.
“Academics say the findings are so promising they suggest that weight loss is as effective as a breakthrough treatment or chemotherapy,” said Daily Mail reporter Sophie Borland. “It is now well established that being overweight dramatically increases a woman’s risk of developing breast cancer… but this is the first study to show that purposely losing weight can greatly boost the survival odds.”
Borland said that a total of 975 women who were diagnosed with early stage breast cancer were placed on the special diets for five years, reducing their fat intake by 50 percent by avoiding butter, margarine, oil and desserts. They kept the weight off, and 10 years later, the researchers compared those women to another group that did not lose weight.
Overall, the weight-loss group was 19 percent less likely to have a recurrence, but the triple-negative group was found to be 69 percent less likely to have the cancer return within a decade, she added. On average, these women lived 1.9 years longer than those who did not lose weight (13.6 years versus 11.7 years). Previous findings from the same study showed that five years later, these women were 24 percent less likely to have cancer recur.
Professor Tony Howell, director of research at the Genesis Breast Cancer Prevention charity, told Donnelly that the results of the study were “extraordinarily important,” and that “a 69 percent reduction in deaths in a group with few alternative treatments – that’s as good as any drug. For 20 [percent] of women, this is as effective as chemotherapy.”
“Despite the promise of the results, researchers cannot be sure that they weren’t just a statistical blip. In scientific terms they are not ‘significant’,” Borland added. “But other trials involving breast cancer patients being put on diets are under way, and the results are expected within three years. If they show similar benefits, women could be routinely told to lose weight as part of their treatment.”

Unique New Worm Species Has Reversed Its Own Evolution

Chuck Bednar for redOrbit.com – Your Universe Online
An unusual, newly discovered type of deep-sea worm lives on the bones of dead animals and features males that have grown significantly larger than their predecessors, researchers from the Scripps Institution of Oceanography report in a new study.
According to San Francisco Chronicle science editor David Perlman, the authors of the new Current Biology paper detailing the discovery report that the worms have reversed their own course of evolution like no other creature before. Not only have they grown larger than their forebears, they mate in vastly different ways than their closest relatives.
[ Watch the Video: The Story Of A Bizarre Deep-Sea Bone Worm Takes An Unexpected Twist ]
The worms were initially discovered on the remains of a long-drowned seal at the bottom of Monterey Bay, some 3,000 feet below the surface, by marine biologist Greg Rouse during expeditions above the mile-deep Monterey Canyon and off the Oregon coast. The species, which has been named Osedax priapus, is the second known species of bone-consuming Osedax worm to be discovered (the first was found in 2002).
Monterey Bay Aquarium Research Institute (MBARI) evolutionary biologist Robert Vrijenhoek, who captained the research vessel used by Rouse during the new discovery, was also responsible for the previous find. At the time, he indicated that the creatures had no mouth, stomach, legs or eyes. However, the body of each of the females held hundreds of males that were so small, they resembled larvae and lived on small bits of the female’s eggs.
Rouse told Perlman that the new worms, which were named in honor of the mythological god of fertility, were an “evolutionary oddity unlike any other in the animal kingdom.” The males are now as large as the females and tens of thousands of times larger than the other species, he explained, and instead of surviving on scraps inside the opposite sex of the species, they were witnessed devouring the same rotting bones as the females.
“This worm was weird enough as it was and now it’s even weirder,” Rouse said, according to the website Sci-News.com. “This shows us that there continue to be mysteries in the sea and there is still so much more to discover, especially since we only found these creatures 12 years ago.”
The mating process of Osedax priapus was also found to be vastly different from that of its predecessors. While males of the earlier Osedax species are permanently attached to their female hosts, the new species has to seek out a mate. To compensate, the males have evolved an extremely extendable body, able to stretch up to ten times its contracted length to reach female mating partners, Dr. Rouse explained.
Luis Georg of Perfect Science noted that the new worms were found at largely the same area of Monterey Bay that the previous species had been found 12 years ago. They are members of Siboglinidae, a family of worms that are also known as bearded worms and that live in unexpected locations, such as hot and acidic hydrothermal sea vents.
However, Vrijenhoek told Perlman that this new species “is exceptional because the genes for producing full-size adult males should have deteriorated over time because they weren’t used by the dwarf males. But apparently the genes are still there. And although those microscopic dwarf males weren’t competing with the females for food, in this much larger species they do.”
“So it’s our hypothesis that here there’s a new potential for sexual conflict, and the ability of the males to stretch themselves out like rubber bands to roam for females suggests that they’ve reinvented mating,” he added. “It’s a throwback to an earlier ancestral species more than 40 million years ago. We’re continuing to collect more species to see what their genes are telling us.”

Google, CNES Joining Forces To Improve Project Loon

Chuck Bednar for redOrbit.com – Your Universe Online
France’s space agency, Centre National d’Etudes Spatiales (CNES), has announced plans to join forces with Google on the Mountain View, California-based company’s balloon-based effort to bring the Internet to rural areas and remote countries, Project Loon.
Project Loon, launched by the tech company’s semi-secret Google X division back in 2011, currently has balloons capable of remaining in the air for up to 100 days, according to Engadget reporter Mariella Moon. Now, CNES scientists will be assisting their effort to bring the Internet to underserved parts of the world by analyzing data from ongoing tests and designing next-gen balloons.
On their end, Google will support the French space agency with “Strateole-type long-duration balloon campaigns… similar to the Concordiasi project in 2011 but with a wider stratospheric coverage,” CNES said. Essentially, Google will be helping its new partners conduct long-haul balloon flights into the stratosphere.
“This project comes at just the right time as we seek ways to bring the Internet to underserved areas. It is a unique experience for CNES to work with a leading light of Silicon Valley like Google. Collaborations like this bring down barriers and spawn new cross-disciplinary projects,” added CNES President Jean-Yves Le Gall. “We are proud to be providing our expertise while benefiting in return from the assistance of such a great global company.”
Late last month, Google revealed that the balloons being used as part of Project Loon were capable of flying 10 times longer than they did in 2013, with some of them remaining in the air for more than 100 days. In addition, the company revealed that improvements to its autofill equipment allowed them to fill the balloons in less than five minutes, thus making it possible to launch as many as 20 in a single day.
Furthermore, Google revealed that Project Loon balloons had collectively flown three million kilometers (roughly 1.86 million miles) – the equivalent of circumnavigating the Earth nearly 75 times or traveling to the moon and back approximately four times. While the company attributed those successes to trial and error and learning from its mistakes, this new partnership will also allow Google to draw on the expertise of others.
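The distance comparisons above are easy to sanity-check. This small sketch uses standard round-number reference values for Earth's circumference and the average Earth-moon distance (the two constants are textbook figures, not numbers from Google):

```python
# Sanity-check the Project Loon distance comparisons reported above.
total_km = 3_000_000                 # total distance flown by Loon balloons
earth_circumference_km = 40_075      # equatorial circumference of Earth (standard value)
earth_moon_km = 384_400              # average Earth-moon distance (standard value)

laps_of_earth = total_km / earth_circumference_km      # "nearly 75 times"
moon_round_trips = total_km / (2 * earth_moon_km)      # "approximately four times"

print(f"{laps_of_earth:.1f} laps of Earth, {moon_round_trips:.1f} moon round trips")
```

Both figures come out as stated: roughly 74.9 circumnavigations and 3.9 round trips to the moon.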
Dante D’Orazio of The Verge said that the announcement “should be good news for Google, because CNES isn’t just some backwater space agency. It’s said to have one of the largest stratospheric balloon programs in the world, second only to NASA, with a team of roughly 60 scientists on board.” He added that the CNES has over five decades of experience working on high-altitude research balloons.
“While the partnership between Google and CNES could improve Project Loon’s prospects, it could help Google’s prospects too,” added CNET technology columnist Don Reisinger. “The company is under a lot of pressure in Europe, having endured antitrust scrutiny, the European Parliament promoting a Google breakup, the shutting down of Google News in Spain, and difficulties scrubbing information out of search results to comply with the right-to-be-forgotten rule.”

Google Launches New WiFi-Less Guest Mode For Chromecast

Chuck Bednar for redOrbit.com – Your Universe Online
Google Chromecast users who will be entertaining friends and family over the holidays can now allow their Android smartphone-owning guests to operate the streaming video device without first having to connect to WiFi.
According to Engadget’s Timothy J. Seppala, the Mountain View, California-based tech giant launched ultrasonic pairing (also known as Guest Mode) on Wednesday. Once a nearby Android device is paired with the Chromecast, it can be used to control what appears on the television set.
“With the new guest mode feature, anyone with an Android device can cast to your TV as long as they’re in the same room,” Google product manager Jagjit Chawla explained Thursday in a post to the company’s Chrome blog. “This update is rolling out starting today. Just make sure your Chromecast app is up to date on your Android phone or tablet.”
To set up ultrasonic pairing for a Chromecast device, users need to open the Chromecast app on their Android phone or tablet and select “Devices” from the navigation drawer, Chawla explained. Next, they need to choose their Chromecast device, then tap the “Guest Mode” setting and then turn the slider to the “on” position.
Currently, the service is limited to Android mobile devices, Seppala said. The reason, Chawla explained to him, is that iOS does not have an API that allows users to scan for a list of nearby WiFi access points. As such, Google decided to release Guest Mode on its own operating system, gather feedback and improve the service before ultimately releasing it for iPad and iPhone sometime in the foreseeable future.
CNET’s Scott Webster said that Guest Mode, which was first demonstrated this summer at Google I/O, uses a PIN to allow users to access the HDMI streaming media device. However, for the service to work, the device must run Android 4.3 or above.
Activating Guest Mode causes Chromecast to broadcast a special WiFi beacon that lets nearby smartphones and tablets know that a Cast-ready TV is available, explained Adriana Lee of the website readwrite. Those devices will sense the Chromecast once their users hit the cast button, and the link-up process will begin.
“The Chromecast will first try to link up to a guest phone using inaudible sounds to transmit a four-digit PIN code, using the ultrasound technology Google announced at its Google I/O developers conference last June,” Lee said. “If the ultrasonic pairing doesn’t work for some reason, your guest can type in that four-digit PIN.”
“Chromecast… will display the code on your TV screen in your Chromecast backdrop, and will also pop it up on your phone in the Chromecast app,” she added. “Once your guests are all paired, you’re off watching those insanely cute animal videos…  Beats digging up and typing in a long hexadecimal string of characters.”
Lee also notes that Chromecast randomly generates the 4-digit PIN code every 24 hours or whenever it reboots (whichever comes first), and cautions that it does not work with apps that stream music or video stored on a tablet or smartphone (even if they use Google Cast). She also said that some younger users may hear some buzzing due to the ultrasonic technology’s use of high-frequency sounds that are largely undetectable to most people.
“Since Google first announced ultrasonic pairing, some users have asked me if the high-frequency ultrasounds will drive their pets nuts. Google hasn’t officially addressed this concern, but insiders tell me that animals will be fine,” she added. “However, the real test of new features always comes when the public gets their hands on them and reports back in. So we’ll keep our ears perked for any canine or feline discomfort, now that Guest Mode is live.”
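Google has not published the details of how its ultrasonic pairing encodes the PIN, but the general idea of carrying digits on near-ultrasonic tones can be sketched as follows. Everything here is hypothetical: the digit-to-frequency mapping, the tone duration, and the sample rate are invented for illustration, and the decoder uses the standard Goertzel algorithm (a cheap way to measure the power of one frequency in a signal) rather than whatever Google actually ships:

```python
import math

SAMPLE_RATE = 44100
# Hypothetical mapping: each decimal digit gets its own near-ultrasonic tone.
TONE_FREQS = {d: 18000 + 150 * d for d in range(10)}

def encode_digit(digit, duration=0.1):
    """Synthesize a pure tone representing one PIN digit."""
    freq = TONE_FREQS[digit]
    n = int(SAMPLE_RATE * duration)
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

def goertzel_power(samples, freq):
    """Goertzel algorithm: power of a single frequency bin in the signal."""
    k = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def decode_digit(samples):
    """Pick the digit whose tone frequency carries the most power."""
    return max(TONE_FREQS, key=lambda d: goertzel_power(samples, TONE_FREQS[d]))

pin = [4, 7, 1, 9]  # example four-digit PIN
decoded = [decode_digit(encode_digit(d)) for d in pin]
```

In a real system the receiver would also have to cope with speaker and microphone roll-off, room noise, and timing recovery, which is presumably why Google falls back to typing the PIN when the acoustic channel fails.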

Filefish Species Uses Chemical Camouflage To Alter Its Smell

Chuck Bednar for redOrbit.com – Your Universe Online
Scientists have for the first time discovered a creature that chemically disguises itself by ingesting chemicals from its prey in order to hide from potential predators, according to new research published this week in the journal Proceedings of the Royal Society B.
The creature in question is the coral-eating harlequin filefish, and lead author Dr. Rohan Brooker of the ARC Centre of Excellence for Coral Reef Studies (Coral CoE) at James Cook University and his colleagues have found that the creature changes its odor in order to match the coral that it consumes.
“For many animals vision is less important than their sense of smell. Because predators often rely on odors to find their prey, even visually camouflaged animals may stick out like a sore thumb if they smell strongly of ‘food’,” Dr. Brooker said in a statement. “By feeding on corals, the harlequin filefish ends up smelling enough like its food that predators have a hard time distinguishing it from the surrounding coral habitat.”
The harlequin filefish, which is also known as the orange spotted filefish, actually matches the smell of the coral’s musk so closely that small crabs (which live on coral branches) were unable to distinguish it from actual coral, the researchers said. They called it a remarkable example of how closely living things can adapt to their habitats.
According to UPI reporter Brooks Hays, Dr. Brooker and his colleagues conducted several experiments, including one that demonstrated that the filefish was able to take on the smell of its preferred type of coral. In one experiment, the study authors placed cod in aquarium tanks with two separate groups of the fish.
One of the two groups was fed Acropora coral, their favorite type, while the second was offered a coral that is not normally a part of the filefish’s diet, Hays said. They found that the cod were generally less active and less predatory when in the presence of Acropora-eating fish, which suggests that their disguise was extremely effective.
“Most of the literature on camouflage focuses on visual methods, but many animals use smell more. For these animals, chemical camouflage may be far more important to stay hidden,” Dr. Brooker told Carrie Arnold of National Geographic. “I suspect that this method of hiding is probably a lot more common than any of us guessed.”
Arnold added that the chemical camouflage was so effective, it even fooled crabs that lived in coral branches. The study authors captured two species of small crab — one that lived in each type of coral used in the study — and introduced them to the respective environments. Those typically living in Acropora preferred the smell of the filefish that had eaten that type of coral, and some of them even started treating the fish as though they were coral.
“The harlequin filefish shelters among the branches of coral colonies at night, where not only does it look like a coral branch, it also smells like one, enabling it to remain undetected by nocturnal predators,” said co-author Professor Philip Munday from the Coral CoE. “However, the filefishes’ cover is blown if it shelters in a different species of coral than the one it has been eating. Then, the predators can detect its presence and track it down.”
While the ability to chemically blend in has previously been observed in some types of plant-eating invertebrates, this marks the first time this type of camouflage has been observed in higher order animals, the researchers said. “This is very exciting because it opens the possibility of a wide range of different animals also using similar mechanisms, right under our noses,” Dr. Brooker added.

House Dust Mites Get The Travel Bug

Eric Hopton for redOrbit.com – Your Universe Online
Even microbugs need a holiday from time to time – a well-earned break from all that munching and sucking. And a bunch of them might be coming to an airplane seat near you this Christmas.
A study by a team of University of Michigan researchers has found that we share our cabin space with countless hitchhiking dust mites and a host of other bugs. The results of the research have been published in the journal PLOS ONE.
“What people might not realize when they board a plane is that they can share the flight with a myriad of microscopic passengers, including house dust mites, that take advantage of humanity’s technological progress for their own benefit,” said Pavel Klimov, who is a biologist and assistant research scientist in the U of M Department of Ecology and Evolutionary Biology.
“House dust mites can easily travel on an airline passenger’s clothes, skin, food and baggage,” added Klimov. “Like humans, they use air travel to visit new places, where they establish new populations, expand their ranges and interact with other organisms through various means.”
The genetic study was conducted by Klimov and U of M visiting scholar Rubaba Hamid. The aim was to look for connections between the different house dust mite populations in the United States and South Asia.
The results confirmed that genetic mutations were shared by mites in both the U.S. and Pakistan. This indicates that the mites are indeed travelling between continents.
“What we found suggests that mite populations are indeed linked through migration across continents, though geographic differences still can be detected,” Hamid said. “Every time a mite successfully migrates to a new place, it brings its own genetic signature that can be detected in the resident population a long time after the migration event.”
The work concentrated on two “medically important” species of mite, the American and European house dust mites, both of which are found worldwide. Scientists believe that the species probably split around 81 million years ago at a time when they lived mainly in bird nests. Today, however, house dust mites have adapted to a new host – human beings – and they live happily in our homes. Beds, sofas, and carpets are full of them and cleanliness is no protection. House dust mites are around a quarter of a millimeter long and eat human skin scales.
The problem is that, though we cannot see them, house dust mites can cause serious allergic reactions, including asthma, eczema and allergic rhinitis. Over 65 million people around the world are affected each year. Often, it is not the mite itself that causes allergic reaction, but the proteins in their droppings. Mites can leave about 20 droppings every day, which can remain allergenic long after the mite itself has died.
The team studied genetic variations in the group 1 allergen gene (which encodes the most important allergy-causing protein) taken from the two species’ separate populations in the U.S. and Pakistan. This protein is used around the world in standard skin-prick tests for allergies.
In order to be fully effective, the test would need to include local genetic variants of the protein but, until now, there has been little research into these geographical variations in U.S. mites.
“We need to have a better idea about the diversity of allergenic proteins around the world, and particularly in the United States,” said Klimov.
The study discovered mutations at 14 positions along the length of the group 1 allergen gene in genetic sequences taken from American house dust mites (Dermatophagoides farinae). Of the 14 mutations, all but one were “silent,” meaning that they occur at the DNA level but do not change the amino acid structure of the protein. Mutations that do alter the protein, by contrast, are medically important, as they can change its allergenic properties.
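The silent-versus-protein-level distinction the researchers draw can be illustrated with the standard genetic code: a DNA substitution is "silent" when the mutated codon still translates to the same amino acid. The codons and substitutions below are generic textbook examples, not the actual mutations found in the mite gene:

```python
# Standard genetic code, encoded compactly: with bases ordered T, C, A, G,
# the codon at indices (i, j, k) maps to AMINO[16*i + 4*j + k].
# '*' marks a stop codon.
BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*W"
         "LLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRR"
         "VVVVAAAADDEEGGGG")
CODON_TABLE = {a + b + c: AMINO[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

def is_silent(codon, position, new_base):
    """True if substituting new_base at position (0-2) leaves the amino acid unchanged."""
    mutated = codon[:position] + new_base + codon[position + 1:]
    return CODON_TABLE[codon] == CODON_TABLE[mutated]

# CTT -> CTC: both encode leucine, so the change is silent.
# GAT -> GGT: aspartate becomes glycine, a protein-level change.
```

Because the code is degenerate (most amino acids are encoded by several codons), the majority of third-position substitutions are silent, which is consistent with 13 of the 14 observed mutations leaving the allergen protein unchanged.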
The research discovered that a previously unknown mutation occurred at the active site of the protein at position 197, according to Klimov. “This was a rare mutation, found in only a single population of house dust mite in South Asia,” he said. Klimov’s analysis indicates that this mutation may change the enzyme activity of the protein.
These studies are a first step towards explaining how mites have traveled from different parts of the world, but more research is needed if we are to understand the way allergenic properties, immune response and cross-reactivity of the protein work.
“Follow-up experiments to elucidate these issues are underway in our lab,” said Klimov.
Hamid is in the Department of Zoology at the Pir Mehr Ali Shah Arid Agriculture University in Rawalpindi, Pakistan. The other authors of the PLOS ONE paper are Muhammad Inam of the University of the Punjab in Lahore, Pakistan; Farhana Riaz Chaudhary of the Pir Mehr Ali Shah Arid Agriculture University; and Barry OConnor of the U-M Department of Ecology and Evolutionary Biology.
The research was supported by the U.S. National Science Foundation, the Higher Education Commission and the International Research Support Initiative Program in Pakistan, the Ministry of Education and Science of the Russian Federation, and the U.S. National Pediculosis Association.

Mysterious Signal May Be First Evidence Of Dark Matter

Chuck Bednar for redOrbit.com – Your Universe Online
An unusual photon emission in X-ray data originating from space could be evidence for the existence of a dark matter particle, researchers from the Swiss Federal Institute of Technology in Lausanne (EPFL) report in a new study.
The signal, discovered by scientists working in the EPFL’s Laboratory of Particle Physics and Cosmology (LPPC) and colleagues from Leiden University in the Netherlands, could be the first tangible evidence of the mysterious substance that neither emits nor absorbs light and is believed to account for unexplainable gravitational effects.
To this point, dark matter has been considered a purely hypothetical substance, but after sifting through a large quantity of X-ray data, the researchers believe they have identified the signal of a lone dark matter particle. Their findings will be published next week in the journal Physical Review Letters.
“When physicists study the dynamics of galaxies and the movement of stars, they are confronted with a mystery,” the EPFL explained in a statement Thursday. “If they only take visible matter into account, their equations simply don’t add up: the elements that can be observed are not sufficient to explain the rotation of objects and the existing gravitational forces. There is something missing.”
For this reason, experts have determined that there must be some invisible type of matter that does not interact with light, but does interact through gravitational forces. This so-called dark matter interacts with none of the forces of the standard model of physics except gravity, yet is believed to make up about 80 percent of the matter in the universe.
Two teams of scientists have recently detected the highly-anticipated signal, including one led by EPFL scientist Oleg Ruchayskiy and Leiden University professor Alexey Boyarsky. They made the discovery after analyzing X-rays emitted by the Perseus galaxy cluster and the Andromeda galaxy and collected using the ESA’s XMM-Newton telescope. After eliminating those coming from known particles and atoms, one noteworthy anomaly remained.
“All atoms emit a distinct pattern of light called a spectrum, which is how astrophysicists can determine what planets and stars are made from at great distances,” explained Sarah Knapton, science editor at The Telegraph. Yet when the researchers studied the X-ray spectrum, they found spikes where nothing should exist, a signal appearing as “a weak, atypical photon emission that could not be attributed to any known form of matter.”
Ruchayskiy told Knapton that the signal’s distribution within the galaxy corresponded precisely with what they were expecting from dark matter: “concentrated and intense in the center of objects and weaker and diffuse on the edges.” If the discovery, which resulted from a photon emitted by the decay of a hypothetical particle (possibly a sterile neutrino), is verified, it “could usher in a new era in astronomy,” he added in a statement.
“Dark matter is everywhere, though it’s very hard to catch. Everybody is looking for it and this may be the first sign,” added Boyarsky. “Confirmation of this discovery may lead to construction of new telescopes specially designed for studying the signals from dark matter particles. We will know where to look in order to trace dark structures in space and will be able to reconstruct how the universe has formed.”

Dinosaurs Possibly Killed By A One-Two Punch

Brett Smith for redOrbit.com – Your Universe Online

When it comes to theories on why the dinosaurs went extinct, massive and widespread volcanic activity has always played second fiddle to the more dramatic theory of a large asteroid slamming into Earth.

However, a new study published in the journal Science has found that both of these devastating phenomena could have taken place concurrently and collectively wiped the massive lizards off the planet.

The new study is based on an analysis of rock from the Deccan Traps – an area in west-central India that contains well-preserved samples from one of the biggest volcanic events in Earth’s history. The researchers were able to determine that the ancient rocks in the Deccan Traps were deposited by activity that started 250,000 years prior to the asteroid strike and extended for 500,000 years following the massive impact, covering around 580,000 square miles with lava.

Researchers used the ratio of uranium and lead isotopes in the rock to gauge a more accurate date for when they were formed. Over 50 samples of rocks were tested independently at MIT and Princeton University.
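Uranium-lead dating works because two uranium isotopes decay into lead at fixed, well-known rates. A minimal sketch of the underlying arithmetic, using the standard published half-lives rather than the study's actual measurements:

```python
import math

# Standard half-lives (in years) for the two U-Pb decay chains.
T_HALF_U238 = 4.468e9  # 238U -> 206Pb
T_HALF_U235 = 7.04e8   # 235U -> 207Pb

def age_from_ratio(pb_u_ratio, half_life):
    """Solve Pb/U = exp(lambda * t) - 1 for t, where lambda = ln(2) / half_life.

    pb_u_ratio is the measured ratio of daughter lead to remaining uranium.
    """
    lam = math.log(2) / half_life
    return math.log(1 + pb_u_ratio) / lam
```

A Pb/U ratio of exactly 1 corresponds to one half-life, since half of the original uranium has decayed. Measuring both chains independently, as the MIT and Princeton labs did with separate samples, lets geochronologists cross-check a date.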

The study’s dating of the Deccan Traps rocks is an improvement on past efforts that had attempted to do so, but were only able to narrow down the margin of error to around one million years. The new dates give support to the idea that both volcanic activity and an asteroid strike were behind the dinosaurs’ demise.

“Both are potentially really important,” study author Blair Schoene, a professor of geosciences at Princeton, told the Washington Post. “I don’t know if we can say the extinction would have or would not have happened without both of them.”

In the scenario hinted at by the new study, volcanic activity would have released large quantities of highly toxic chemicals into the air, placing most species on Earth under a high amount of stress. A huge asteroid strike would have simply finished off species that could have already been on their way out.

“I sort of favor the one-two punch idea,” Schoene said.

Study author Gerta Keller has long championed the idea of both causes leading to the end-Cretaceous extinction. She has been seen as somewhat of a maverick for voicing this theory, and the new study serves to validate her once-unpopular idea.

“I think this is a game-changer,” she said. “The data is so strong at this point that the momentum is entirely on my side.”

The study team also pointed out that their research resonates with what is happening to our planet right now.

“If models of volatile release are correct, we’re talking about something similar to what’s happening today: lots of carbon dioxide being emitted into the atmosphere very rapidly,” said Michael Eddy, a planetary scientist at MIT. “Ultimately what that can do is lead to ocean acidification, killing a significant portion of plankton — the base of the food chain. If you wipe them out, then you’d have catastrophic effects.”

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

MIT Engineers Trying to Make Computers Communicate More Like Humans

Provided by Larry Hardesty, Massachusetts Institute of Technology
Communication protocols for digital devices are very efficient but also very brittle: They require information to be specified in a precise order with a precise number of bits. If sender and receiver — say, a computer and a printer — are off by even a single bit relative to each other, communication between them breaks down entirely.
Humans are much more flexible. Two strangers may come to a conversation with wildly differing vocabularies and frames of reference, but they will quickly assess the extent of their mutual understanding and tailor their speech accordingly.
Madhu Sudan, an adjunct professor of electrical engineering and computer science at MIT and a principal researcher at Microsoft Research New England, wants to bring that type of flexibility to computer communication. In a series of recent papers, he and his colleagues have begun to describe theoretical limits on the degree of imprecision that communicating computers can tolerate, with very real implications for the design of communication protocols.
“Our goal is not to understand how human communication works,” Sudan says. “Most of the work is really in trying to abstract, ‘What is the kind of problem that human communication tends to solve nicely, [and] designed communication doesn’t?’ — and let’s now see if we can come up with designed communication schemes that do the same thing.”
One thing that humans do well is gauging the minimum amount of information they need to convey in order to get a point across. Depending on the circumstances, for instance, one co-worker might ask another, “Who was that guy?”; “Who was that guy in your office?”; “Who was that guy in your office this morning?”; or “Who was that guy in your office this morning with the red tie and glasses?”
Similarly, the first topic Sudan and his colleagues began investigating is compression, or the minimum number of bits that one device would need to send another in order to convey all the information in a data file.
Uneven odds
In a paper presented in 2011, at the ACM Symposium on Innovations in Computer Science (now known as Innovations in Theoretical Computer Science, or ITCS), Sudan and colleagues at Harvard University, Microsoft, and the University of Pennsylvania considered a hypothetical case in which the devices shared an almost infinite codebook that assigned a random string of symbols — a kind of serial number — to every possible message that either might send.
Of course, such a codebook is entirely implausible, but it allowed the researchers to get a statistical handle on the problem of compression. Indeed, it’s an extension of one of the concepts that longtime MIT professor Claude Shannon used to determine the maximum capacity of a communication channel in the seminal 1948 paper that created the field of information theory.
In Sudan and his colleagues’ codebook, a vast number of messages might have associated strings that begin with the same symbol. But fewer messages will have strings that share their first two symbols, fewer still strings that share their first three symbols, and so on. In any given instance of communication, the question is how many symbols of the string one device needs to send the other in order to pick out a single associated message.
The answer to that question depends on the probability that any given interpretation of a string of symbols makes sense in context. By way of analogy, if your co-worker has had only one visitor all day, asking her, “Who was that guy in your office?” probably suffices. If she’s had a string of visitors, you may need to specify time of day and tie color.
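The shared-codebook scheme described above can be sketched in a few lines: the sender transmits only the shortest prefix of a message's random string that rules out every other contextually plausible message. The messages and codebook below are purely illustrative, not drawn from the paper:

```python
import random

def make_codebook(messages, length=16, seed=0):
    # Shared random codebook: every possible message gets a
    # random string of symbols, like a serial number.
    rng = random.Random(seed)
    return {m: "".join(rng.choice("abcdefgh") for _ in range(length))
            for m in messages}

def shortest_unique_prefix(codebook, message, plausible):
    # Send the shortest prefix of the message's string that no
    # other contextually plausible message shares.
    code = codebook[message]
    others = [codebook[m] for m in plausible if m != message]
    for k in range(1, len(code) + 1):
        prefix = code[:k]
        if not any(c.startswith(prefix) for c in others):
            return prefix
    return code

def decode(codebook, prefix, plausible):
    # Receiver picks the unique plausible message matching the prefix.
    matches = [m for m in plausible if codebook[m].startswith(prefix)]
    return matches[0] if len(matches) == 1 else None
```

The key property is that a smaller set of plausible messages never requires a longer prefix, which mirrors the co-worker example: one visitor all day means a short question suffices.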
Existing compression schemes do, in fact, exploit statistical regularities in data. But Sudan and his colleagues considered the case in which sender and receiver assign different probabilities to different interpretations. They were able to show that, so long as protocol designers can make reasonable assumptions about the ranges within which the probabilities might fall, good compression is still possible.
For instance, Sudan says, consider a telescope in deep-space orbit. The telescope’s designers might assume that 90 percent of what it sees will be blackness, and they can use that assumption to compress the image data it sends back to Earth. With existing protocols, anyone attempting to interpret the telescope’s transmissions would need to know the precise figure — 90 percent — that the compression scheme uses. But Sudan and his colleagues showed that the protocol could be designed to accommodate a range of assumptions — from, say, 85 percent to 95 percent — that might be just as reasonable as 90 percent.
Buggy codebook
In a paper being presented at the next ITCS, in January, Sudan and colleagues at Columbia University, Carnegie Mellon University, and Microsoft add even more uncertainty to their compression model. In the new paper, not only do sender and receiver have somewhat different probability estimates, but they also have slightly different codebooks. Again, the researchers were able to devise a protocol that would still provide good compression.
They also generalized their model to new contexts. For instance, Sudan says, in the era of cloud computing, data is constantly being duplicated on servers scattered across the Internet, and data-management systems need to ensure that the copies are kept up to date. One way to do that efficiently is by performing “checksums,” or adding up a bunch of bits at corresponding locations in the original and the copy and making sure the results match.
That method, however, works only if the servers know in advance which bits to add up — and if they store the files in such a way that data locations correspond perfectly. Sudan and his colleagues’ protocol could provide a way for servers using different file-management schemes to generate consistency checks on the fly.
“I shouldn’t tell you if the number of 1’s that I see in this subset is odd or even,” Sudan says. “I should send you some coarse information saying 90 percent of the bits in this set are 1’s. And you say, ‘Well, I see 89 percent,’ but that’s close to 90 percent — that’s actually a good protocol. We prove this.”
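Sudan's 90-percent-versus-89-percent example can be illustrated with a toy consistency check: instead of exchanging an exact checksum, the servers compare a coarse statistic and accept small disagreements. This is only a sketch of the idea, with an arbitrary 2 percent tolerance, not the protocol the researchers prove correct:

```python
def ones_fraction(bits):
    # Coarse summary of the data: the fraction of 1-bits.
    return sum(bits) / len(bits)

def roughly_consistent(local_bits, reported_fraction, tolerance=0.02):
    # Accept the local copy if its 1-bit density is within
    # tolerance of the density the other server reported,
    # instead of demanding bit-for-bit parity.
    return abs(ones_fraction(local_bits) - reported_fraction) <= tolerance
```

Unlike a parity checksum, this check survives servers that store the file with slightly different layouts, at the cost of only catching discrepancies larger than the tolerance.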
“This sequence of works puts forward a general theory of goal-oriented communication, where the focus is not on the raw data being communicated but rather on its meaning,” says Oded Goldreich, a professor of computer science at the Weizmann Institute of Science in Israel. “I consider this sequence a work of fundamental nature.”
“Following a dominant approach in 20th-century philosophy, the work associates the meaning of communication with the goal achieved by it and provides a mathematical framework for discussing all these natural notions,” he adds. “This framework is based on a general definition of the notion of a goal and leads to a problem that is complementary to the problem of reliable communication considered by Shannon, which established information theory.”
—–
Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Male Fertility Problems? Could Indicate More Serious Health Concerns

Provided by Bruce Goldman, Stanford University School of Medicine

A study of more than 9,000 men with fertility problems has revealed a correlation between the number of different defects in a man’s semen and the likelihood that the man has other health problems.

The study, conducted by investigators at the Stanford University School of Medicine, also links poor semen quality to a higher chance of having various specific health conditions, such as hypertension, and more generally to skin and endocrine disorders.

The findings, published online Dec. 10 in Fertility and Sterility, may spur more-comprehensive approaches to treating male infertility. They also point to the wisdom of performing complete physical examinations of men experiencing reproductive difficulties.

“About 15 percent of all couples have fertility issues, and in half of those cases the male partner has semen deficiencies,” said the study’s lead author, Michael Eisenberg, MD, assistant professor of urology and director of male reproductive medicine and surgery at Stanford. “We should be paying more attention to these millions of men. Infertility is a warning: Problems with reproduction may mean problems with overall health.”

A study Eisenberg co-authored a few years ago showed that infertile men had higher rates of overall mortality, as well as mortality linked to heart problems, in the years following an infertility evaluation. “But here, we’re already spotting signs of trouble in young men in their 30s,” he said.

Analyzing medical records

In the new study, Eisenberg and his colleagues analyzed the medical records of 9,387 men, mostly between 30 and 50 years old, who had been evaluated at Stanford Hospital & Clinics (now Stanford Health Care) between 1994 and 2011 to determine the cause of their infertility. The men had routinely provided semen samples, which the researchers assessed for characteristics including volume, concentration and motility. In about half of all the male infertility cases, the problem was abnormal semen; in the rest, the fault lay elsewhere. So, using the database, the investigators were able to compare the overall health status of men who had semen defects to that of the men who didn’t.

With a median age of 38, this was a fairly young group of men. However, 44 percent of all the men had some additional health problem besides the fertility problem that brought them to the clinic. In particular, the investigators found a substantial link between poor semen quality and specific diseases of the circulatory system, notably hypertension, vascular disease and heart disease. “To the best of my knowledge, there’s never been a study showing this association before,” said Eisenberg. “There are a lot of men who have hypertension, so understanding that correlation is of huge interest to us.”

In addition, as the number of different kinds of defects in a man’s semen rose, so did his likelihood of having a skin disease or endocrine disorder. When looking at the severity of all health problems, the scientists observed a statistically significant connection between the number of different ways in which a man’s semen was deficient and the likelihood of his having a substantial health problem.

Health, semen quality ‘strongly correlated’

The study wasn’t designed to determine precisely how connections between semen deficiencies and seemingly unrelated disorders, such as cardiovascular or endocrine disease, come about. But, Eisenberg noted, some 15 percent of all genes in the human genome are connected fairly directly to reproduction, and most of these genes also have diverse functions in other bodily systems. He also noted that it may not be a disease itself, but the treatment for the disease, that’s actually responsible for reproductive malfunction. He said he is exploring this possibility now.

“A man’s health is strongly correlated with his semen quality,” he said. “Given the high incidence of infertility, we need to take a broader view. As we treat men’s infertility, we should also assess their overall health. That visit to a fertility clinic represents a big opportunity to improve their treatment for other conditions, which we now suspect could actually help resolve the infertility they came in for in the first place.”


—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Greenpeace Environmental Protest Damages Peruvian World Heritage Site

Chuck Bednar for redOrbit.com – Your Universe Online
A publicity stunt organized by Greenpeace at the Nazca Lines has officials at the environmental advocacy group apologizing and potentially facing prosecution for possible damage caused at the ancient Peruvian heritage site.
According to BBC News environmental correspondent Matt McGrath, activists from the organization placed a banner next to a hummingbird figure that was carved into the plains of Southern Peru more than 1,500 years ago. Greenpeace said it was hoping to put pressure on UN officials currently meeting in Lima, but government officials said that those involved will be prosecuted for the stunt.
“The ancient depictions of animals, including a monkey and a hummingbird… are a vital part of the country’s heritage,” McGrath said. “Visits to the site are closely supervised – ministers and presidents have to seek special permission and special footwear to tread on the fragile ground where the 1,500 year old lines are cut.”
Earlier this week, however, 20 Greenpeace activists from seven different countries not only went to the site without permission, but unfurled a banner reading “Time For Change! The Future Is Renewable” at a location described by BBC News as being “very close” to some of the lines. The group is being accused of entering and damaging a prohibited area, and Peruvian officials are threatening up to six years in prison for those responsible.
On a video produced by Greenpeace, one of the activists, Mauro Fernandez, said that the group wanted politicians “to understand the legacy we need to leave for future generations. It is not a legacy of climate crisis,” McGrath said. However, Deputy Culture Minister Luis Jaime Castillo told local reporters that it was a “slap in the face at everything Peruvians consider sacred,” telling The Guardian that it was done “without any respect for our laws.”
“It was done in the middle of the night. They went ahead and stepped on our hummingbird, and looking at the pictures we can see there’s very severe damage. Nobody can go on these lines without permission – not even the president of Peru!” he added. “It was thoughtless, insensitive, illegal, irresponsible and absolutely pre-meditated.”
Top officials from throughout the world are currently participating at UN climate talks in Lima, which is what spurred the environmental group’s actions, Dan Collyns of The Guardian said. A Greenpeace spokeswoman at the summit said that it was cooperating with the investigation and that she was not aware of any legal proceedings being brought against the organization.
Nonetheless, Greenpeace issued an apology, stating that it was sorry if the protest caused any “moral offense” to the people of Peru, io9 and the Chicago Tribune said on Wednesday. The group also said that it would no longer use the images as part of its promotional campaign, and said that Executive Director Kumi Naidoo would travel to Lima to apologize to the Peruvian government in person.
Even so, Arielle Duhaime-Ross of The Verge said that the damage caused by footprints left at the site could be long-lasting, with Castillo claiming that they could remain there for “hundreds or thousands of years. And the line that they have destroyed is the most visible and most recognized of all.”
“Without reservation Greenpeace apologizes to the people of Peru for the [offense] caused by our recent activity laying a message of hope at the site of the historic Nazca lines. We are deeply sorry for this,” a spokesman said, according to Collyns. “Rather than relay an urgent message of hope and possibility to the leaders gathering at the Lima UN climate talks, we came across as careless and crass.”
—–
Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Shortened Radiation Treatment For Better Breast Cancer Results

Rayshell Clapper for redOrbit.com – Your Universe Online

Breast cancer is scary. Its prevalence should make all women aware and active in prevention, early detection, and treatment. Yet even though women should have a voice in their care, they often simply follow the guidelines set out by their doctors, partly because breast cancer is frightening enough on its own and partly because they trust their physicians. Both reactions are understandable, but if breast cancer patients do not speak up, they could end up with unnecessarily long treatments. As the University of Pennsylvania School of Medicine recently reported, about “two-thirds of the women treated for early-stage breast cancer in the US receive longer radiation therapy than necessary.”

The University of Pennsylvania research team was led by Ezekiel J. Emanuel, MD, PhD, and Justin E. Bekelman, MD. According to USA Today, their study reinforces four earlier studies showing that many women receive too much radiation. Emanuel and Bekelman found that after a lumpectomy (also known as breast-conserving surgery), women with early-stage cancer in the US currently receive six to seven weeks of radiation, but three weeks of radiation offers three major benefits: it is just as clinically effective, costs less, and is more convenient. A three-week course of radiation after breast-conserving surgery is called hypofractionated whole breast radiation.

As of 2013, only 34.5 percent of early-stage breast cancer patients over 50 received hypofractionated radiation, and only 21.1 percent of younger women received the shorter treatment. Other leading countries, including Canada and the United Kingdom, use hypofractionated radiation. In Canada, over 70 percent of women with breast cancer receive hypofractionated radiation therapy, and in the United Kingdom that percentage is even higher. In both countries, women with breast cancer achieve remission and recover, which supports the conclusion that hypofractionated therapy works.

Moreover, both of the above-mentioned countries have national healthcare systems whose focus is not on insurance companies or making money through treatments. If the US started using hypofractionated therapy, insurance companies would surely see the savings, as would the patients. Beyond the cost of the treatment itself, patients would be spared the costs of time and transportation. As USA Today explains, for women who work hourly wage jobs, have families, or live in rural parts of their states, undergoing radiation for seven weeks can seriously disrupt their lives. And if the shortened therapy works just as well, if not better, why wouldn’t they opt for it?

Moreover, anyone who has undergone radiation therapy knows just how hard it is on the body. If a course of treatment can be cut by four weeks while producing the same positive results, a patient will naturally want that option. Three weeks of radiation certainly sounds better than seven.

Part of the problem comes from the fact that women do not ask their doctors about different treatment options like this. The for-profit nature of the healthcare system in the US also contributes to the issue. Hospitals, medical companies, and pharmaceutical companies make more money on seven weeks of radiation. Fran Visco, the president of the National Breast Cancer Coalition, stated her frustrations: “How much evidence does the medical community need before it changes practice?” Visco also noted that doctors may make more money from longer treatment courses. “As patient advocates, we don’t want to believe this is financially motivated but find it difficult to understand what else could be the barrier,” she said. Not all doctors are motivated by profits, but if the research shows that hypofractionated radiation therapy is just as effective, more cost efficient, and less burdensome, then that is worth noting. And if practice in other countries further supports these, then women need to take action.

Yes, breast cancer is scary, but women must take an active role in their treatments. They must research, ask questions, and find the treatment that is best for them. Doctors need to provide more options and information instead of rushing to what has always been done or what makes the most money. People’s lives are more important than all that.

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Google Cardboard VR Device Passes 500K Units Shipped

Chuck Bednar for redOrbit.com – Your Universe Online
Tech giant announces new developer tools, opens Google Play store for smartphone-aided system
The cardboard virtual reality headset shown off by Google earlier this year is getting its own section on the Play store, as well as new tools for developers, as the tech giant looks to take on the new Samsung/Oculus Gear VR.
The announcements came via Google’s developers blog, which touted that over 500,000 Google Cardboard devices had shipped and that there were “dozens” of VR experiences compatible with the system now available in a new Google Play collection. Those offerings “range from test drives to live concerts to fully-immersive games,” the company added.
In addition, Gizmodo’s Eric Limer said that the Mountain View, California-based company also announced plans to build calibration settings into the VR device’s software development kit (SDK) to ensure that VR apps work smoothly with all different types of Cardboard viewers. They also plan to hire more experts to work on new projects for the unique device.
Google has previously touted Cardboard as an effort to make VR more accessible to the average person, requiring less expensive hardware by using Android smartphone technology to drive the experience. It is essentially a box-like piece of cardboard that transforms handsets into basic, no-frills VR headsets, and the accompanying SDK allows developers to create VR software as simply as building a regular web or mobile app, the company added.
“Released first back at Google I/O 2014 earlier this year, the fold-up piece of cardboard with two glass lenses and a metal washer for a button was released basically in secret,” said Chris Burns of SlashGear. “Now just a few months later, Google has released their homepage for the unit and a dedicated section on Google Play. Now, they mean business.”
Google also announced that it would be expanding its efforts in helping other people produce Cardboard viewers. Having already made the specifications open-source earlier this year, the company announced the release of new building specs created with specific cutting tools in mind, and said that they would also help makers tweak the viewing experiences to a device’s unique optical layout by allowing them to define the viewer’s base and focal length.
The announcement comes following the recent release of Gear VR, the phone-based virtual reality system co-created by Samsung and Oculus that, like Cardboard, is essentially “just a face-based holder for your phone,” Limer said. Unlike Google’s device, Gear VR is “heavy duty,” and while it is likely to compete with Sony’s Morpheus unit for higher-end VR systems, he noted that phone-based VR technology is “showing a lot of potential.”
“The growth of mobile, and the acceleration of open platforms like Android make it an especially exciting time for VR,” Google Cardboard Product Manager Andrew Nartker said in that Wednesday developer’s blog post. “Here’s to the cardboard box, and all the awesome it brings,” he added. “There are more devices, and more enthusiastic developers than ever before, and we can’t wait to see what’s next!”
—–
Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Ask the Expert: Energy Efficient Video Streaming

Catching up on your favorite television shows or checking out the latest movies over the Internet can be very entertaining. Let the experts at ENERGY STAR show you the most energy efficient ways to do it. EPA has tips to help you save energy and money, while protecting the environment from climate change.

For more information about energy efficient streaming, go to http://www.energystar.gov
For more about EPA: http://www.epa.gov/
We accept comments according to our comment policy: http://blog.epa.gov/blog/comment-policy/

Credit: U.S. Environmental Protection Agency

Additive Makes Food More Filling

Brett Smith for redOrbit.com – Your Universe Online

When eating a meal, we typically don’t feel full until we’ve eaten much more food than we need – resulting in those extra calories being added to our body in the form of stored fat. Researchers from the United Kingdom may have found a way to hack that process.

According to a new study in the journal Gut, a compound known as inulin-propionate ester (IPE) can be added to food to make it more filling.

IPE contains propionate, which prompts the gut to release hormones that tell the brain to reduce feelings of hunger. Propionate is generated naturally when dietary fiber is broken down by microbes in the gut, but IPE supplies greater amounts of propionate than people can get from a standard diet.

In the first part of the study, 20 participants received either IPE or inulin, a nutritional fiber, and were told they could eat what they wanted from a buffet. Volunteers given IPE consumed an average of 14 percent less than their counterparts and had greater levels of appetite-reducing hormones in their blood.

In the second part of the study, 60 overweight participants were tracked for 24 weeks, with half given IPE powder to supplement their food and the other half given inulin. Among those who finished the study, only one of the 25 participants given IPE gained more than 3 percent of their body weight, compared to six of the 24 participants in the inulin group. None of the volunteers in the IPE group gained more than 5 percent of their body weight, compared to four in the control group. The IPE group also had less fat in the abdomen and liver than the control group.

“We know that adults gain between 0.3 and 0.8 kilos (0.8 and 2.1 pounds) a year on average, and there’s a real need for new strategies that can prevent this,” said study author Gary Frost, a professor of medicine at Imperial College London.

“Molecules like propionate stimulate the release of gut hormones that control appetite, but you need to eat huge amounts of fiber to achieve a strong effect,” he added. “We wanted to find a more efficient way to deliver propionate to the gut.”

Frost said the new “proof-of-principle study” demonstrates that supplemental IPE could prevent weight gain in people who are already overweight.

“You need to eat it regularly to have an effect,” he said. “We’re exploring what kinds of foods it could be added to, but something like bread or fruit smoothies might work well.”

“Packaging propionate up to more efficiently deliver it to the large intestine has allowed us to make direct observations in humans that propionate may play an important role in weight management,” said study author Douglas Morrison, a biochemist at the University of Glasgow. “These exciting findings could at last open up new ways to manipulate gut microbes to improve health and prevent disease.”

Study researchers are currently working with technology company Imperial Innovations to take the novel ingredient to market.

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

—–

Recommended Reading – 100 Days of Real Food: How We Did It, What We Learned, and 100 Easy, Wholesome Recipes Your Family Will Love by Lisa Leake

Groundbreaking Genetic Study Reveals Secrets Of Bird Evolution

Chuck Bednar for redOrbit.com – Your Universe Online
The secrets of how modern birds evolved and emerged following the mass extinction of the dinosaurs some 66 million years ago have long been hidden in their genes, but now an international team of more than 100 scientists has completed an extensive analysis of their DNA that has produced the most reliable tree of avian life ever.
The landmark study required the researchers to sequence, assemble and compare the full genomes of 48 bird species representing all major branches of modern birds, including the ostrich, hummingbird, crow, duck, falcon, parrot, crane, ibis, woodpecker and eagle species. This ambitious phylogeny project, which took four years to complete, has resulted in what is being hailed as the largest-ever whole genome study of a single class of animals.
Guojie Zhang of the National Genebank at BGI in China and the University of Copenhagen, Erich D. Jarvis of Duke University and the Howard Hughes Medical Institute (HHMI), M. Thomas P. Gilbert of the Natural History Museum of Denmark and their colleagues are part of the Avian Phylogenomics Consortium, which has published its results in 29 papers this week – eight of them in a December 12 special edition of Science and 21 others in Genome Biology and other journals.
While scientists had already known that the birds that survived the mass extinction of the dinosaurs experienced a rapid evolutionary burst, the family tree of modern-day birds has long baffled biologists, as have the molecular details of how those birds managed to give rise to more than 10,000 known species. To solve some of those mysteries, the consortium members sequenced and analyzed the genomes of birds representing all major modern avian branches.
“Although an increasing number of vertebrate genomes are being released, to date no single study has deliberately targeted the full diversity of any major vertebrate group. This is precisely what our consortium set out to do,” said Gilbert. Jarvis called the findings “exciting,” adding “lots of fundamental questions now can be resolved with more genomic data from a broader sampling. I got into this project because of my interest in birds as a model for vocal learning and speech production in humans, and it has opened up some amazing new vistas on brain evolution.”
While this newly published research only represents the first form of genetic analysis, the study authors said that it presents some remarkable new information about avian evolution. One of the new flagship papers presents a well-resolved new family tree for birds that is based on whole-genome data, while a second details the overall genomic evolution of birds and others detail vocal learning, sex chromosomes, how birds lost their teeth and more.
“Previous attempts to reconstruct the avian family tree using partial DNA sequencing or anatomical and behavioral traits… met with contradiction and confusion,” Duke University’s Kelly Rae Chi said in the statement. “Because modern birds split into species early and in such quick succession, they did not evolve enough distinct genetic differences at the genomic level to clearly determine their early branching order, the researchers said. To resolve the timing and relationships of modern birds, the consortium authors used whole-genome DNA sequences to infer the bird species tree.”
“In the past, people have been using 10 to 20 genes to try to infer the species relationships,” Jarvis said. “What we’ve learned from doing this whole-genome approach is that we can infer a somewhat different phylogeny [family tree] than what has been proposed in the past. We’ve figured out that protein-coding genes tell the wrong story for inferring the species tree. You need non-coding sequences, including the intergenic regions. The protein coding sequences, however, tell an interesting story of proteome-wide convergence among species with similar life histories.”
One of the accomplishments of the new study is the resolution of the early branches of Neoaves (new birds), a group that represents approximately 95 percent of modern birds. These birds appeared somewhat suddenly, over the course of just a few million years, and the rapid appearance of so many species over such a short period of time made reconstructing their relationships far more difficult.
The whole-genome analysis, which involved computational work led by University of Illinois Founder Professor of Bioengineering and Computer Science Tandy Warnow and University of Texas at Austin graduate student Siavash Mirarab, dated the evolutionary expansion of Neoaves to the time of the mass extinction event 66 million years ago, contradicting recent research suggesting that these birds arose 10 to 80 million years earlier.
“Our results suggest that modern birds diversified in the wake of the mass extinction that marked the end of the age of dinosaurs, but we cannot exclude the possibility that birds began diversifying before the extinction,” co-author Edward Braun, a biology professor at the University of Florida, said in a statement. A total of 14,000 different regions within the genome of the 48 bird species were analyzed by the computational team, including both coding and non-coding sites in the genes.
Jane J. Lee of National Geographic called the research an “embarrassment of scientific riches for studying everything from how birds evolved so quickly after dinosaurs disappeared to the ways in which birds and people learn.” She added that “the biggest takeaway” from the consortium’s research “is the way genetic codes can be used to answer wide-ranging questions. Scientists are using birds’ DNA, for instance, both for research on the brain and learning and to reconstruct what an ancient ancestor of birds and dinosaurs might have looked like.”
She added that, according to the study authors, birds evolved at an accelerated rate compared to crocodilians and the common ancestor of both groups, changing at rates similar to mammals but with genomes only about one-third the size. Those genomes nonetheless support the same basic functions as mammalian ones, including the capacity for vocal learning, Texas Tech University biologist and study co-author David Ray told the National Geographic reporter.
Ray’s portion of the research started five years ago as an attempt to map one percent of crocodilian DNA, but it expanded once the price of sequencing a million bases dropped dramatically. Ultimately, his team was able to sequence an entire genome of 3 billion bases. They found that the DNA of alligators, crocodiles and gharials is approximately 93 percent identical across the genome – roughly the degree of similarity a human shares with a macaque.
The research also confirmed that birds and humans use essentially the same genes to speak. Jarvis and his Avian Phylogenomics Consortium colleagues found that vocal learning evolved at least twice and possibly three times in songbirds, parrots and hummingbirds, and that the set of genes involved in each of those adaptations is remarkably similar to the genes that give humans the ability to speak. Eight of the papers are devoted to bird songs, including one identifying 50 genes that show higher or lower activity in the brains of vocal learning birds and humans.
Despite the massive amount of research that has been published thus far, Professor David Burt, Acting Director of the National Avian Research Facility at the University of Edinburgh’s Roslin Institute, said that this is “just the beginning. We hope that giving people the tools to explore this wealth of bird gene information in one place will stimulate further research. Ultimately, we hope the research will bring important insights to help improve the health and welfare of wild and farmed birds.”

Nearly 269,000 Tons Of Plastic Currently Pollute The Earth’s Oceans

Chuck Bednar for redOrbit.com – Your Universe Online
More than five trillion pieces of plastic garbage weighing a combined 269,000 tons are currently polluting the world’s oceans, according to the authors of a new paper published Wednesday in the open-access journal PLOS ONE.
In the study, Dr. Marcus Eriksen of the nonprofit 5 Gyres Institute in Los Angeles and an international team of colleagues set out to obtain a more precise estimate of the global abundance and weight of floating plastics in the world’s oceans – both large and small. They conducted 24 expeditions from 2007 through 2013 using nets to collect microplastics and data from visual surveys of larger debris to develop a new model of oceanic plastic distribution.
Based on their findings, the authors estimate that there are at least 5.25 trillion plastic particles weighing nearly 270,000 tons currently in the world’s waters. Larger plastics were more abundant near coastal areas, degrading into smaller pieces in the five subtropical gyres. The smallest fragments were found in more remote areas, suggesting that gyres act like paper shredders by turning larger items into microplastics and ejecting them across the ocean.
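Taken together, the two totals above imply that the average floating piece is tiny, consistent with a debris field dominated by microplastics. A rough back-of-the-envelope calculation, using only the figures quoted in this article:

```python
# Average mass per floating plastic piece, from the study's totals:
# ~269,000 metric tons of plastic spread across ~5.25 trillion pieces.
total_mass_kg = 269_000 * 1_000   # metric tons -> kilograms
piece_count = 5.25e12             # 5.25 trillion pieces

avg_mass_mg = total_mass_kg / piece_count * 1e6   # kilograms -> milligrams
print(f"average piece mass: ~{avg_mass_mg:.0f} mg")
```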
According to John Schwartz of the New York Times, the researchers reported that the largest source of plastic by weight originated from discarded fishing nets and buoys. While this problem could potentially be solved by creating an international program to pay fishing vessels to reclaim these items, Dr. Eriksen said that this would not address the bottles, bags and other debris that float across the waters and gather where currents converge.
“When the survey teams looked for plastics floating in the water that were the size of grains of sand, however, they were surprised to find far fewer samples than expected – one-hundredth as many particles as their models predicted,” Schwartz said. Dr. Eriksen noted that this could indicate that smaller bits of plastic are either being carried deeper into the sea, or that these particular fragments are being consumed by marine organisms.
He told Schwartz that the scope of the issue makes it impractical to try and collect the floating trash, but said that his non-profit research and advocacy group had been somewhat successful in convincing health and beauty product manufacturers to stop using small scrubbing beads of plastic. He added that other industries must also be challenged to produce their goods so that the ocean “can deal with [them] in an environmentally harmless way.”
Kara Lavender Law of the Sea Education Association in Woods Hole, Massachusetts, who was not involved in the project, told the Associated Press (AP) that Dr. Eriksen’s team gathered data in part of the world where scientists did not previously have measurements pertaining to floating plastic debris, including the South Atlantic, the Indian Ocean and the Southern Ocean near Antarctica.
Law added that the new paper’s estimate for microplastics (approximately 35,540 tons) was comparable to a previous study conducted using a different methodology by researchers in Spain. She called it encouraging that the two different methods came up with such similar results, given the difficulty of studying plastic in the ocean, and noted that the knowledge will help scientists better understand how this debris is impacting the environment and possibly even the food chain.

Bitcoin Payments Now Usable At Windows, Xbox Live Online Shops

Chuck Bednar for redOrbit.com – Your Universe Online
One of the largest tech companies in the world has jumped on the bitcoin bandwagon, as Microsoft is now indirectly accepting the virtual currency as a payment method at its Windows and Xbox Live digital marketplaces.
According to Stan Higgins of Coindesk, Microsoft is now allowing US customers to use bitcoin to add money to their accounts as part of a new partnership with a Georgia company called BitPay. Those funds can then be used to purchase content such as apps, games and other media through the Redmond, Washington firm’s PC, mobile and game console stores.
The announcement “adds yet another major tech player to the bitcoin ecosystem,” as Microsoft currently boasts a market cap of over $380 billion and reported more than $86 billion in revenue this year, Higgins said. However, it appears that the computing giant is not directly accepting bitcoin payments at this time, he noted.
In an online help page, Microsoft explains how to use bitcoin to add money to your Microsoft account. Once signed in, select Payment options, then Microsoft account, and finally the “redeem bitcoin” option. Next, select the amount you want to add, click Next, and review and confirm the transaction within 15 minutes.
PC users can also select “pay with bitcoin” and use their bitcoin wallet to complete the transaction on the same device, and smartphone users can scan a QR code displayed on a page to pay from their mobile wallet app. Most transactions should process instantly, the company said, but customers should wait up to two hours before contacting support. Bitcoin funds cannot be refunded, Microsoft added.
“Microsoft’s integration follows a slow but notable progression of events that suggested it would potentially seek to embrace bitcoin,” said Higgins. A February update to Microsoft’s Bing search engine “allowed users to generate bitcoin price conversions,” while around the same time, co-founder Bill Gates said that his nonprofit Bill & Melinda Gates Foundation “was interested in digital payments technology in general,” he added.
Neither Microsoft nor BitPay have formally announced their partnership, according to TechCrunch reporter Jon Russell, and neither company responded to the website’s request for information. Matthew Sparkes, Deputy Head of Technology at The Telegraph, said that the news caused the value of a single bitcoin to increase by approximately $20, to $360 per unit, overnight.
The announcement “doesn’t directly correlate to its recent (and somewhat surprising) moves to make its core services like Office available on rival platforms,” Russell noted, “but the fact that it is ready to embrace an upcoming technology before many of its rivals is a sign that Microsoft’s ethos may have modernized.”
“Microsoft now joins other tech players like Dell, which has gone all-in on Bitcoin and accepts it for most products and services,” added Steve Dent of Engadget. “PayPal has a deal with Coinbase to help online stores accept Bitcoins, though you still can’t add them to your digital wallet. So far Amazon, the 800-pound gorilla of the e-commerce world, has strongly resisted the digital currency, however.”

Crowdsourcing For Autism Answers With Google And Autism Speaks MSSNG Project

Chuck Bednar for redOrbit.com – Your Universe Online
Google and Autism Speaks are joining forces to sequence the genomes of 10,000 people with autism spectrum disorder (ASD) and store the information in the cloud so that it could be easily accessed by researchers.
The groundbreaking initiative known as MSSNG was launched with the intention of creating “the world’s largest database of sequenced genomic information on people with autism spectrum disorder (ASD) and their family members,” the advocacy group explained in a statement Tuesday.
MSSNG (pronounced “missing”) deliberately omits vowels in order to “represent the missing pieces of the autism puzzle. It is symbolic of the missing information about autism that the project is designed to find,” Autism Speaks said. The organization hailed the program as “a significant milestone in advancing genomic research of autism” that “could lead to breakthroughs into the causes, subtypes and better diagnosis and treatment for the disorder.”
According to Washington Post reporter Jim Tankersley, an estimated one of every 68 children has ASD, and scientists have only recently started figuring out the genetic and environmental factors responsible for the disorder. Much of that progress is the result of analysis of DNA of people with the condition, and Autism Speaks and Google believe that digitizing genetic information could lead to the next great breakthrough.
Rob Ring, the chief science officer at Autism Speaks, told the Washington Post that he and his colleagues were hopeful that the cloud-based initiative “will provide the data on which discoveries around autism are made for years to come.” The group has been collecting volunteered sequences for years, Tankersley said.
MSSNG “will open its cloud portal to scientists early next year with 1,000 of them already in the bank. Researchers anywhere in the world will have the opportunity to tap in and look for new clues in that data,” he added. “The goal is for that access to produce a sort of crowdsourcing for autism answers.”
Fast Company’s David Matthews said that Google’s advanced search capabilities are at the core of this new initiative. MSSNG will use Google Genomics, a tool launched earlier this year by the Mountain View, California-based tech giant that applies its search-engine technology to genetic data in order to find clues that will help experts better understand autism.
“The relative cheapness and ease of genetic sequencing today versus the early days of genetic research have certainly been a boon to researchers, but it has also created an increasingly unmanageable volume of data,” Matthews said. “A single human genetic sequence typically runs to around 100 gigabytes, and when you’re trying to tackle numbers of sequences in the tens of thousands, server space becomes an issue.”
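Matthews’ point about server space is easy to quantify. A rough sketch, assuming the ~100 GB-per-genome figure quoted above and the project’s stated target of 10,000 genomes:

```python
# Rough storage estimate for the MSSNG data volumes described above:
# ~100 GB per sequenced genome, scaled to the project's 10,000-genome target.
gb_per_genome = 100
genomes = 10_000

total_gb = gb_per_genome * genomes
total_pb = total_gb / 1_000_000   # 1 PB = 1,000,000 GB (decimal units)
print(f"{total_gb:,} GB = {total_pb:.0f} PB")
```

In other words, the full database would approach a petabyte, which is why a shared cloud platform beats shipping hard drives.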
Google’s involvement in the project will give researchers a virtual place to organize data collectively instead of having to ship hard drives back and forth to one another, he added. In addition to letting Google take part in a potentially important service to the healthcare community, the project will enable the company to showcase and enhance the capabilities of its cloud-based service, added Wired.com’s Marcus Wohlsen.
“Millions of people living with autism today need answers. The MSSNG project is the search for those answers, and we’re going to find them,” said Autism Speaks President Liz Feld. “The best research minds in the world are going to mine this database of DNA so we can uncover and understand the various subtypes of autism. Then we can get to work developing customized treatments and therapies so we can improve the quality of life for so many people who need help.”


Instagram Outpacing Twitter, Hits 300 Million Monthly User Milestone

Chuck Bednar for redOrbit.com – Your Universe Online
Instagram has reportedly added 100 million members since March, a 50 percent growth over the last nine months, and the photo-sharing service now has a base of more than 300 million active users each month.
That means that the service, which was acquired by Facebook in 2012, now averages more active users each month than Twitter, according to Bloomberg Businessweek reporter Sarah Frier. As of the quarter ending September 30, Twitter said that an average of 284 million members were currently using the microblogging website each month.
In addition, Frier noted that Robert W Baird & Co. analyst Colin Sebastian has reported that Instagram is adding an average of 360,000 new active users each day compared to just 160,000 for Twitter, and re/code’s Kurt Wagner added that the company said those users of the service are posting an average of 70 million new images every day.
Twitter declined Bloomberg’s request for comment on the report, but Frier said that the San Francisco-based social media website “has recently been on a campaign to promote its prospects after several quarters of slowing user growth and questions about whether it can ever reach the scale of Facebook, which has about 1.3 billion members.”
“Monthly active user count at Twitter rose 23 percent in the third quarter, down from 24 percent growth the prior quarter,” she added. As for Instagram, the company said that approximately 70 percent of its current users now reside outside of the US (an increase from 65 percent in March) and a total of over 30 billion photos have now been shared using the app.
The company is also looking to launch a verification service for people as well as brands, and as on Twitter and Facebook, those accounts that have been authenticated will receive a blue badge to prove they are the real deal. The company also said it would look to improve its service by deleting fake accounts.
“As more people join, keeping Instagram authentic is critical,” CEO Kevin Systrom said in a statement. “We’re committed to doing everything possible to keep Instagram free from the fake and spammy accounts that plague much of the web, and that’s why… we’ve been deactivating spammy accounts from Instagram on an ongoing basis to improve your experience.”
“Instagram is home to creativity in all of its forms, a place where you can find everything from images of the Nile River to the newest look from Herschel Supply or a peek inside the mind of Taylor Swift,” he added. “We’re thrilled to watch this community thrive and witness the amazing connections people make over shared passions and journeys.”
Facebook acquired Instagram for $1 billion two years ago, according to Reed Albergotti of The Wall Street Journal, and considering that venture capitalists have valued other social networks at approximately $40 per user, the photo-sharing service would now be worth roughly $12 billion.
“Perhaps more important, it has kept teen users within Facebook’s ecosystem. As Facebook has matured, its youngest users are spending more time on other, newer mobile apps; Instagram has been a popular option,” Albergotti added, with Systrom noting he believed the service could “continue to engage generations of people that may not be on Facebook yet.”

Rosetta Data Suggests Comets Were Not The Source Of Earth’s Water

Chuck Bednar for redOrbit.com – Your Universe Online
Terrestrial water most likely did not come from comets like 67P/Churyumov-Gerasimenko, meaning that the H2O found on Earth was most likely brought here by asteroids, scientists involved with the ESA’s Rosetta mission reported Wednesday in the journal Science.
The study, which was led by Kathrin Altwegg of the University of Bern in Switzerland using information provided by the Rosetta Orbiter Spectrometer for Ion and Neutral Analysis (ROSINA) instrument, measured the amount of deuterium (a heavier isotope of hydrogen) found in water vapor on the comet’s surface, said Dan Vergano of National Geographic.
While normal water contains regular hydrogen atoms, water that contains deuterium is known as heavy water, and Altwegg found that ice on the surface of 67P/C-G had a ratio of heavy water to normal water that is roughly three times that of the planet’s oceans. As a result, the researchers said that it is unlikely that terrestrial water came from Kuiper belt comets, as there would have been more deuterium-rich heavy water here on Earth.
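The “roughly three times” figure can be reproduced from published measurements. A quick illustrative calculation; note that these specific constants are approximate values drawn from the literature, not from this article:

```python
# Illustrative deuterium-to-hydrogen (D/H) ratios; approximate values.
EARTH_OCEAN_D_H = 1.56e-4   # Vienna Standard Mean Ocean Water (VSMOW)
COMET_67P_D_H = 5.3e-4      # value reported from Rosetta's ROSINA instrument

# Enrichment of the comet's "heavy water" relative to Earth's oceans
enrichment = COMET_67P_D_H / EARTH_OCEAN_D_H
print(f"67P is enriched ~{enrichment:.1f}x relative to Earth's oceans")
```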
Kuiper belt comets, which are formed outside of Neptune’s orbit, have long been one of three entities believed to have been responsible for bringing water to the Earth during the later stages of its evolution, according to NASA. The other two likely sources are asteroid-like objects from the region of Jupiter or Oort cloud comets formed inside of Neptune’s orbit, the US space agency added. The ROSINA data effectively eliminates Kuiper belt comets as a possible source.

This composite is a mosaic comprising four individual NAVCAM images taken from 19 miles (31 kilometers) from the center of comet 67P/Churyumov-Gerasimenko on Nov. 20, 2014. The image resolution is 10 feet (3 meters) per pixel. Credit: ESA/Rosetta/NAVCAM


“We knew that Rosetta’s in situ analysis of this comet was always going to throw us surprises,” said Matt Taylor, Rosetta’s project scientist from the European Space Research and Technology Center in the Netherlands. “The bigger picture of solar-system science, and this outstanding observation, certainly fuel the debate as to where Earth got its water.”
Nearly three decades ago, mass spectrometers on board the European Giotto mission to comet Halley were able to measure the ratio of deuterium to hydrogen (D/H ratio) in a comet. Those readings revealed a deuterium level twice that of Earth’s, leading scientists to conclude that Oort cloud comets such as Halley could not have been the original source of the planet’s water. Several other Oort cloud comets have since produced similar D/H ratio readings.
However, the European Space Agency’s Herschel spacecraft later discovered that the D/H ratio of comet Hartley 2 (believed to be a Kuiper Belt comet) was similar to terrestrial values, according to NASA. The results were unexpected, as most models of the early solar system suggested that Kuiper Belt comets should have an even higher D/H ratio than Oort cloud comets since they formed in a region that was colder.
“The new findings of the Rosetta mission make it more likely that Earth got its water from asteroid-like bodies closer to our orbit and/or that Earth could actually preserve at least some of its original water in minerals and at the poles,” the agency said. Altwegg added that the study “disqualifies the idea that Jupiter family comets contain solely Earth ocean-like water” and “supports models that include asteroids as the main delivery mechanism” of terrestrial water.
However, Open University Professor of Planetary and Space Science Monica Grady said that the conclusions of the paper could be “jumping the gun a bit,” telling BBC News, “The measurements that have been made by Rosina are of the gas that has come from the surface of the comet.”
“The amount of hydrogen relative to deuterium changes as the gas escapes from the surface,” she added. “This is why other instruments on the lander were going to make complementary measurements of the ice on the surface. We are going to have to wait to see what comes from the [Philae lander instruments] COSAC and Ptolemy before we can say any more.”

Fibromyalgia and Breastfeeding

Fibromyalgia is characterized by chronic widespread pain and fatigue, symptoms that are sometimes exacerbated by pregnancy and giving birth. Having a new baby is enough to cause exhaustion even in otherwise healthy women; for those with fibromyalgia, it can be even worse.

When it comes to research on pregnancy and fibromyalgia, the results are inconclusive. Some women say their symptoms eased when they became pregnant; others say their flare-ups got worse. Everyone is different, so it's hard to predict whether your symptoms will flare or subside during your pregnancy. Keep in mind, though, that fibromyalgia is not hereditary: your baby will not be born with it.

On the other hand, when it comes to having fibromyalgia and breastfeeding your baby, the picture is much clearer. A number of studies have examined how fibromyalgia affects breastfeeding, and they indicate that breastfeeding can be very difficult if you have been diagnosed with the condition.

However, that’s not to say it’s impossible. It is still very possible for you to breastfeed with fibromyalgia syndrome, but you should understand that it will be challenging, and why, and equip yourself with the tools needed to get through those difficulties. The following are some of the tools and techniques you can use to overcome the challenges of breastfeeding:

Breastfeeding with Chronic Pain

Many times, breastfeeding is difficult for individuals with fibromyalgia because of the joint problems, chronic pain, and other symptoms they experience. Additionally, if pregnancy alleviated some of an individual's symptoms, those symptoms may come back with a vengeance soon after birth. You may notice that your symptoms seem much worse than they were before you got pregnant, which can make trying to breastfeed that much harder.


Tools and Techniques to Help You Breastfeed with Fibromyalgia

Breastfeeding can be quite difficult, even if you don’t have a chronic pain condition to deal with. When you are dealing with a condition such as fibromyalgia, you may think that it will be impossible for you to breastfeed. However, as long as you are patient with yourself, it is very possible for you to breastfeed your new baby.

There are lots of suggestions to help you succeed at breastfeeding your baby. Know that stress can trigger a fibromyalgia flare-up and can keep your milk from coming in, so it is vital to keep your surroundings as stress-free as possible when you're trying to nurse.

There are many stressful things that come up when it comes to having a new baby. It’s necessary that you approach the idea of breastfeeding with an open mind and a positive attitude and do whatever you can to create a “no-stress” zone for yourself to nurse in.

Consider using pillows to support your body and your head while feeding the baby. Get a sling or a pillow that can help you to comfortably prop the baby up so that you’re not having to support his or her full weight. You may find that you’re better able to nurse lying in the bed with your baby facing you.

Get into a Quiet Area

Find an area that is quiet and peaceful for breastfeeding your baby. You may also want to consider playing soothing music as you're about to nurse. Go into a room, away from everyone and everything else, and close the door to block out the stress and noise of the rest of the house.

You should always breastfeed your baby in the same spot every time- as long as it is comfortable and conducive to relaxation. If you feel that your baby is having difficulty latching on, you may want to consult a lactation specialist.

Finally, you may want to consider trying aromatherapy, massage, and other alternative treatments to help you relax after giving birth. Consider asking for help; it can only be an asset to you at this time.

Know that You Can’t Fail

You should always keep an open mind, and understand and accept that you can't fail at this. If you breastfeed your baby for a few days and then decide that it's just too painful, you have still had those days to bond with and nourish your baby. If you are able to breastfeed for two weeks, or two months, or five months, that's wonderful! Anything you can do is better than doing nothing.

If you do decide that breastfeeding is too painful and frustrating and you need to stop, you should never feel guilty about it. Baby formula offers many wonderful nutrients, and you will still have the chance to shower your baby with love at feeding time.

Just do what you can within your own personal physical and even emotional limitations. Keep in mind that anything you can do is absolutely wonderful and is always enough.

Whatever you do, just remember to make your environment as quiet, relaxing, and stress-free as possible. It's stressful enough being a new mom; when you add outside stress on top of fibromyalgia, it can be even more difficult. Take a deep breath, relax, and try your best.

Further reading

Breastfeeding with Fibromyalgia – Yes, It’s Possible: http://www.fibromyalgia-symptoms.org/nursing-and-fibromyalgia.html

Study Finds Fibromyalgia Prohibits Sufferers From Breast-feeding: http://www.sciencedaily.com/releases/2004/09/040921074750.htm

Zinc Test Could Lead To Earlier Breast Cancer Diagnosis

Chuck Bednar for redOrbit.com – Your Universe Online
A team of UK researchers led by Oxford University scientists has devised a new test that could help doctors diagnose breast cancer earlier by detecting changes in a person’s zinc levels, according to research currently appearing in the journal Metallomics.
The study authors, who also included experts from Imperial College London and the Natural History Museum in London, explained they were able to demonstrate that changes in the isotopic composition of zinc (which can be detected in a person’s breast tissue) could serve as a biomarker of early breast cancer.
According to Daily Mail science correspondent Fiona Macrae, the researchers hope that the metal-detecting blood test could detect cancer before a woman develops a lump, thus allowing doctors to treat it during its earliest stages and potentially save thousands of lives.
“There is a survival rate of about 80 percent for breast cancer but the earlier you can detect it, the more chance you have of treating it,” lead author Dr. Fiona Larner from the Oxford University Department of Earth Sciences told the Daily Mail on Tuesday. “If you can detect it earlier, you can give more women a better chance of survival.”
The test focuses on the so-called heavy and light forms of zinc that exist in a person’s body. Breast tissue is known to absorb zinc and release it back into the bloodstream, Macrae said. Dr. Larner’s team found that cancerous cells tend to absorb more zinc, and also hold onto more of the light form of the metal.
If breast tumors have more of the light version in their tissue, the unwanted heavy version of zinc should be floating around in the bloodstream. Thus, if a woman has higher-than-average levels of the heavy form of the metal in her blood, she could have breast cancer, Dr. Larner said. She and her colleagues are developing a blood test that uses this diagnostic, and they hope that it will be available within the next five years.
As part of the pilot study, the researchers analyzed zinc in the blood and blood serum of 10 subjects (five breast cancer patients and five healthy control subjects) along with a range of breast tissue samples from cancer patients. By using highly-sensitive techniques, they were able to demonstrate that they could detect key differences in zinc caused when cancer makes subtle changes to the way that the body’s cells process the metal.
Similar changes in copper in one of the breast cancer patients provided additional evidence supporting the notion that it could be possible to identify a biomarker for early breast cancer. This discovery, the researchers said, could be used to develop a simple, non-invasive diagnostic blood test that can detect the disease when it is most treatable.
“It has been known for over a decade that breast cancer tissues contain high concentrations of zinc but the exact molecular mechanisms that might cause this have remained a mystery,” Dr. Larner said in a statement.
“Our work shows that techniques commonly used in earth sciences can help us to understand not only how zinc is used by tumor cells but also how breast cancer can lead to changes in zinc in an individual’s blood – holding out the promise of an easily-detectable biomarker of early breast cancer,” she added.
Dr. Larner told the UK Press Association that she is hopeful the test will be used to help screen all women in the UK for early signs of breast cancer within the next 10 years. In the meantime, however, the test is most likely to be reserved for higher-risk women with inherited breast cancer genes, such as BRCA1 or 2, the news organization added.
“The hope is that this research is the beginning of a whole new approach,” said Dr. Larner. “Understanding how different cancers alter different trace metals within the body could enable us to develop both new diagnostic tools and new treatments that could lead to a ‘two-pronged’ attack on many cancers. Further research is already underway to see what changes in other metals may be caused by other cancers.”
—–
Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Cigarettes Are Still Killing Many Americans

Brett Smith for redOrbit.com – Your Universe Online
Over 50 years ago, the US Surgeon General released the first-ever report on smoking, which officially declared that it caused both chronic bronchitis and lung cancer.
Despite the ramifications of that report and the significant drops in smoking rates, cigarettes still cause about 30 percent of all cancer-related deaths in the US, according to a new report from the American Cancer Society.
A similar report from 30 years ago found that 30 percent of all American cancer deaths were linked to smoking, and the new American Cancer Society study is the first attempt to update that statistic. While the new report, published in the Annals of Epidemiology, might suggest that no progress has been made, new smoking-related cancers have since been added to the conversation.
In the new study, researchers examined the most current information on smoking rates from the National Health Interview Survey (NHIS) and information on the hazards of smoking taken from epidemiologic research studies, to figure out what is called the population attributable fraction (PAF), or the percentage of cancer deaths in the population brought on by smoking.
Based on their data, the study team conservatively estimated that the PAF for active cigarette smokers was 29 percent, including only deaths from the 12 cancers formally established as smoking-related. When estimated more liberally, with all cancers included, the PAF was 32 percent. The researchers pointed out that their estimates do not include cancer deaths due to second-hand smoke or other kinds of tobacco use, such as cigars or smokeless tobacco.
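The arithmetic behind attributable-fraction estimates like these is straightforward. As a minimal sketch (using Levin's classic formula and purely illustrative numbers, not the study's actual inputs), the PAF combines how common the exposure is with how much it raises the risk of death:

```python
def smoking_paf(prevalence, relative_risk):
    """Levin's formula: the fraction of deaths attributable to an exposure,
    given its prevalence and the relative risk among the exposed."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: ~18% smoking prevalence and a relative risk of 10
# would attribute roughly 62% of deaths from that cancer to smoking.
print(round(smoking_paf(0.18, 10.0), 2))
```

The study's overall 29-32 percent figures come from aggregating calculations like this across many cancers, each with its own prevalence and risk data from the NHIS and epidemiologic studies.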
“Our results indicate that cigarette smoking causes about three in 10 cancer deaths in the contemporary United States,” the study team wrote. “Reducing smoking prevalence as rapidly as possible should be a top priority for US public health efforts to prevent future cancer deaths.”
Some people have claimed that electronic cigarettes are a less hazardous option compared to conventional cigarettes, but a new study from a team of Japanese researchers has revealed high amounts of carcinogens in at least one brand of e-cigarettes.
“In one brand of e-cigarette the team found more than 10 times the level of carcinogens contained in one regular cigarette,” study researcher Naoki Kunugita, who led a team from the National Institute of Public Health, told the AFP news agency at the time.
Additionally, the Japanese team found both formaldehyde and acetaldehyde, two major carcinogens, in the vapor of several e-cigarette liquids.
While he wouldn’t reveal the name of the highly-toxic brand identified in the study, Kunugita said the study’s takeaway is that e-cigarettes aren’t the harmless products their manufacturers would have us believe.
“We need to be aware that some makers are selling such products for dual use (with tobacco) or as a gateway for young people” to start smoking, Kunugita said.

Artificial Intelligence Could Help Keep You From Embarrassing Yourself On Facebook

Chuck Bednar for redOrbit.com – Your Universe Online
Good news for those who tend to post drunken selfies: Facebook is reportedly developing an advanced AI program that can detect when people attempt to post pictures in which they appear inebriated and stop the upload.
According to Herald Sun technology correspondent Harry Tucker, the social media company is currently developing software that, instead of just being able to identify people’s faces, will actually be capable of determining what is taking place in a picture.
Yann LeCun of Facebook’s Artificial Intelligence research lab told Wired the new AI program would act like “an intelligent digital assistant” that would “mediate your interaction with your friends, and also with content on Facebook.” If you went to post a photo that it determines could show you in a somewhat embarrassing state, it would advise you against uploading it.
The program could also help protect users from their friends as well. LeCun, a researcher and machine learning expert who is now in charge of Facebook’s AI lab, said that this type of virtual assistant could alert users if someone else was attempting to post an embarrassing photo of them without their permission, Tucker said.
“In a virtual way, he explains, this assistant would tap you on the shoulder and say: ‘Uh, this is being posted publicly. Are you sure you want your boss and your mother to see this?’” said Wired’s Cade Metz. He added that the concept “is more than just an idle suggestion” and that LeCun’s team is “laying the basic groundwork” for the AI assistant.
Creating this type of software primarily involves developing image recognition technology capable of distinguishing between a person’s sober appearance and what they look like when inebriated, Metz said. This requires a form of AI known as deep learning, which the social network already uses to identify faces that can be tagged in photos.
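Facebook has not published how such a classifier would work, but a deep-learning model of this kind reduces, at its core, to stacked layers mapping image features to a probability. A purely illustrative toy sketch (the weights here are random, not trained, and the feature vector is a stand-in for real image features):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: 8 input features -> 4 hidden units -> one probability.
W1, b1 = rng.standard_normal((8, 4)), np.zeros(4)
W2, b2 = rng.standard_normal(4), 0.0

def p_embarrassing(features):
    """Map a feature vector to P(photo is embarrassing) in [0, 1]."""
    hidden = np.maximum(features @ W1 + b1, 0.0)   # ReLU layer
    logit = hidden @ W2 + b2
    return 1.0 / (1.0 + np.exp(-logit))            # sigmoid output

score = p_embarrassing(rng.standard_normal(8))
if score > 0.5:
    print("Are you sure you want your boss and your mother to see this?")
```

A production system would train the weights on millions of labeled photos and use convolutional layers rather than a flat feature vector, but the flow from pixels to a warn-or-not probability is the same shape.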
At the same time, BGR writer Chris Smith said that LeCun “wants to protect the online identity of a person, even though having intelligent machines analyzing personal data might not sound too thrilling to some Facebook users.” The researcher also believes that this type of digital assistant “would initially be able to answer simple questions, but in time, it’ll be able to analyze a lot more data than just photos posted on Facebook,” Smith added.
Tuesday marked the one-year anniversary of LeCun’s Facebook lab, which is known within company circles as FAIR, according to Metz. Among the work it has completed thus far are the deep learning algorithms currently used to examine a person’s Facebook activity to determine what types of links, posts and photos he or she is most likely to click on, so that such content is more likely to appear in his or her news feed.
Those algorithms will “soon analyze the text you type into status posts, automatically suggesting relevant hashtags,” the Wired reporter added. The ultimate goal, however, is to develop AI systems capable of understanding Facebook data “in more complex ways,” thus enabling the social media site to provide a full-on digital assistant to its members.
“For some, this is a harrowing proposition. They don’t want machines telling them what to do, and they don’t want machines identifying their faces and storing them in some distant data center, where they can help Facebook, say, target ads,” Metz said, adding that LeCun and his colleagues insist that their research “is about giving you more control over your online identity, not less.”

X-Ray Laser Acts As Tool To Track Life’s Chemistry

Provided by Anne M Stark, Lawrence Livermore National Laboratory

An international research team that includes researchers from Lawrence Livermore National Laboratory has captured the highest-resolution protein snapshots ever taken with an X-ray laser, revealing how a key protein in a photosynthetic bacterium changes shape when hit by light.

Human biology is a massive collection of chemical reactions and all involve proteins, known as the molecules of life. Scientists have been moving steadily toward their ultimate goal of following these life-essential reactions step by step in real time, at the scale of atoms and electrons.

“These results establish that we can use this same method with all kinds of biological molecules, including medically and pharmaceutically important proteins,” said Marius Schmidt, a biophysicist at the University of Wisconsin-Milwaukee who led the experiment at the Department of Energy’s SLAC National Accelerator Laboratory. “We are on the verge of opening up a whole new unexplored territory in biology, where we can study small but important reactions at ultrafast timescales.”

The results, detailed in the Dec. 5 issue of Science, have implications for research on some of the most pressing challenges in life sciences, which include understanding biology at its smallest scale and discovering molecular targets for drug design.

The experiment took place at SLAC’s Linac Coherent Light Source (LCLS). LCLS’s X-ray laser pulses, which are about a billion times brighter than X-rays from synchrotrons, allowed researchers to see new atomic-scale details of how the bacterial protein changes within millionths of a second after it is exposed to light.

Lawrence Livermore researchers Mark Hunter, Brent Segelke and Matthias Frank contributed to the work in sample preparation, laser setup and conducting the experiment.

The LCLS pulses, measured in quadrillionths of a second, work like a super-speed camera to record ultrafast changes, and snapshots taken at different points in time can be compiled into detailed movies.

“This experiment marks the first time that LCLS has been used to directly observe a protein’s structural changes at such a high resolution as it happens,” Frank said.

The protein the researchers studied, found in purple bacteria and known as PYP for “photoactive yellow protein,” functions much like a bacterial eye in sensing certain wavelengths of light. The mechanism is very similar to that of other receptors in biology, including receptors in the human eye.

“Though the chemicals are different, it’s the same kind of reaction,” said Schmidt, who has studied PYP since 2001. Proving the technique works with a model protein like PYP sets the stage to study more complex and biologically important molecules at LCLS, he said.

In the LCLS experiment, researchers prepared crystallized samples of the protein, and exposed the needle-shaped crystals, each about 2 millionths of a meter long, to optical laser light before jetting them into the LCLS X-ray beam.

The incident X-rays produced diffraction patterns as they struck the crystals; these patterns are used to reconstruct the 3-D structures of the proteins. Researchers compared the structures of the light-exposed proteins to structures of proteins that had been held in the dark to identify light-induced structural changes.
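The geometry that makes those diffraction patterns interpretable is Bragg's law, which relates the X-ray wavelength and diffraction angle to the spacing of atomic planes in the crystal. A minimal sketch with illustrative values (not figures from this experiment):

```python
import math

def bragg_spacing(wavelength_m, theta_rad, order=1):
    """Bragg's law, n*lambda = 2*d*sin(theta), solved for the spacing d
    between atomic planes that produces a diffraction peak at angle theta."""
    return order * wavelength_m / (2.0 * math.sin(theta_rad))

# Illustrative: a 1.5-angstrom X-ray diffracting at a 30-degree angle
d = bragg_spacing(1.5e-10, math.radians(30))
print(f"plane spacing: {d * 1e10:.2f} angstroms")
```

Solving a full protein structure means measuring thousands of such peaks and computationally inverting them into an electron-density map, but each spot in the pattern obeys this same relation.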

“In this work, we were observing structural changes on a microsecond timescale that were essentially known previously in order to demonstrate the potential of this method,” Frank said. “We are looking forward to our next series of LCLS experiments in 2015, where we will enter uncharted territory and will attempt to probe structural changes in the same protein on much shorter time scales.”

“In the future we plan to study all sorts of enzymes and other proteins using this same technique,” Schmidt said. “This study shows that the molecular details of life’s chemistry can be followed using X-ray laser crystallography, which puts some of biology’s most sought-after goals within reach.”

Other contributors include Arizona State University; University of Hamburg; DESY in Hamburg, Germany; State University of New York, Buffalo; University of Chicago; and Imperial College London. The work was supported by the National Science Foundation, National Institutes of Health and Lawrence Livermore National Laboratory.


PMS No More? Tell Me More!

Rayshell Clapper for redOrbit.com – Your Universe Online

Every single month, women across the globe suffer from premenstrual syndrome, more commonly known as PMS. In fact, about 80 percent of women suffer from at least one symptom of PMS, as the Office on Women’s Health (OWH) website explains.

What exactly are the symptoms of PMS? According to the OWH website, the symptoms can be physical and emotional, and they include the following:

• Acne
• Swollen or tender breasts
• Feeling tired
• Trouble sleeping
• Upset stomach, bloating, constipation, or diarrhea
• Headache or backache
• Appetite change or food cravings
• Joint or muscle pain
• Trouble with concentration or memory
• Tension, irritability, mood swings, or crying spells
• Anxiety or depression

Each woman who has PMS experiences different symptoms. Any one of these could leave a woman feeling awful, but some women suffer more than one, which can be devastating. Although the symptoms may be temporary, each month brings them back.

Obviously, anyone who experiences any symptom of PMS would want treatment. Current treatments include lifestyle changes, alternative therapies, and medications, and OWH breaks them down: lifestyle changes focus on exercise, a healthy diet, good sleep, stress management, and avoiding smoking; alternative therapies include adding certain vitamins and minerals to one’s daily routine; and medications used to treat PMS include ibuprofen, ketoprofen, naproxen, and aspirin.

Despite all this, many women still suffer through the pains and troubles of PMS.

However, a collaborative study between the University of Bristol, University College London, and the University of São Paulo at Ribeirão Preto in Brazil has found a possible new treatment in the form of the anti-depressant Prozac, also called fluoxetine.

According to the University of Bristol, “PMS appears to be triggered by the fall in secretion of the ovarian sex steroid hormone progesterone that occurs towards the end of the menstrual cycle and leads to a decline in its breakdown product allopregnanolone, which acts in the brain as a potent sedative and tranquilizing agent.”

In layman’s terms, this means that women with PMS suffer from a type of natural drug withdrawal response. The brain reacts to the drop of progesterone and the subsequent decline of allopregnanolone.

The researchers found a connection between the anti-depressant fluoxetine and how the brain inhibits a certain enzyme that deactivates allopregnanolone. Because fluoxetine inhibits the enzyme, allopregnanolone does not decline, and the brain retains its necessary chemical balance. The connection was first recognized in rats, but the study found that fluoxetine had the same effect on the human brain, which means that it could potentially help women who suffer from PMS.

As the University additionally explains: “A paper published in the journal European Neuropsychopharmacology, coupled with the team’s recent findings published in the British Journal of Pharmacology, show that short-term treatment with a low dose of fluoxetine immediately prior to the rat’s premenstrual period not only raised brain allopregnanolone and prevented the development of PMS-like symptoms but also blocked the increase in excitability of brain circuits involved in mediating the stress and fear responses that normally occur during this phase of the cycle.”

Further good news from this study comes from the fact that the dose of fluoxetine was much lower than that given to patients dealing with depression. Moreover, the study showed that the lower dose of the anti-depressant worked much sooner to prevent symptoms of PMS than it does to treat depression. In those who take the drug for depression, it has to build up in the system for weeks before the patient experiences any benefits. For those who may take it to treat PMS, it started working within hours.

For the 80 percent or so of women who suffer from one or more of the aforementioned PMS symptoms, the study about fluoxetine and its potential benefits could mean the difference between happiness and depression, between living life and hiding due to the mental and physical pains. As human trials begin in Brazil, perhaps new drugs will develop as well. A future without PMS is certainly one most of us would like to see, men and women alike.


Don’t Worry About Changing Passwords…This App Will Do It For You!

Chuck Bednar for redOrbit.com – Your Universe Online
Security is of vital importance when surfing the Web or shopping online, but protecting your personal and financial information with dozens of highly complicated passwords can be such a bother – especially when you’re forced to change them every time you turn around due to yet another dime-a-dozen security threat.
There’s a reason why “password” has long been, and remains, one of the most popular passwords amongst computer users – we’re lazy when it comes to protecting our privacy and financial records. Fortunately, a new security tool from Dashlane makes it easier than ever to change your website passwords…by doing it for you.
As BGR’s Chris Smith explains, the Dashlane Password Changer logs into the various websites where the user wants to change his or her passwords, then automatically changes them while storing the new credentials and allowing easy access to social media websites, e-commerce retailers and other important webpages.
Dashlane CEO Emmanuel Schalit calls the tool “the antidote for future Heartbleeds,” telling Smith, “the ability to automatically change passwords is revolutionary. It provides users a highly effective way to stay safe from increasingly common security breaches on the scale of Heartbleed. Password Changer introduces a new paradigm of convenience and security for the consumer.”

While Dashlane is not the only company that offers management of online passwords, Marshall Honorof of Tom’s Guide reports that once the company launches its new service, it will be the first to allow users to change all of their passwords with a single click. Password Changer is now in beta and Honorof said that it will eventually be able to change your passwords by itself on a regular basis.
“While Password Changer is not yet automated, it does have access to more than 70 services,” he said. “Changing your password is as simple as choosing the service you want and typing in a new password for it. The Dashlane software takes care of the rest. It can even integrate services with two-factor authentication, which makes reaching for your phone to simply update a password unnecessary.”
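Dashlane hasn't published its internals, but the heart of any password changer is generating a strong random credential on the user's behalf. A minimal sketch using Python's standard `secrets` module (the function name and the mix-of-character-classes policy are illustrative, not Dashlane's actual rules):

```python
import secrets
import string

def generate_password(length=16):
    """Draw from a cryptographically secure RNG until the result
    contains at least one lowercase letter, uppercase letter, and digit."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)):
            return pw
```

The hard part of a product like Password Changer isn't the generation step but automating each site's change-password flow and storing the result securely, which is why per-site integrations are rolled out service by service.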
According to USA Today reporter Jefferson Graham, Dashlane will be free for its 2.5 million members on one computer, and will cost $39.99 for those wanting to keep track of passwords on PCs, phones, tablets and other devices. The service works with websites such as Amazon, Dropbox, eBay, Facebook, Google, PayPal and Twitter, and the company looks to expand it over the next few months.
And then there’s the whole automatic-password-changes-at-regularly-scheduled-intervals thing. In a statement, the company said the soon-to-be-introduced feature would allow users to, for instance, schedule Password Changer to automatically change some of their most essential passwords every 30 days, removing the need to worry about password security by taking users out of the credential-creation process entirely.
As Dashlane co-founder Alexis Fogel explained to Graham, “Managing different passwords is really painful… even if just one account gets hacked, your entire digital identity is at risk.” Whenever there is a security threat, “you have to change” passwords “over and over again. Now, just go to the app, and everything is changed for you.”
Schalit added that the company’s goal is to make passwords “irrelevant for consumers” and to create an app that people can use without ever needing to “know, type, remember or even change their passwords.”
Nothing could possibly go wrong in such a situation, right? Well, unless the app stops working or gets hacked, or the company goes out of business leaving users completely unable to access any of their online accounts.


Smoking Could Hamper Treatment For Alcohol Abuse

Provided by Cathy Wilde, University at Buffalo
A new study has shown that smoking can inhibit the success of treatment for alcohol abuse, putting people who are addicted to both tobacco and alcohol in a double bind.
According to findings by the University at Buffalo Research Institute on Addictions (RIA), clients who smoke have shorter stays in alcohol treatment programs and may have poorer treatment outcomes than non-smokers.
Kimberly Walitzer, PhD, deputy director and senior research scientist at RIA, led the study, which analyzed more than 21,000 adult treatment seekers from 253 community outpatient substance abuse clinics across New York State.
“The data suggest that smoking is associated with difficulties in alcohol treatment,” Walitzer says. “Tobacco smokers had shorter treatment durations and were less likely to have achieved their alcohol-related goals at discharge relative to their nonsmoking counterparts.
“This should be a major concern for treatment providers, as the majority of people with alcohol disorders are, in fact, smokers,” she explains.
According to the Centers for Disease Control, less than 20 percent of people in the US are regular smokers. However, a much higher percentage of people with alcohol use disorders are smokers. Further, both smoking and problem drinking are associated with life challenges such as unemployment, lack of high school diploma or GED, criminal justice involvement, mental illness and/or other substance abuse.
For women, these associations are even stronger. Although less than 15 percent of women smoke in the general community, Walitzer’s data indicate that 67 percent of women seeking alcohol treatment were smokers, compared to 61 percent of the men. Unfortunately, the study’s results show that women who smoke have even more difficult circumstances and poorer alcohol treatment outcomes than men who smoke.
What is the solution? “Previous research indicates that if people can quit smoking when entering alcohol treatment, they may have better alcohol outcomes,” Walitzer says. “However, simultaneous cessation is a task that is very challenging to accomplish.”
The study appears in the journal Substance Use and Misuse, and its co-authors are RIA’s Ronda L. Dearing, PhD, senior research scientist; Christopher Barrick, PhD, senior research scientist; and Kathleen Shyhalla, PhD, data analyst. It was funded by the National Institute on Alcohol Abuse and Alcoholism.

Why The FDA Sees Reusable Menstrual Pads As Medical Devices

Women menstruate. It’s a simple fact of life and usually not too much of an inconvenience to have access to some sort of sanitary product. In the year 2014, women have the choice of a number of products to use during their menstrual periods, and while all do more or less the same job, only menstrual…

Office Jerks Beware – Your Good Ideas May Not Always Be Welcomed By Colleagues

Provided by Springer

Being original AND disagreeable can backfire within a supportive group, say researchers

You don’t have to be a jerk to come up with fresh and original ideas, but sometimes being disagreeable is just what’s needed to sell your brainchild successfully to others. However, difficult or irritating people should be aware of the social context in which they are presenting their ideas. A pushy strategy will not always be equally successful, warn Samuel Hunter of Pennsylvania State University and Lily Cushenbery of Stony Brook University in the US, in an article in Springer’s Journal of Business and Psychology.

People are often labelled as jerks if they are disagreeable by nature, overly confident, dominant, argumentative, egotistic, headstrong or sometimes even hostile. It’s widely touted in the popular press that being so direct and forceful was what made innovators such as Steve Jobs and Thomas Edison successful.

Hunter and Cushenbery wanted to test whether people with disagreeable personalities are more innovative, and if it helps them down the line to get their fresh ideas accepted and used. In their first study, 201 students from a large Northeastern university in the US completed personality tests before strategizing together in groups of three to develop a marketing campaign. In the second study, involving 291 people, Hunter and Cushenbery used an online chat environment to investigate how being in the presence of other creative and supportive colleagues helped people to share their ideas more freely.

The first study showed that people do not need to be jerks to have fresh ideas. However, such an attitude helps when you want to steamroll your ideas so that others will accept them. Findings from the second study highlighted how important the social context is in which new ideas are being shared. Hunter and Cushenbery established that being disagreeable helps when you want to push your new ideas ahead or when you find yourself in a situation that is not necessarily open to original thoughts or changes. This obnoxious attitude can, however, backfire if you are working within a supportive, creative group in which ideas are shared freely.

“It seems that being a ‘jerk’ may not be directly linked to who generates original ideas, but such qualities may be useful if the situation dictates that a bit of a fight is needed to get those original ideas heard and used by others,” says Hunter in summarizing the results.

“Disagreeable personalities may be helpful in combating the challenges faced in the innovation process, but social context is also critical,” elaborates Cushenbery. “In particular, an environment supportive of original thinking may negate the utility of disagreeableness and, in fact, disagreeableness may hamper the originality of ideas shared.”

Reference: Hunter, S.T. & Cushenbery, L. (2014). Is being a jerk necessary for originality? Examining the role of disagreeableness in the sharing and utilization of original ideas. Journal of Business and Psychology. DOI 10.1007/s10869-014-9386-1


Berkeley Lab Particle Accelerator Sets New World Record

Chuck Bednar for redOrbit.com – Your Universe Online
Researchers from the US Department of Energy’s Lawrence Berkeley National Lab in California have set a new world record by exciting subatomic particles to the highest energies ever recorded from a compact accelerator.
According to the Berkeley Lab, the team that accomplished the feat used a specialized petawatt laser and charged-particle gas combination to accelerate electrons inside a nine-centimeter long plasma tube to an energy of 4.25 giga-electron volts.
This device is known as a laser-plasma accelerator, part of an emerging class of particle accelerators that physicists believe could shrink traditional, miles-long machines down to tabletop size. The record-setting short-distance acceleration corresponds to an energy gradient 1,000 times greater than that of traditional particle accelerators.
“This result requires exquisite control over the laser and the plasma,” explained Dr. Wim Leemans, director of the Accelerator Technology and Applied Physics Division at the Berkeley Lab and lead author of a new Physical Review Letters paper detailing the feat.
Traditional particle accelerators, such as CERN’s Large Hadron Collider, speed up particles by modulating electric fields inside a metal cavity, the researchers explained. This technique has a limit of approximately 100 mega-electron volts per meter before the metal breaks down, but laser-plasma accelerators take an entirely different approach.
In the Berkeley experiment, scientists injected a pulse of laser light into a thin, short straw-like tube that contained plasma. The laser creates a channel through the charged-particle gas as well as waves that trap free electrons and accelerate them to high energies. The record-setting effort was assisted by the Berkeley Lab Laser Accelerator (BELLA), a powerful laser capable of producing a quadrillion watts of power (a petawatt).
Dr. James Symons, associate laboratory director for Physical Sciences at Berkeley Lab, called it “an extraordinary achievement for Dr. Leemans and his team to produce this record-breaking result in their first operational campaign with BELLA.”
Dr. Leemans explained that he and his colleagues were “forcing this laser beam into a 500 micron hole about 14 meters away” and that “the BELLA laser beam has sufficiently high pointing stability to allow us to use it.” In addition, he said that the laser pulse, which fires once a second, is stable to within a fraction of a percent – something that “never could have happened” with less precise, harder-to-control lasers.
Considering the high energies required for the experiments, the researchers used computer simulations at the National Energy Research Scientific Computing Center (NERSC) to test the set-up of the accelerator and see how different parameters would alter the outcome. Eric Esarey, senior science advisor for the Accelerator Technology and Applied Physics Division at Berkeley Lab, noted that since small changes could have a drastic impact on the results, it was important to focus on the regions of operation and the best ways to control the accelerator.
“In order to accelerate electrons to even higher energies – Leemans’ near-term goal is 10 giga-electron volts – the researchers will need to more precisely control the density of the plasma channel through which the laser light flows,” the laboratory said. “In essence, the researchers need to create a tunnel for the light pulse that’s just the right shape to handle more-energetic electrons. Leemans says future work will demonstrate a new technique for plasma-channel shaping.”
—–

Sony Hackers Release Personal Info Of Movie Stars

Chuck Bednar for redOrbit.com – Your Universe Online
Demand ‘The Interview’ not be released
The hackers responsible for crippling Sony Pictures Entertainment’s computer system have released personal information of some of Hollywood’s biggest stars and demanded that a controversial film involving a plot to kill North Korean dictator Kim Jong-un not be released, various media outlets have reported.
On Tuesday morning, Reuters reporters Lisa Richwine and Jim Finkle said that the group known as Guardians of Peace (GOP) has for the first time publicly called upon the movie studio to cancel the upcoming release of “The Interview,” a comedy in which two TV journalists are recruited by the CIA to assassinate the North Korean leader.
According to Richwine and Finkle, the group posted a letter to a file-sharing website on Monday asking Sony to “stop immediately showing the movie of terrorism which can break the regional peace and cause the War!” The letter was signed by GOP, the same group that claimed responsibility for a cyberattack campaign against Sony Pictures that started back on November 24.
Unnamed sources close to the investigation of the Sony hacking have reportedly told Reuters that North Korea is a principal suspect in the attacks, which have crippled the film studio’s computer systems and have also resulted in employees and their families being threatened. A North Korean diplomat has denied his country’s involvement, the news organization said, despite previously referring to “The Interview” as “an act of war.”
The hackers have also released what is believed to be contact information for Brad Pitt, Julia Roberts and Tom Hanks, as well as the aliases used by several Hollywood actors and actresses while traveling secretly, according to Nick Allen of The Telegraph. Daniel Craig, Natalie Portman, Tobey Maguire and Sarah Michelle Gellar were among those who had their pseudonyms leaked by the hackers, he noted.
Allen said that the new leak was “the latest embarrassing tranche of information posted online by the Guardians of Peace,” and that the latest Sony Pictures documents “also included research on how popular individual movie stars are in different countries.” The group also posted a message saying, “You, Sony and FBI, cannot find us.”
The ongoing investigation also led to the discovery of threats sent by the group to Sony Pictures CEO Michael Lynton, Chairman Amy Pascal and other executives on November 21, three days before the attacks, said Mashable’s Lorenzo Franceschi-Bicchierai and Christina Warren. An apparently unread email discovered in Pascal’s account warned that the hackers would cause “great damage” and that Sony would be “bombarded” unless the group received “monetary compensation.”
“The hackers signed the email as ‘God’sApstls,’ a phrase that was also found inside some of the malware used in the attack on Nov. 24, which wiped many of Sony’s computer systems,” said Franceschi-Bicchierai and Warren, adding that the hackers “used what appears to be a throwaway Gmail address” and that the email may help explain “the chilling message that appeared on some Sony employees’ computers” and “made references to a previous, unheeded warning.”

—–

Experiments Indicate Strong Winds Needed To Shape Titan’s Dunes

Chuck Bednar for redOrbit.com – Your Universe Online
The presence of massive dunes, some over a mile wide and hundreds of yards high, on Saturn’s moon Titan has long puzzled scientists, but the authors of a new study appearing in the current edition of the journal Nature believe they have solved the mystery.
Titan, which has a dense atmosphere as well as lakes and rivers made up of components of natural gas, is home to dunes that can stretch out for hundreds of miles – even though data suggests the moon only experiences light breezes, not the strong winds required to move such massive amounts of sand.
However, Devon Burr, an associate professor in the Earth and Planetary Sciences Department at the University of Tennessee, and colleagues from Arizona State University have demonstrated that Titan’s winds must blow faster than previously believed in order to move the sand, thus potentially explaining how those dunes were formed.
Burr, James K. Smith, engineer and manager of ASU’s Planetary Aeolian Laboratory, and their fellow researchers conducted experiments at the Arizona State facility’s high-pressure wind tunnel and found that previous estimates of the wind speed required to blow sand-sized particles around on the moon are approximately 40 percent too low.
“It was surprising that Titan had particles the size of grains of sand – we still don’t understand their source – and that it had winds strong enough to move them,” Burr, lead investigator on the study, explained in a statement Monday. “Before seeing the images, we thought the winds were likely too light to accomplish this movement.”

Image Above: Cassini radar sees sand dunes on Saturn’s giant moon Titan (upper photo) that are sculpted like Namibian sand dunes on Earth (lower photo). The bright features in the upper radar photo are not clouds but topographic features among the dunes. Credit: NASA
The dunes, which were first discovered by the Cassini-Huygens orbiter and lander in 2004, begin to form when the wind collects loose particles from the ground and causes them to saltate – or hop – downwind. To understand the dunes, it is essential to identify the threshold wind speed that causes the particles that comprise them to start moving.
While geologists had already determined the threshold wind speeds required for dust and sand under various conditions on Earth, as well as on Venus and Mars, the bizarre conditions present on Titan made it more difficult. For one thing, given that the moon’s surface temperature is negative 290 degrees Fahrenheit, it is unlikely the particles found on its surface behave the same way as sand found on Earth, Mars or Venus.
Based on the Cassini observations and other data, scientists believe the sand is composed of small particles of solid hydrocarbons (or ice wrapped in hydrocarbons), with a density about one-third that of terrestrial sand, the researchers said. Furthermore, Titan’s low gravity (approximately one-seventh that of Earth’s) and the low density of the particles give them a weight of roughly four percent that of terrestrial sand, they added.
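Those two ratios multiply directly, since a grain’s weight scales with both its density and the local gravity. A quick illustrative calculation using the article’s approximate figures:

```python
# Weight of a Titan sand grain relative to an equal-volume terrestrial grain.
density_ratio = 1 / 3   # Titan particles ~1/3 the density of terrestrial sand
gravity_ratio = 1 / 7   # Titan's surface gravity ~1/7 of Earth's

weight_ratio = density_ratio * gravity_ratio
print(f"{weight_ratio:.1%}")  # -> 4.8%, consistent with "roughly four percent"
```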
Burr, Smith and their colleagues used a high-pressure wind tunnel previously used to study conditions on Venus, and increased its air pressure to about 12 times the surface pressure of Earth to recreate the conditions of Titan. They also compensated for the low density of Titan “sand” and the moon’s reduced gravity through numerical modeling.
Ultimately, the simulation “reproduces the fundamental physics governing particle motion thresholds on Titan,” the researchers explained, according to an ASU statement, adding that previous studies extrapolated data from experiments designed to mimic conditions on Earth and Mars and produced results considered to be questionable under Titan’s conditions.
The new wind tunnel experiments indicate “that the previous calculations for wind speeds necessary to lift particles were about 40 to 50 percent too slow,” the university said. “The new experiments show that near the surface of Titan, the most easily moved sand-size particles need winds of at least 3.2 miles per hour (1.4 meters per second) to start moving,” demonstrating that gusts would be needed to “blow them around and reshape the dunes.”
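The metric and imperial figures can be cross-checked with a unit conversion (our arithmetic, using the standard factor 1 m/s ≈ 2.237 mph; the back-solved old estimate is one reading of the “40 to 50 percent too slow” claim, not a number from the study):

```python
# Cross-check the reported threshold wind speed on Titan.
threshold_ms = 1.4                       # meters per second (new experiments)
threshold_mph = threshold_ms * 2.23694   # ~3.13 mph, reported as 3.2 mph

# If the old calculations were ~40-50% too slow, the earlier estimate was
# roughly threshold / 1.45 (using the midpoint; illustrative only):
old_estimate_ms = threshold_ms / 1.45
print(round(threshold_mph, 2), round(old_estimate_ms, 2))  # -> 3.13 0.97
```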
—–

Scientists Use Ancient Parchment DNA To Study Agricultural Development

Chuck Bednar for redOrbit.com – Your Universe Online
Millions of documents currently stored in archives throughout the world could provide clues to tracing agricultural development throughout the centuries thanks to a new technique of analyzing DNA found in ancient parchments, researchers from Trinity College Dublin and the University of York report in a new study.
In a paper published Monday in the Philosophical Transactions of the Royal Society B, the study authors report that state-of-the-art genetic sequencing techniques make it possible to obtain vital information from the DNA of the parchment on which historical texts are written.
The researchers used these methods to extract and analyze DNA and protein from tiny samples of 17th and 18th century parchments, collecting enough information to determine the types of animals from which the parchment sheets were made. They then compared the genomes of those animals to those of their modern relatives to discover how their genetic diversity was influenced by the expansion of agriculture.

This is an imaged parchment document from Yarburgh Muniments Lancashire Deeds YM. D. Lancs Jan. 13-14, 1576/7. Credit: By permission of The Borthwick Institute for Archives


The information gives scientists a new resource from which to study the development of livestock husbandry across the centuries, Trinity College Dublin Professor of Population Genetics Daniel Bradley and his colleagues report in the recently-published study. The work was funded by a grant from the European Research Council.
“This pilot project suggests that parchments are an amazing resource for genetic studies that consider agricultural development over the centuries,” Bradley said in a statement. “There must be millions stored away in libraries, archives, solicitors’ offices and even in our own attics. After all, parchment was the writing material of choice for thousands of years, going back to the Dead Sea Scrolls.”
“Wool was essentially the oil of times gone by, so knowing how human change affected the genetics of sheep through the ages can tell us a huge amount about how agricultural practices evolved,” he added.
Bradley and his colleagues from Trinity and the Centre for Excellence in Mass Spectrometry at York extracted DNA and collagen (protein) from two 2 cm x 2 cm samples of parchment provided by the Borthwick Institute for Archives.
One of the samples showed a strong affinity with northern Britain, specifically the region in which current black-faced breeds such as Swaledale, Rough Fell and Scottish Blackface are common, the researchers explained. The other sample showed a closer affinity with the Midlands and southern Britain where the livestock Improvements of the later 18th century were most active.
If similar levels of endogenous DNA content can be found in other parchments, the resulting genetic sequencing could provide new insights into the breeding history of sheep and other types of livestock before, during and after agricultural improvements that resulted in the emergence of regional breeds of sheep during the 18th century.
“We believe the two specimens derive from an unimproved northern hill-sheep typical in Yorkshire in the 17th century, and from a sheep derived from the ‘improved’ flocks, such as those bred in the Midlands by Robert Bakewell, which were spreading through England in the 18th century,” Matthew Collins, head of York’s BioArCh bioarchaeology center, said.
“This pilot project suggests that parchments are an amazing resource and there are millions stored away in libraries, archives, solicitors’ offices and private hands. They can give us significant data about the source animal and using them we can learn an enormous amount about the development of agriculture in the British Isles,” he added. “We want to understand the history of agriculture in these islands over the last 1,000 years and with this breath-taking resource we can.”

—–

BPA-Lined Containers May Be Linked To Increased Blood Pressure

Chuck Bednar for redOrbit.com – Your Universe Online
Eating or drinking from bottles or cans lined with the chemical Bisphenol A (BPA) could increase a person’s blood pressure, according to new research published Monday in the American Heart Association journal Hypertension.
While previous research has linked consumption of BPA, a substance used as an epoxy lining for cans and plastic bottles, with high blood pressure and heart rate variability, the new Seoul National University College of Medicine-led study indicates that exposure from canned beverages actually has the same effect.
“A 5 mm Hg increase in systolic blood pressure by drinking two canned beverages may cause clinically significant problems, particularly in patients with heart disease or hypertension,” study author Yun-Chul Hong, director of the South Korean university’s Environmental Health Center and chair of its Department of Preventive Medicine, said in a statement. “A 20 mm Hg increase in systolic blood pressure doubles the risk of cardiovascular disease.”
As part of their study, Hong and co-author Sanghyuk Bae recruited 60 adults over the age of 60 (mostly Korean women) from a local community center for a randomized crossover trial. Each participant visited the study site on three different occasions, and was randomly provided with soy milk in either cans or glass bottles.
Two hours after consumption of each beverage, urine samples were collected and tested for BPA concentration, and the participants’ blood pressure and heart rate variability were measured. The researchers found that urinary BPA concentration increased by as much as 1,600 percent in those consuming canned beverages versus glass-bottled ones.
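A 1,600 percent increase is easy to misread: it means the canned-beverage level was roughly seventeen times the glass-bottle baseline, not sixteen percent higher. A quick illustration (arbitrary units, not data from the study):

```python
# What a "1,600 percent increase" means relative to a baseline level.
baseline = 1.0                # arbitrary units of urinary BPA concentration
increase_pct = 1600
canned = baseline * (1 + increase_pct / 100)
print(canned / baseline)  # -> 17.0: canned levels ~17x the bottled baseline
```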
Hong and Bae explained that soy milk was the ideal beverage for the test, as it contains no ingredient known to cause blood pressure to become elevated. They believe their findings could help policy-makers, the healthcare industry and the general public become more aware of the cardiovascular health risks associated with BPA exposure.
“Because these results confirm findings from other studies, doctors and patients, particularly those with high blood pressure or heart disease, should be aware of the possible risks from increased blood pressure when consuming canned foods or beverages,” Hong told Stephen Reinberg of HealthDay News.
“Thanks to the crossover intervention trial design, we could control most of the potential confounders… Time variables, such as daily temperatures, however, could still affect the results,” he added. “I suggest consumers try to eat fresh foods or glass bottle-contained foods rather than canned foods and hopefully, manufacturers will develop and use healthy alternatives to BPA for the inner lining of can containers.”
Steven Gilbert, director and founder of the US Institute of Neurotoxicology and Neurological Disorders, emphasized the need to find safer alternatives to BPA for can linings, telling Reinberg that he was particularly concerned about children being exposed to the chemical. BPA has been linked to problems with physical and mental development, including gynecomastia (male breast growth), he said, as well as behavioral issues, obesity and type 2 diabetes.
However, Steven Hentges from the American Chemistry Council’s Polycarbonate/BPA Global Group disputed the study’s findings, telling HealthDay News that the claims that BPA, which has been declared safe by the US Food and Drug Administration (FDA), “’may pose a substantial health risk’ is a gross overstatement of the findings, an incredible disservice to public health, and runs contrary to years of research by government scientists.”
—–

Monkeying Around With Math – Dopamine Plays A Role In Cognitive Reasoning

Brett Smith for redOrbit.com – Your Universe Online

Dopamine is probably best known as the neurotransmitter that brings us happiness, contentment or pleasure, but a new study from researchers at the University of Tübingen’s Institute for Neurobiology in Germany has shown that dopamine also plays a role in cognitive tasks.

Published in the journal Neuron, the new study specifically found that dopamine plays a role in the brain’s processing of mathematics.

Previous research has shown that dopamine plays a role in motor skills as individuals suffering from Parkinson’s disease have markedly lower levels of the neurotransmitter in their brains. Researchers have also known that a dopamine imbalance can disrupt cognitive abilities – particularly in the prefrontal cortex, which is used in logical processing.

In the new study, researchers taught rhesus monkeys to answer simple “greater than” and “less than” math questions. Utilizing recent research as a basis, the scientists knew that particular neurons in the prefrontal cortex are involved in answering these kinds of questions. Studies show that one set of these “rule cells” in the prefrontal cortex is triggered when the “greater than” rule is used, and another set is triggered when the “less than” rule applies.

As these cells were being activated, relatively small quantities of various substances were released nearby. These chemicals can have the same impact as dopamine – or the opposite – and can interact with dopamine-sensitive neurons. The team saw that activation of the dopamine system allowed the “rule cells” to execute their function better and to differentiate more sharply between the “greater than” and “less than” rules.

Nerve cells in the prefrontal cortex (marked) can process “greater than” and “less than” rules better under the influence of dopamine. Credit: LS Tierphysiologie/Tübingen University

The study team said their work provides new details on how dopamine affects abstract thought processes – such as those required for the use of simple mathematical rules.

“With these findings, we are just starting to understand how nerve cells in the prefrontal cortex produce complex, goal-directed behavior,” said study author Torben Ott, a neurobiologist at University of Tübingen.

Andreas Nieder, also from Tübingen, noted that his team’s results could also have medical significance.

“These new insights help us to better interpret the effects of certain medicines which may be used for instance in cases of severe psychological disturbance,” Nieder said, “because such medications influence the dopamine balance in the prefrontal cortex in ways we do not understand well to date.”

Dopamine is often associated with drug and alcohol use, and a study last year showed that simply tasting alcohol can release a flood of dopamine completely independent from any effect the alcohol might have. The study team found this effect was much stronger in people with a family history of alcoholism.

“Sensory cues that are closely associated with drug intoxication (ranging from tastes and smells to the sight of a tavern) have long been known to spark cravings and induce treatment relapse in recovering alcoholics. Many neuroscientists believe dopamine plays a critical role in such cravings,” the study team said in a statement.

“We believe this is the first experiment in humans to show that the taste of an alcoholic drink alone, without any intoxicating effect from the alcohol, can elicit this dopamine activity in the brain’s reward centers,” said David A. Kareken, a professor of neurology at Indiana University.

—–


Pew Study Finds Americans Believe The Internet Makes Them Smarter

Chuck Bednar for redOrbit.com – Your Universe Online
The majority of American Internet users say they feel better informed than they were five years ago about which products and services to buy, as well as about national and international news and popular culture, a new Pew Research Internet Project report has found.
According to Kristen Purcell and Lee Rainie of the Pew Research Internet Project, 87 percent of the 1,066 adult Internet users participating in the study said that access to the online world and cell phones have improved their ability to learn new things. Of those, 53 percent said that the Internet and mobile devices had strengthened this area of their lives “a lot.”
“The vast majority of Americans believe their use of the web helps them learn new things, stay better informed on topics that matter to them, and increases their capacity to share ideas and creations with others,” Purcell and Rainie explained. “These generally positive attitudes are buttressed by the view that people like having so much information at their fingertips, rather than feeling information overload. Moreover, this positive judgment extends to the broader culture. Most believe that average Americans and US students are better informed than in the past.”
People under the age of 50, those living in higher-income households and those who have attained higher levels of education are the most likely to report that the Internet and cell phones have been especially helpful when it comes to learning new things, the report found. Seventy-two percent said that they liked having massive amounts of information at their fingertips, while only 26 percent said that they felt overwhelmed by the amount of content.
When asked about specific types of information, 81 percent of those who responded said that the Web and cell phones had made them more knowledgeable about products and services than they were five years ago, while 75 percent felt that way about national news, 74 percent about global news and 72 percent about pop culture.
Furthermore, more than two-thirds of them said they knew more about their friends than they did five years ago thanks to these technological tools, and 60 percent felt likewise about their family members. However, only 49 percent said they felt better informed about civic and government activities in their community thanks to the Internet and mobile devices, and just 39 percent felt better informed about their neighbors and their neighborhood.
Nearly three-fourths of Internet users said that digital technologies had improved their ability to share their ideas and creations with others, an increase from 55 percent in 2006. Purcell and Rainie said that one of the primary reasons for this increase is the rise of social networks such as Facebook and Twitter. Such websites were not widely used eight years ago, but are currently used by more than two-thirds of all Internet users, according to Pew.
“Overall, internet users believe that both the average American and the average student today are better informed thanks to the internet,” the Pew representatives said. Seventy-six percent of adults said that the Internet had made the average US resident better informed, while just eight percent said it had made them less knowledgeable. Seventy-seven percent felt it made modern-day students better informed, while eight percent again believed the opposite was true.
“Perhaps surprisingly in both cases, internet users under age 30 are less likely to believe the internet is making average Americans or today’s students better informed,” Purcell and Rainie added. “Instead, they are more likely than their older counterparts to say the internet has had no real impact – 19 percent of young adults say so, compared with 9 percent of those ages 30 and older.”
—–

Voters More Willing Than Consumers To Pay For Food Safety

Provided by Lauren Milideo, Penn State

Voters are more willing to pay for a decreased risk of food-related illness than consumers, but female consumers are more willing to pay than male consumers, according to an international team of researchers.

“The question is, what would consumers prefer?” said Amit Sharma, associate professor of hospitality management and finance, Penn State. “Would they prefer a market-driven, or a policy-driven approach? Either of those two approaches could lead to some price increase. Improving quality costs money, and food safety is no different.”

Sharma and colleagues wanted to know whether people would pay more for a lowered risk of a food-related illness, and in particular whether their choices would vary if they were thinking about the issue from a consumer perspective as opposed to a voter perspective.

“The question is whether it matters whether we elicit consumer or citizen preferences when valuating food safety,” said the researchers in a recent issue of Food Policy.

The researchers created two surveys for distribution to participants. One survey asked about the participant’s willingness to pay more at a neighborhood restaurant to ensure reduced risk of food-related illness. The other asked whether the participant would vote yes or no for regulations to reduce this risk that would result in the same increase in restaurant prices. Participants were asked about their willingness to pay increased amounts — from none, to 1 to 5 percent, to over 30 percent of the meal price — for a lowered risk of food-related illness. Respondents answered the survey for a 25 percent, 50 percent, and 75 percent reduction in risk.

Over the course of a semester, the team collected survey responses from 864 people at a university campus restaurant. Participants covered a range of ages, income levels and education levels and included local residents, students and university employees.

The researchers also developed models, taking into account variables including participant gender and age, to determine whether the participants responded to the survey differently as consumers than as voters.

The researchers found that, in the total sample, voters and consumers varied significantly in their willingness to pay for decreased risk. Furthermore, among consumers, women were more willing than men to pay for a reduced risk of illness, and in particular, older women were willing to pay more than young men.

“This indicates that while men and women have a similar (willingness to pay) for a reduction in the foodborne risk level at a society level, women are more willing than men to pay to protect themselves when at a restaurant,” the researchers said.

Neither voters nor consumers differed in their willingness to pay at different risk levels. However, the overall difference between responses in a voting context, as opposed to as consumers, reflects participants’ varying reactions to the cause of a price increase, said Sharma.

“An increase in price because of a policy, or an increase in price because of a vote that led to a government policy, would be more acceptable than if the restaurants had implemented this by themselves,” said Sharma.

This is important if policy decisions rely on consumer-based data that may not accurately convey people’s willingness to pay more for a reduced food-related risk.

“We might come to undervalue what citizens would truly pay for safer foods versus if this was more of a market driven or in this case a buying scenario,” said Sharma. “If this was driven by policy, then it’s likely that citizens would be willing to pay a higher price.”

—–


Hormone Reduces Calorie Burning, Contributes To Obesity

Provided by Amanda Boundris, McMaster University

Researchers from McMaster have identified an important hormone that is elevated in obese people and contributes to obesity and diabetes by inhibiting brown fat activity.

Brown adipose tissue — widely known as “brown fat” — is located around the collarbone and acts as the body’s furnace to burn calories. It also keeps the body warm. Obese people have less of it, and its activity is decreased with age.

Until now, researchers haven’t understood why.

There are two types of serotonin. Most people are familiar with the first type, found in the brain and central nervous system, which affects mood and appetite. But this makes up only five percent of the body’s serotonin.

The lesser-known peripheral serotonin circulates in the blood and makes up the other 95 percent of the body’s serotonin. McMaster researchers have discovered that this kind of serotonin reduces brown fat activity or “dials down” the body’s metabolic furnace.

The study, published Monday in Nature Medicine, is the first to show that blocking the production of peripheral serotonin makes the brown fat more active.

From left to right: Co-authors Waliul Khan, associate professor of pathology and molecular medicine and Gregory Steinberg, professor of medicine, with lead author and post-doctoral fellow, Justin Cran. Credit: McMaster University

“Our results are quite striking and indicate that inhibiting the production of this hormone may be very effective for reversing obesity and related metabolic diseases including diabetes,” said Gregory Steinberg, the paper’s co-author and professor of medicine at the Michael G. DeGroote School of Medicine. He is also co-director of MAC-Obesity, the Metabolism and Childhood Obesity Research Program at McMaster.

“Too much of this serotonin acts like the parking brake on your brown fat,” he explained. “You can step on the gas of the brown fat, but it doesn’t go anywhere.”

The culprit responsible for elevated levels of peripheral serotonin may also have been found.

“There is an environmental cue that could be causing higher serotonin levels in our body and that is the high-fat western diet,” said Waliul Khan, co-author, associate professor of pathology and molecular medicine for the medical school and a principal investigator at Farncombe Family Digestive Research Institute. “Too much serotonin is not good. We need a balance. If there is too much, it leads to diabetes, fatty liver and obesity.”

The majority of serotonin in the body is produced by tryptophan hydroxylase (Tph1). The McMaster team found that when they genetically removed or inhibited this serotonin-making enzyme, mice fed a high-fat diet were protected from obesity, fatty liver disease and pre-diabetes due to an enhanced ability of the brown fat to burn more calories.

Notably, inhibiting peripheral serotonin doesn’t affect serotonin in the brain or central nervous system function, said Steinberg.

This is in contrast to earlier weight loss drugs which worked to suppress appetite by affecting levels of brain serotonin, but were associated with problems including cardiac complications and increased risk of depression and suicide.

“Moving forward, we think it’s a much safer method to work with increasing energy expenditure instead of decreasing the appetite, which involves more risks,” said Steinberg.

The researchers conclude that reducing the production of serotonin by inhibition of Tph1 “may be an effective treatment for obesity and its comorbidities,” and so the team is now working on a pharmacological “enzyme blocker.”

This study, conducted over five years, was supported by funding from the Canadian Diabetes Association, the Canadian Institutes of Health Research, Crohn’s and Colitis Canada, MAC-Obesity and the Natural Sciences and Engineering Research Council of Canada.

—–
