Work On Fluorescence Microscopy Earns Three Scientists 2014 Nobel Prize In Chemistry

Chuck Bednar for redOrbit.com – Your Universe Online
Two Americans and one German scientist have been awarded the 2014 Nobel Prize in Chemistry for using fluorescence to improve the resolution of microscopes, the Royal Swedish Academy of Sciences announced on Wednesday.
Eric Betzig of the Janelia Farm Research Campus at the Howard Hughes Medical Institute in Ashburn, Virginia; Stefan W. Hell of the Max Planck Institute for Biophysical Chemistry and the German Cancer Research Center; and William E. Moerner of Stanford University will split prize money of eight million kronor (just over $1.1 million).
According to the Associated Press (AP), the academy honored the trio for “the development of super-resolved fluorescence microscopy,” which they said allowed them to bypass the maximum resolution of traditional optical microscopes. “Their ground-breaking work has brought optical microscopy into the nanodimension,” they added.
Betzig, Hell and Moerner were announced as the Nobel Prize winners at a press conference in Sweden, and BBC News reported that their names will be added to a list of 105 other Chemistry laureates recognized by the academy since 1901. Their work made it possible to study molecular processes in real time, said committee chair and Lund University chemist Sven Lidin.
Prior to their work, it was believed that optical microscopes would never be able to yield a resolution better than 0.2 micrometers, or half the wavelength of light. Betzig, Hell and Moerner were able to circumvent that limitation, however, by using fluorescent molecules which made it possible to study the activity of individual molecules inside living cells, including the aggregation of proteins associated with various diseases, the academy said.
The academy recognized two separate principles: one credited to Hell for developing stimulated emission depletion (STED) microscopy, which uses one laser beam to stimulate fluorescent molecules to glow and another to cancel out all but those in a nanometer-sized volume; and the other to Betzig and Moerner who, working separately, laid the foundation for a second method, single-molecule microscopy.
Their method, the academy explained, relies upon the possibility of being able to activate or deactivate the fluorescence of individual molecules. Researchers image the same area several times, allowing only a few interspersed molecules to glow during each session. They then superimpose those images, which yields a dense super-image resolved at the nanolevel, they added. The method was first used by Betzig in 2006.
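The logic of that superimposition step can be sketched in a few lines of Python. This is a deliberately simplified illustration (the function name and the trivial "localization" step are my own, not Betzig's actual algorithm, which fits each emitter's point-spread function):

```python
def superimpose_rounds(molecules, per_round):
    """Toy sketch of single-molecule localization microscopy.

    `molecules` is a list of (x, y) emitter positions at the nanometer
    scale. Imaged all at once, they would blur together below the
    diffraction limit; but if only a sparse, well-separated subset glows
    in each imaging round, every active emitter can be localized
    individually. Superimposing the localizations from all rounds
    recovers the full point set at nanometer resolution.
    """
    localized = []
    remaining = list(molecules)
    while remaining:
        # Activate only a sparse subset of fluorophores this round.
        active, remaining = remaining[:per_round], remaining[per_round:]
        # Each isolated emitter is localized precisely (here, trivially).
        localized.extend(active)
    return localized
```

For example, `superimpose_rounds([(0.10, 0.20), (0.15, 0.20), (0.90, 0.40)], per_round=1)` images each of the three closely spaced emitters in its own round and returns all three positions, even though the first two would be indistinguishable in a single conventional exposure.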
Upon hearing that he was named as one of the winners of the award, which the Wall Street Journal refers to as “the most prestigious prize for chemistry research,” Hell told reporters that he was “totally surprised” and “couldn’t believe it.” Moerner’s wife, who learned from the AP her husband was named a Nobel Laureate in Chemistry, said that she was “delighted” and “thrilled,” and she “had no idea it would be these three individuals.”
In a statement, American Chemical Society (ACS) President Dr. Tom Barton congratulated the winners, adding that their work had “allowed us to see the previously unseen – lifting the veil on bacteria, viruses, proteins and small molecules. This work is a most appropriate choice to be honored as it represents the confluence of biology, physics and chemistry. This is a wonderful example of chemistry as the enabling science.”

Facebook’s New Mobile Advertising Network Now Open For Business

Chuck Bednar for redOrbit.com – Your Universe Online
Facebook’s mobile advertising network, which allows ads to be placed not just throughout the social media website but also in third-party mobile apps, is getting set to expand, various media outlets reported on Tuesday.
According to Cade Metz of Wired.com, the social network has extended an invite to all mobile developers and publishers to join the Facebook Audience Network. The network, which launched in January, allows developers and publishers to sign up to display Facebook ads in their apps in exchange for a piece of the revenue, he added.
In an update posted Tuesday, Facebook’s Tanya Chen said that the network, which she claimed offered “a quick and easy way to connect these apps with the more than 1.5 million active advertisers on Facebook,” had been “optimized… to improve performance” in recent months and was now being formally launched to developers and publishers worldwide.

Chen further touted the network’s effectiveness by referencing comments from gaming app developer Glu and music sharing app Shazam. Chris Akhavan, President of Publishing at Glu Mobile, said that their CPM with Facebook’s Audience Network was twice as good as with the service’s competition, and Shazam reported a 37 percent increase of revenue from ad networks since joining.
Furthermore, David Cohen of AllFacebook noted that Walgreens witnessed click-through rates increase at least fourfold during its test of the Facebook Audience Network. He also reports that HarperCollins reported a 16 percent increase in impressions. Game developer Machine Zone also said its cost per install was down and that it had found a new audience for its products.
As Metz explained, promoting success stories like this is part of Facebook’s sales pitch that revenue obtained by using their network “will be higher than what developers and publishers could get from other mobile ad networks, because the Facebook network lets advertisers target users on mobile apps in much the same way they target users on Facebook proper. Advertisers, you see, will pay a premium for such targeting.”
The Wired.com business and enterprise editor added that the social network is looking to secure a greater share of the $140 billion digital advertising market – despite already holding a solid lead over Google and other less prominent companies. Since Facebook has access to so much personal data about its users, he explained, it is able to more closely match ads based on age, gender, and interests and can charge more for those ads as a result.
While Facebook assures users that ad matches are made anonymously and that no new forms of targeting will be used as part of the network, Zach Miners of IDG News Service noted that “any broader leveraging of Facebook data is bound to raise privacy concerns among some people.”
“Similar questions around privacy arose last week with the rollout of Facebook’s Atlas ad server. Atlas also uses Facebook data to deliver ads outside the social network, though those ads may not appear on Facebook at all,” he added. Atlas, which is separate from the Audience Network, allows companies to secure ads from various sources and display them across a plethora of other websites and services, Metz noted.

New AAA-Sponsored Study Casts Doubts On Safety Of Hands-Free Device Use By Drivers

Chuck Bednar for redOrbit.com – Your Universe Online
While most people tend to believe the use of hands-free devices while operating a motor vehicle is a safe practice and does not cause drivers to become distracted, new research from the University of Utah and the AAA Foundation for Traffic Safety suggests otherwise.
According to Hadley Malcolm of USA Today, the organizations tested a variety of voice-activated car systems (including Apple’s Siri, Chevrolet’s MyLink, and Toyota’s Entune). Each system was then rated on a scale of 1 to 5, with 1 meaning the system did not distract the driver and 5 indicating it was highly demanding mentally.
In one of two studies conducted by University of Utah psychology professor David Strayer and his colleagues, the researchers found that using voice commands to make phone calls or tune the radio using MyLink distracted drivers the most, earning it a score of 3.7. Mercedes’ COMAND system (with a score of 3.1), MyFord Touch (3.0) and Chrysler’s UConnect (2.7) fared better, but each of them diverted the driver’s attention more than a regular cell phone conversation would have.
With a score of 1.7, Entune was found to be the least distracting system, requiring only as much attention as listening to an audiobook, Strayer and his fellow authors reported. The next least distracting system was Hyundai’s Blue Link (2.2), which the research team compared to having a conversation with a passenger on the distraction scale.
In a separate study, they used Siri to send and receive texts, post to social media and use a calendar. As it turns out, Apple’s iPhone AI program received a worse rating than any of the in-car systems, scoring 4.14 even when modified for use as a hands-free, eyes-free device. In addition, test drivers using Siri twice rear-ended another car, said Associated Press (AP) reporter Joan Lowy.
“Even though your car may be configured to support social media, texting and phone calls, it doesn’t mean it is safe to do so,” Strayer said in a statement, adding that he and the AAA are urging drivers to minimize the use of in-vehicle technology that could prove distracting.
The findings of their research, he said, could be used to help automakers tweak future voice-controlled systems in order to make them “simpler and more accurate” in their responses. Strayer added that he and his colleagues were “concerned we may be making distraction problems worse by going to voice-activated technology, especially if it’s not easy to use. But the reality is these systems are here to stay. Given that, let’s make the technology as safe as possible with the goal of making it no more distracting than listening to the radio.”
The research involved 162 Utah students and other volunteers who performed a variety of tasks using the various voice-based interactive technologies while looking at a computer screen as they operated a driving simulator. They also drove actual vehicles on a loop through Salt Lake City’s Avenues district, during which time they were accompanied by at least one other researcher for data collection and safety purposes.
Their work follows a 2013 AAA-University of Utah study which demonstrated that using hands-free devices to talk, text or send e-mail could be distracting and risky for motorists. It was during that research the researchers established the five-point mental workload scale and gave distraction ratings of 1.21 for listening to the radio, 1.75 for listening to an audiobook, 2.27 for using a hands-free cell phone, 2.33 for talking with a passenger, 2.45 for using a hand-held cell phone and 3.06 for using a speech-to-text system used to play and compose emails and texts.
“Technologies used in the car that rely on voice communications may have unintended consequences that adversely affect road safety,” explained Peter Kissinger, President and CEO of the AAA Foundation for Traffic Safety. “The level of distraction and the impact on safety can vary tremendously based on the task or the system the driver is using.”

How Much Sodium Are You Getting From Your Sandwich?

Brett Smith for redOrbit.com – Your Universe Online
Sandwiches are responsible for about one-fifth of the average American’s sodium intake, according to a new study from the United States Department of Agriculture (USDA).
Published in the Journal of the Academy of Nutrition and Dietetics, the new study was part of the national “What We Eat in America NHANES 2009-2010” survey and it found that sandwiches comprised a much higher percentage of total sodium intake for the average American than previously believed.
The USDA researchers said the survey used a different coding system than past surveys which allowed for the reporting of various components of a sandwich. For example, a survey respondent eating a ham sandwich would be able to report specific amounts of bread, ham, tomato, cheese, mayonnaise, etc. The survey also used a single code for fast food sandwiches, like a quarter-pound cheese burger or turkey sub.
Previous research had used a single food code for sandwiches and this led to the conclusion that sandwiches represent only about 4 percent of total sodium intake.
“In 2009-2010, only about 20 percent of all sandwiches were represented by a single food code,” explained study author Rhonda Sebastian, a nutritionist at the USDA’s Agricultural Research Service (ARS). “For that reason, previously published estimates of sandwich contributions to sodium intake that were based on only single-code sandwiches are considerably underestimated.”
The current USDA guidelines, released in 2010, recommend a maximum daily intake of 2,300 milligrams of sodium. For adults over 50, African-Americans, and those with certain medical conditions, the USDA recommends a maximum daily sodium intake of only 1,500 milligrams. According to the new study, “Sandwiches alone contribute 30 percent of the less restrictive guideline and 46 percent of the stricter guideline.”
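Those two percentages are mutually consistent: both work out to roughly the same absolute amount of sodium, as a quick back-of-the-envelope check shows (variable names here are illustrative, not from the study):

```python
# Daily sodium limits from the 2010 USDA guidelines, in milligrams.
general_limit = 2300   # general population
strict_limit = 1500    # adults over 50, African-Americans, certain conditions

# Sandwiches' reported share of each guideline.
from_general = 0.30 * general_limit   # 30% of 2,300 mg
from_strict = 0.46 * strict_limit     # 46% of 1,500 mg

# Both shares come to about 690 mg of sodium per day from sandwiches alone.
print(round(from_general), round(from_strict))
```

In other words, the study attributes a fixed daily amount (about 690 mg) to sandwiches; it simply looms larger against the stricter limit.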
USDA scientists also discovered that people who reported consuming sandwiches had noticeably greater energy intake than those who did not. Those who ate a sandwich on the survey day took in an average of around 300 kilocalories more than those who did not record eating a sandwich. Sandwich eaters also had greater overall sodium intake, averaging nearly 600 milligrams more per day than those who didn’t report eating a sandwich.
“The unanticipated finding that sandwich consumption is associated with higher overall intake of energy underscores the importance of making healthful choices of sandwich ingredients,” said study author Cecilia Wilkinson Enns, an ARS nutritionist. “Many sandwiches, such as burgers and franks, and common sandwich components, such as yeast breads, cheese, and cured meats, are among the top contributors not only to sodium but also to energy in the diets of adult Americans.”
The researchers also found that the higher sodium intake among sandwich reporters correlated with their higher daily energy intake.
“Regardless of sandwich reporting status, sodium density was approximately 1,700-1,800 mg per 1,000 kilocalories, suggesting that the higher sodium levels of sandwich reports are explained by their higher energy intake,” Sebastian said.
In light of the study results, the USDA recommends reconsidering sandwiches when looking to reduce both sodium and caloric intake.
“Due to sandwiches’ frequent consumption and considerable contributions to sodium intake, substituting lower-sodium for higher-sodium ingredients in sandwiches could significantly impact sodium intakes,” Wilkinson Enns said.

Scientists Develop Barcoding Tool For Stem Cells

Provided by Joseph Caputo, Harvard University
New technology that tracks the origin of blood cells challenges scientific dogma
A 7-year-project to develop a barcoding and tracking system for tissue stem cells has revealed previously unrecognized features of normal blood production: New data from Harvard Stem Cell Institute scientists at Boston Children’s Hospital suggests, surprisingly, that the billions of blood cells that we produce each day are made not by blood stem cells, but rather their less pluripotent descendants, called progenitor cells. The researchers hypothesize that blood comes from stable populations of different long-lived progenitor cells that are responsible for giving rise to specific blood cell types, while blood stem cells likely act as essential reserves.
The work, supported by a National Institutes of Health Director’s New Innovator Award and published in Nature, suggests that progenitor cells could potentially be just as valuable as blood stem cells for blood regeneration therapies.
This new research challenges what textbooks have long stated: that blood stem cells maintain the day-to-day renewal of blood, a conclusion drawn from their importance in re-establishing blood cell populations after bone marrow transplants—a fact that still remains true. But because of a lack of tools to study how blood forms in a normal context, nobody had been able to track the origin of blood cells without doing a transplant.
Boston Children’s Hospital scientist Fernando Camargo, PhD, and his postdoctoral fellow Jianlong Sun, PhD, addressed this problem with a tool that generates a unique barcode in the DNA of all blood stem cells and their progenitor cells in a mouse. When a tagged cell divides, all of its descendant cells possess the same barcode. This biological inventory system makes it possible to determine the number of stem cells/progenitors being used to make blood and how long they live, as well as answer fundamental questions about where individual blood cells come from.
“There’s never been such a robust experimental method that could allow people to look at lineage relationships between mature cell types in the body without doing transplantation,” Sun said. “One of the major directions we can now go is to revisit the entire blood cell hierarchy and see how the current knowledge holds true when we use this internal labeling system.”
“People have tried using viruses to tag blood cells in the past, but the cells needed to be taken out of the body, infected, and re-transplanted, which raised a number of issues,” said Camargo, who is a member of Children’s Stem Cell Program and an associate professor in Harvard University’s Department of Stem Cell and Regenerative Biology. “I wanted to figure out a way to label blood cells inside of the body, and the best idea I had was to use mobile genetic elements called transposons.”
A transposon is a piece of genetic code that can jump to a random point in DNA when exposed to an enzyme called transposase. Camargo’s approach works using transgenic mice that possess a single fish-derived transposon in all of their blood cells. When one of these mice is exposed to transposase, each of its blood cells’ transposons changes location. The location in the DNA where a transposon moves acts as an individual cell’s barcode, so that if the mouse’s blood is taken a few months later, any cells with the same transposon location can be linked back to its parent cell.
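The readout step of this barcoding scheme amounts to a grouping operation. This toy Python sketch (names are illustrative, not from the paper) captures the key idea: cells sampled months later that share a transposon insertion site must descend from the same tagged parent cell:

```python
from collections import defaultdict

def reconstruct_clones(sampled_cells):
    """Group sampled blood cells by their transposon 'barcode'.

    `sampled_cells` is a list of (cell_id, insertion_site) pairs, where
    the insertion site is the genomic position the transposon jumped to
    in that cell's ancestor. Cells sharing an insertion site form one
    clone: the descendants of a single tagged stem or progenitor cell.
    """
    clones = defaultdict(list)
    for cell_id, site in sampled_cells:
        clones[site].append(cell_id)
    return dict(clones)

# The number of distinct barcodes seen in circulating blood estimates
# how many stem/progenitor cells are actively contributing.
sample = [("cell_a", 1017), ("cell_b", 1017), ("cell_c", 5523)]
clones = reconstruct_clones(sample)
```

Here `cell_a` and `cell_b` carry the same insertion site, so they are assigned to one clone, while `cell_c` traces back to a different tagged cell; counting clones and their sizes is what lets the researchers ask how many cells feed blood production and for how long.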
The transposon barcode system took Camargo and Sun seven years to develop, and was one of Camargo’s first projects when he opened his own lab at the Whitehead Institute for Biomedical Research directly out of grad school. Sun joined the project after three years of setbacks, and accomplished an experimental tour de force to reach the conclusions in the Nature paper, which includes data on how many stem cells or progenitor cells contribute to the formation of immune cells in mouse blood.
With the original question of how blood arises in a non-transplant context answered, the researchers are now planning to explore many more applications for their barcode tool.
“We are also tremendously excited to use this tool to barcode and track descendants of different stem cells or progenitor cells for a range of conditions, from aging, to the normal immune response,” Sun said. “We first used this technology for blood analysis, however, this system can also help address basic questions about cell populations in solid tissue. You can imagine being able to look at tumor progression or identify the precise origins of cancer cells that have broken off from a tumor and are now circulating in the blood.”
“I think that not only for the blood field, this can change the way people look at stem cell and progenitor relationships,” Camargo added. “The feedback that we have received from other experts in the field has been fantastic. This can truly be a groundbreaking technology.”
This research was supported by an NIH Director’s New Innovator Award (DP2OD006472) and the Harvard Stem Cell Institute.

How to Detect If You Have Fibromyalgia

Fibromyalgia is a disorder that affects over ten million Americans. As with many diseases, if fibromyalgia is caught early, doctors can keep it from progressing. You don’t want to wait until it is too late. How would you know if you have fibromyalgia? What are the symptoms? Well, before you read about the symptoms and causes of fibromyalgia, it is important that you understand what the disorder is.

What is Fibromyalgia?

Fibromyalgia (FMS) is a chronic pain condition that is known to cause intense pain all over the body – especially at points called “trigger points.” There is usually painful tenderness and achiness in a person’s muscles and tissues. Although most people are known to develop this disorder later in life, young people in their 20s and 30s can develop it too.

What causes Fibromyalgia?

Research about the causes of this disorder is still going on. Because the causes of the disorder are still a bit of a mystery to doctors and scientists, diagnosing the disease and treating the disease can be difficult.

Misdiagnoses of Fibromyalgia

It is not surprising if you and your doctor misdiagnose fibromyalgia; it happens very often. As mentioned before, research into this disorder and its causes is ongoing. Because of this, many doctors and patients commonly misdiagnose fibromyalgia and fail to identify the underlying cause of a patient’s symptoms. Here is a list of some disorders that are frequently misdiagnosed as fibromyalgia:

Gluten Intolerance

Gluten is a protein found in wheat and some other grains. Even though most people assume that gluten intolerance has to do only with the foods we eat or with our digestive system, this is often not the case. Since gluten intolerance is a type of autoimmune disease, it can present itself as a neurological condition. People who have gluten intolerance might have pain, cognitive impairment, and fatigue. Because many of the symptoms of gluten intolerance are the same as the symptoms of FMS, it is easy for doctors and patients to misdiagnose FMS as gluten intolerance. Even if your gluten intolerance is not linked with FMS, it could be linked to other diseases: research on gluten intolerance has shown that it is associated with over 50 different diseases.


Vitamin Deficiencies

People with FMS often suffer from vitamin deficiencies. You can find out if you have a vitamin deficiency with a simple blood test. The blood test would show you and your doctor your vitamin levels. Then, you and/or your doctor can determine if your levels are too high, too low, or normal, and adjust your diet or identify nutritional supplements needed to bring your levels back to normal.

Mycotoxins

Mycotoxins are toxic substances released by molds in the environment. You could have mycotoxins in your body because you were exposed to them through skin contact or through the air. A simple blood test will show whether you have elevated levels of mycotoxins. If you do, you may experience symptoms similar to those of FMS patients. Because of the similar symptoms, doctors and patients often misdiagnose FMS as high levels of mycotoxins in the body.

Mercury Toxicity

Do you have silver fillings in your teeth? Did you know that these silver fillings contain mercury? We have always been told to stay away from mercury, so how could it be safe in our mouths? In fact, every time we chew, grind our teeth, or have a teeth cleaning, some mercury from the silver fillings can be released into our bodies. The amounts released this way are so small that they are generally considered harmless. However, exposure to large amounts of mercury can cause mercury poisoning, which is very dangerous and can cause FMS along with many other diseases and disorders.

There is always the possibility of misdiagnoses of FMS, just like any other disease.

Symptoms of Fibromyalgia

Now that you know a little bit about the disorder and its common misdiagnoses, you are ready to learn about the symptoms. Besides tests and thorough examinations, symptoms are a useful way to gauge whether you have FMS. These symptoms are not a checklist: you can’t simply check off the symptoms you have, total up your score, and see if you have FMS. But if you have most of the symptoms listed, you should be tested for FMS to be safe.

  • Pain
  • Fatigue
  • Anxiety
  • Depression
  • Lack of energy
  • Pain in muscle and tissue
  • Feeling of tenderness and achiness on the body
  • Sleeping problems
  • Change in moods and/or mood swings

If you have a majority of these symptoms, you should talk to your doctor about fibromyalgia and get tested. You want to get tested early so that the disease can be kept from progressing. If you do have FMS, you don’t want it to get worse.

The most common symptom that is linked with FMS is pain. If you are feeling pain in your muscles and tissues or pain in most of your body, you should talk to your doctor about FMS. It is better to be safe than sorry. Most people with FMS have said that they experienced pain before any other symptom.

Fibromyalgia is not a disease to be taken lightly. It is very serious, and it affects over ten million Americans. If you have symptoms, go to your doctor. Do not wait and assume that the symptoms will go away by themselves. Even if you and your doctor don’t think that you have FMS, it is possible that you have misdiagnosed FMS as some other disease or disorder. The tests for verifying the cause of your symptoms are simple to do, and receiving the correct treatment can greatly improve your quality of life.

Fibromyalgia Dieting For Symptom Relief

Fibromyalgia is a disease that afflicts men and women in disproportionate numbers. This condition affects pain transmission in the central nervous system by skewing neural pathways. This culminates in chronic pain, along with a host of other symptoms.

Fibromyalgia patients develop secondary symptoms, including fatigue, sleep deprivation, irritable bowel syndrome and memory disturbances. This disease stems from a number of contributing factors, including genetics, environment, and psyche, as well. As it turns out, chronic pain is not merely a sensory problem, but rather, a cognitive-emotional one, as well.

How Is Fibromyalgia Generally Treated?

Research has uncovered a link between fibromyalgia and low serotonin. For this reason, antidepressants are often the first prescribed course of treatment for fibromyalgia patients. They are commonly followed by additional remedies, including anticonvulsants and muscle relaxers.

Healthy serotonin production can improve mental acuity, pain perception and pain management, as well. Muscle relaxants and anticonvulsants simply treat the resulting muscle tension and stiffness of this condition. Furthermore, localized pain treatments, such as steroids, are injected at the site for temporary relief.

Medication alone will not suffice, in terms of yielding long-term comfort and wellness. A safe and professional medical regimen must be counterbalanced with a nutritious diet that excludes problematic food sources. While neither diet nor medication can cure this condition, a varied approach can lessen symptomatic responses.


Why Diet Is Important

A healthy diet can aid the fight against fibromyalgia. Many foods have been known to exacerbate the symptoms of this condition. Hence, patients are encouraged to plan their meals methodically, while selecting the best sources of nutrition. Many foods can alter mood and increase the excitability of the nervous system. This, of course, never bodes well for fibromyalgia patients.

The medical community does not recognize fibromyalgia as an inflammatory disease. However, recent studies call this widespread consensus into question. An inflammatory issue may actually complicate pain perception for fibromyalgia patients. This phenomenon is known as central sensitization, a natural result of inflamed spinal cells. Cell damage is believed to cause this state of sensitization and thereby disrupt pain perception.

This brings us to the central importance of maintaining a healthy diet, and a selective diet at that (as some conventionally healthy foods appear to worsen symptoms). Many chemically based foods encourage the synthesis of neurotransmitters involved in central sensitization. With this said, it is important to exclude foods that heighten sensitivity of the central nervous system.

Diet Tips

Sugar is the nemesis of all fibromyalgia patients. When sugar is consumed, it spikes blood sugar and insulin levels and can intensify pain. The endocrine system may be loosely correlated with central sensitivity, which underscores the importance of a low-sugar diet. Even fresh fruit juices should be avoided; whole fruit is the preferable choice for those with fibromyalgia.

Obesity, diabetes and hypertension present additional risk factors for such patients, and these individuals especially should exclude gluten from their diets entirely. Both sugar and gluten-based products are processed similarly in the body. Hence, gluten can increase spinal sensitivity and pain, as well.

Fibromyalgia patients routinely exclude caffeine from their daily diets. As noted, the central nervous system is very reactive to stimulants such as caffeine. And when stimulants interact with the central nervous system, they cause sleep deprivation and intense pain.

A raw, organic diet containing unprocessed foods is ideal in most cases. Fibromyalgia patients should eliminate additives, preservatives, and other food products from their diets. Vegetables are a viable source of nutrition for anyone suffering from this condition. However, there is one exception. Nightshade vegetables, such as tomatoes, eggplant and potatoes, amplify pain symptoms.

Choose your fats very selectively. In the realm of nutrition, there are both healthy, life sustaining fats, and unhealthy fats. For example, omega-3 fatty acids aid pain management, as they improve the functional capacity of the nervous system. Unhealthy fats merely worsen symptoms.

Yeast, a common food additive, exposes patients to pain-inducing yeast fungus. So, try to avoid this as much as possible. Dairy is another food source that should be strictly avoided by sufferers of this condition. Fibromyalgia patients suffer from digestive issues and never respond favorably to processed dairy products. Raw dairy products are the recommended alternative in this case.

As noted, sugar intensifies fibromyalgia pain, and even aspartame-based sweeteners should be avoided. Aspartame is an excitotoxin that stimulates the nervous system and increases pain sensitivity. It also appears that MSG, a common additive in unhealthy foods, produces the same effect. In essence, these foods are a toxic source of muscle irritation, fatigue and discomfort.

Managing Your Diet

Most people characterize diet regimens as tedious and time-consuming. However, you can streamline your diet plan with a food journal and pre-made meals. Journaling your food consumption will prevent you from overeating or consuming aggravating foods, such as soda and chocolate. Furthermore, pre-made meals will facilitate the process of eating healthy on the go. Your pre-prepared meals should contain a number of foods designed to fight fibromyalgia.

Your meals should contain a healthy dose of fruits and vegetables, minus the nightshade veggies. Fruits and vegetables contain phytochemicals and antioxidants, which mitigate irritable bowel syndrome and other symptoms of this disease.

Try to incorporate more fish and walnuts into your diet, as omega-3 fatty acids reduce inflammation and improve cognitive clarity. Lean protein is another invaluable means of sustainable health and wellness. The right protein source can help stabilize your insulin levels and prevent lethargy.

Studies have revealed that a vegan or vegetarian diet provides the most pain relief. Vegetables supply the body with an influx of disease-fighting compounds. Of course, this is an individual choice that may not suit the personal preference of every patient.

While not the most appetizing food options on the menu, broccoli, beans, tofu and oatmeal can provide a natural source of energy. As discussed, fibromyalgia patients must avoid caffeine, in spite of their lethargy. However, various foods can provide an instant boost throughout your day.

iPad Tops Disney, Nickelodeon, McDonalds As Most Popular Brand Among Kids

Chuck Bednar for redOrbit.com – Your Universe Online
The kids have spoken: Apple’s iPad is now a more popular brand among children than McDonalds, Disney, Toys R Us or Nickelodeon, according to an annual brand popularity survey from family research firm Smarty Pants.
Smarty Pants conducted an online survey of US households containing youngsters between the ages of six and 12, and had those kids evaluate 256 consumer brands across more than 20 categories. More than 6,600 children and their parents took part in the three-month survey, and the results were tabulated into scores measuring overall brand awareness, love and popularity on a scale of 0 to 1,000, the research firm explained.
iPad was No. 1 with a score of 898, followed by Hershey’s with a score of 894 and Oreo at 885. Rounding out the top 10 were M&M’s, Doritos, Cheetos, Skittles, Disney, YouTube and Xbox. Nickelodeon was tied for 12th with a score of 846, while McDonalds was 15th with an 839 score and Toys R Us was 24th with a score of 830.
“iPad’s number one status among kids represents the culmination of the ‘tablet takeover’ – a movement from shared screens and TV network dominance to curated content on personal devices,” Smarty Pants president Wynne Tyree explained in a statement. “Kids increasingly turn to iPad for games, TV shows, videos, books, homework help and communicating with friends and family.”
The company explained that kids have begun to view the iPad as an all-in-one digital device, which allows them to watch videos, play games and surf the Internet anywhere they want to. They also consider the tablet as a personal device, which means they don’t have to worry about sharing it with siblings.
Tyree said that the Apple tablet’s rise to the top of the list has been sudden and dramatic. Five years ago, it ranked 109th on the list, and while it “captured the hearts of tweens and middle and upper class families” almost instantly, it has only recently become “an indispensable part of childhood for the masses.”
It should come as no surprise given the iPad’s rising popularity that other forms of digital entertainment and content providers would also receive a boost, and according to Todd Spangler of the Boston Herald, brands such as Netflix, Hulu, Amazon Instant Video, Android and Samsung Electronics all experienced “notable increases” in their rankings among kids as well.
According to CNET’s Chris Matyszczyk, a similar survey of mothers found that the iPad ranked far lower – coming in at No. 30 on the list. Crayola topped their list, followed by Hershey’s, M&Ms, and Oreos. Among 6- to 12-year-olds, Crayola ranked 14th in brand awareness and popularity, edging out McDonalds with a score of 842.
Other noteworthy products that cracked the top 25 include Lay’s potato chips (11th, 847); Kit Kat (t-12th, 846); Nintendo Wii (16th, 838); Reese’s (t-17th, 836); Chips Ahoy (t-17th, 836); iPod (19th, 836); Kraft Macaroni and Cheese (20th, 834); and Popsicle (21st, 833). LEGOs were tied with Pop Tarts for 27th place (826), while Disney’s Frozen was 30th (820), Sony’s PlayStation was 33rd (819) and Cartoon Network came in 34th (818).
—–

Buzzing About Caffeine – Researchers Find New Genetic Variants Linked To Coffee Consumption

Chuck Bednar for redOrbit.com – Your Universe Online
If you’re the type of person who drinks several cups of coffee, it may not just be because you’re feeling tired or because you enjoy the taste – your habit may be linked to one of the six newly-identified genetic variants found to be associated with habitual consumption of the caffeinated beverage.
A new large-scale study published online Tuesday in the journal Molecular Psychiatry by researchers from Harvard School of Public Health and Brigham and Women’s Hospital and colleagues from the Coffee and Caffeine Genetics Consortium, found that the variants could also help explain why specific amounts of coffee or caffeine have different effects on different people.
According to Associated Press (AP) writer Malcolm Ritter, scientists had long known that a person’s DNA influences how much coffee they consume, but the new study is the first to pinpoint some of the specific genes responsible. The researchers analyzed data from more than 120,000 individuals, looking for tiny variations in DNA that correlated with the amount of coffee consumed.
Arielle Duhaime-Ross of The Verge said the findings “help explain why some coffee lovers bounce off the walls after a single cup, whereas others feel the need to invent alarm clocks that wake you up with a shot of espresso,” though Ritter noted the authors found no link between the genes and the intensity of a person’s taste for coffee.
Lead investigator Marilyn Cornelis, a research associate at the Harvard T.H. Chan School of Public Health, and her colleagues identified a total of eight genes linked to coffee consumption, although two of them had been previously identified by Cornelis and others in earlier studies. Of the six new genes, two were linked to the metabolism of caffeine, two were associated with its psychoactive effects, and two were related to lipid and glucose metabolism.
The role of those last two genes in coffee consumption is currently unknown, and Cornelis said that they could warrant further investigation. In all, however, the newly-discovered variants explain roughly 1.3 percent of coffee-drinking behavior, and while that might not seem like a large amount, the researchers said that it is roughly equal to the percentage reported for smoking, alcohol consumption and other types of habitual behaviors.
“Coffee and caffeine have been linked to beneficial and adverse health effects. Our findings may allow us to identify subgroups of people most likely to benefit from increasing or decreasing coffee consumption for optimal health,” Cornelis said in a statement Tuesday. “The new candidate genes are not the ones we have focused on in the past, so this is an important step forward in coffee research.”
“The next question is who is benefiting most from coffee,” she told Harvard Gazette Staff Writer Alvin Powell. “If, for example, caffeine is protective, individuals might have very similar physiological exposure to caffeine, once you balance the metabolism. But if coffee has other potentially protective constituents, those levels are going to be higher if you consume more cups, so they might actually be benefitting from non-caffeine components of coffee. So it’s a little bit complex.”
While the researchers told Duhaime-Ross that they do not believe their findings will change anyone’s current coffee consumption habits, they do feel the work will help doctors and nutritionists develop individualized caffeine consumption guidelines optimized for a person’s specific genetic makeup.
Furthermore, Cornelis said that it will make it easier to study the health effects of coffee and solve some of the confusion about whether or not consuming the caffeinated beverage is good or bad for a person’s health. As The Verge reporter pointed out, even though drinking coffee is often viewed as a bad habit, it has been found to decrease a person’s risk of developing Type 2 diabetes, prostate cancer, and oral cancer.
—–

Facebook Finalizes WhatsApp Acquisition – Eight Months After First Announcing It

Chuck Bednar for redOrbit.com – Your Universe Online
Eight months after initially announcing plans to acquire messaging service WhatsApp, Facebook has confirmed in a filing with the US Securities and Exchange Commission (SEC) that the multi-billion-dollar deal is officially complete.
The social media giant first revealed in February that it would acquire the real-time messaging network in a deal initially reported to be worth $4 billion in cash and $12 billion worth of Facebook shares. In addition, the company said that WhatsApp employees and founders would be given $3 billion in restricted stock units.
It was touted as the largest Facebook acquisition to date, but in March, privacy groups filed a complaint with the Federal Trade Commission (FTC) claiming the purchase was unfair because WhatsApp users had the expectation that their data would not be collected for advertising purposes.
The FTC approved the acquisition, with the warning that the agency would be monitoring the service for possible privacy violations. European regulators signed off on the deal on October 4, saying that it would not harm competition because consumers would still have plenty of different messaging and consumer communications apps available to them.
According to the Associated Press (AP), the value of the deal increased from the original $19 billion to $21.8 billion in the time since the original agreement was reached. WhatsApp founders Jan Koum and Brian Acton received $6.8 billion and $3.5 billion after taxes, respectively, and Facebook will now award 177.8 million shares of Class A common stock and $4.59 billion in cash to WhatsApp’s shareholders, Forbes Staff Writer Parmy Olson added.
CNET’s Lance Whitney said that WhatsApp would continue operations as a wholly-owned subsidiary of Facebook. Whitney also noted that Koum would become a member of the social media company’s board of directors.
Reuters reporter Alexei Oreskovic explained that WhatsApp’s 70 employees would continue working at its Mountain View, California headquarters.
“The acquisition… underscores the sky-rocketing values of fast-growing Internet startups, and the willingness of established players such as Facebook and Google Inc. to pay out for them,” Oreskovic said on Monday. “WhatsApp, which has more than 600 million monthly users, is among a new crop of mobile messaging and social media apps that have become increasingly popular among younger users.”
“The price that Facebook was willing to pay raised eyebrows when the buyout was announced Feb. 19, though analysts agreed that landing the popular site made sense,” the AP added. “WhatsApp has been growing rapidly, especially in developing countries like Brazil, India, Mexico and Russia, and now has more than 500 million users.”
WhatsApp allows users to chat with the people on their contacts list, both in one-on-one conversations and in group sessions, according to the wire service. Furthermore, people can use the app to send text messages, photos, videos and voice recordings over the Internet, and even text or call people internationally without accruing heavy charges. WhatsApp is an ad-free service that is free to use for the first year and costs $1 annually thereafter.
—–

Active Sexting Found To Increase Likelihood Of Sexual Activity In Teens

Chuck Bednar for redOrbit.com – Your Universe Online

One in four teens actively participate in sexting, or electronically transmitting explicit images to one another, and those who do are more likely to become sexually active within the next 12 months, researchers from the University of Texas Medical Branch at Galveston (UTMB) report in a new study.

However, writing in the journal Pediatrics, Dr. Jeff R. Temple and Dr. HyeJeong Choi of the UTMB Department of Obstetrics and Gynecology said that while their findings indicate that sexting may precede intercourse in some cases, they found no link between the sending and receiving of explicit photos and the likelihood of engaging in risky sexual behavior over time.

“We now know that teen sexting is fairly common,” Dr. Temple, an associate professor and psychologist, said in a statement. Despite that knowledge, however, he said the majority of research into the topic “looks across samples of different groups of young people at one time, rather than following the same people over time. Because of this, it’s unclear whether sexting comes before or after someone engages in sexual activity.”

To find out, Dr. Temple and Dr. Choi used anonymous surveys to study approximately 1,000 high school sophomores and juniors from southeastern Texas, said Reuters reporter Andrea Burzynski. Of those polled by the researchers, 28 percent of teens said that they were involved in sexting, and those who admitted to doing so during their second year of high school were more likely to have engaged in sexual activity by the following year.

However, the study authors said that only active sexting, which was defined as sending an explicit photo, was correlated with an increased likelihood of sexual activity, added Burzynski. Passive sexting, or asking for or receiving a photo, was not. Their findings suggest that actively sending a nude photograph was the important part of the link between sexting and sexual activity, as opposed to simply asking for or being asked for one.

“Being a passive recipient of or asking for a sext does not likely require the same level of comfort with one’s sexuality,” explained Dr. Choi. “Sending a nude photo may communicate to the recipient a level of openness to sexual activity, promote a belief that sex is expected, and serve to increase sexual advances, all of which may increase the chance of future sexual behavior. Sexting may serve as a gateway behavior to actual sexual behaviors or as a way to indicate one’s readiness to take intimacy to the next level.”

According to Burzynski, the research (which was funded by the National Institutes of Health and the National Institute of Justice) is part of an ongoing six-year probe of 974 ethnically-diverse adolescents. Each study participant periodically completes surveys, which discuss their history of sexting, sexual activity and other behaviors.

“For parents and teachers, sexting among teens is troubling not only for reasons related to personal values surrounding sex, but because the photographs can be easily and widely shared,” the Reuters reporter added. “The posting of nude photos of celebrities such as actresses Jennifer Lawrence and Kate Upton on the Internet in September by an anonymous hacker, for instance, raised concerns about technology, security and privacy.”

However, Dr. Temple told Burzynski that parents should not become overly concerned if they find out that their teens are actively sexting, since the discovery could be used as an opportunity to engage those adolescents in important discussions pertaining to safe sex, sexual health and related issues.

—–

World’s First Solar Battery Runs On Light And Air

Provided by Pam Frost Gorder, Ohio State University

Is it a solar cell? Or a rechargeable battery?

Actually, the patent-pending device invented at The Ohio State University is both: the world’s first solar battery.

In the October 3, 2014 issue of the journal Nature Communications, the researchers report that they’ve succeeded in combining a battery and a solar cell into one hybrid device.

Key to the innovation is a mesh solar panel, which allows air to enter the battery, and a special process for transferring electrons between the solar panel and the battery electrode. Inside the device, light and oxygen enable different parts of the chemical reactions that charge the battery.

The university will license the solar battery to industry, where Yiying Wu, professor of chemistry and biochemistry at Ohio State, says it will help tame the costs of renewable energy.

“The state of the art is to use a solar panel to capture the light, and then use a cheap battery to store the energy,” Wu said. “We’ve integrated both functions into one device. Any time you can do that, you reduce cost.”

He and his students believe that their device brings down costs by 25 percent.

The invention also solves a longstanding problem in solar energy efficiency, by eliminating the loss of electricity that normally occurs when electrons have to travel between a solar cell and an external battery. Typically, only 80 percent of electrons emerging from a solar cell make it into a battery.

With this new design, light is converted to electrons inside the battery, so nearly 100 percent of the electrons are saved.

The design takes some cues from a battery previously developed by Wu and doctoral student Xiaodi Ren. They invented a high-efficiency air-powered battery that discharges by chemically reacting potassium with oxygen. The design won the $100,000 clean energy prize from the U.S. Department of Energy in 2014, and the researchers formed a technology spinoff called KAir Energy Systems, LLC to develop it.

“Basically, it’s a breathing battery,” Wu said. “It breathes in air when it discharges, and breathes out when it charges.”

For this new study, the researchers wanted to combine a solar panel with a battery similar to the KAir. The challenge was that solar cells are normally made of solid semiconductor panels, which would block air from entering the battery.

Doctoral student Mingzhe Yu designed a permeable mesh solar panel from titanium gauze, a flexible fabric upon which he grew vertical rods of titanium dioxide like blades of grass. Air passes freely through the gauze while the rods capture sunlight.

Normally, connecting a solar cell to a battery would require the use of four electrodes, the researchers explained. Their hybrid design uses only three.

The mesh solar panel forms the first electrode. Beneath, the researchers placed a thin sheet of porous carbon (the second electrode) and a lithium plate (the third electrode). Between the electrodes, they sandwiched layers of electrolyte to carry electrons back and forth.

Here’s how the solar battery works: during charging, light hits the mesh solar panel and creates electrons. Inside the battery, electrons are involved in the chemical decomposition of lithium peroxide into lithium ions and oxygen. The oxygen is released into the air, and the lithium ions are stored in the battery as lithium metal after capturing the electrons.

When the battery discharges, it chemically consumes oxygen from the air to re-form the lithium peroxide.
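The charge and discharge steps described above match the standard lithium-oxygen cell chemistry. As a sketch, the overall reactions can be written as follows (the stoichiometry shown is the conventional Li-O2 formulation; the article itself does not state it explicitly):

```latex
% Charging (light-driven): lithium peroxide decomposes;
% oxygen is released into the air and lithium metal is plated
\mathrm{Li_2O_2} \;\longrightarrow\; 2\,\mathrm{Li} + \mathrm{O_2}

% Discharging: oxygen drawn from the air is consumed
% to re-form lithium peroxide
2\,\mathrm{Li} + \mathrm{O_2} \;\longrightarrow\; \mathrm{Li_2O_2}
```

During charging, the photo-generated electrons from the mesh solar panel drive the decomposition reaction; during discharge, the reverse reaction releases the stored energy.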

An iodide additive in the electrolyte acts as a “shuttle” that carries electrons, transporting them between the battery electrode and the mesh solar panel. The use of the additive represents a distinct approach to improving battery performance and efficiency, the team said.

The mesh belongs to a class of devices called dye-sensitized solar cells, because the researchers used a red dye to tune the wavelength of light it captures.

In tests, they charged and discharged the battery repeatedly, while doctoral student Lu Ma used X-ray photoelectron spectroscopy to analyze how well the electrode materials survived—an indication of battery life.

First they used a ruthenium compound as the red dye, but since the dye was consumed in the light capture, the battery ran out of dye after eight hours of charging and discharging—too short a lifetime. So they turned to a dark red semiconductor that wouldn’t be consumed: hematite, or iron oxide—more commonly called rust.

Coating the mesh with rust enabled the battery to charge from sunlight while retaining its red color. Based on early tests, Wu and his team think that the solar battery’s lifetime will be comparable to rechargeable batteries already on the market.

The U.S. Department of Energy funds this project, which will continue as the researchers explore ways to enhance the solar battery’s performance with new materials.

Avoid Fibromyalgia Flares- Take Care of Your Emotional Needs

If you have been diagnosed with fibromyalgia, then you are most likely sick and tired of all the pain and other symptoms. You deal with backaches, chronic headaches, and much more, and that has likely begun to take a toll on you. Additionally, there are certain physical, environmental, and even emotional factors that can contribute to your fibromyalgia flare-ups. Learn what your triggers are and how you can work to avoid them.

Stress as a Trigger for Fibromyalgia

Often, a little bit of stress in our lives can be a good thing. After all, without it, we would never get anything done, right? Our work would simply keep piling up on our desks. The dishes would be left dirty in the sink. The poor dog would never get taken out on his daily walk. However, there are times that stress in our lives can become unhealthy. If you have fibromyalgia, stress is definitely something that you must monitor so that you can avoid triggering a flare-up of your symptoms.

Research has shown that one of the major things that can trigger a fibromyalgia flare-up is emotional factors such as stress. Episodes of extreme emotional anxiety and stress can cause headaches, muscle pains, and can even trigger anxiety attacks.

Even external stressors such as bright lights or loud noises can trigger the symptoms of fibromyalgia. It is unclear exactly why individuals with fibromyalgia react so negatively to stress. One theory is that stress causes the body to release specific hormones that interfere with the perception of pain in those with fibromyalgia. The other school of thought is that stress directly increases soreness and muscle tension.

In order to keep your symptoms of fibromyalgia from flaring up, you should find ways to limit the amount of stress in your life. Take regular breaks from both your work and your home life and see what you can do about reducing your workload at the office. Learn meditation and take part in a regular exercise program so that you can have an outlet for that extra energy that you have from time to time or a way to release the stress.

Tips for Stress Management and Emotional Health

#1- Avoid Stress that is Unnecessary

Know that you’re not going to be able to avoid all stress- and besides, it’s not healthy to avoid a situation that you need to address. However, you may be surprised by what you can actually eliminate.

  • Learn the word “NO”- know what your limits are and set boundaries
  • Avoid those that cause you stress- If there’s someone always causing drama and stress, get them out of your life for good.
  • Be in control of your surroundings- If there’s something in your surroundings that causes you stress, get rid of it.
  • Avoid those “hot-button” topics- If there’s something that upsets you to discuss, then don’t discuss it.

#2- Change Your Situation

If there is a stressful situation that you simply can’t avoid, figure out what you can do to change it. More often than not, this involves a change in the way you operate in daily life, as well as in your communication skills.

  • Don’t bottle up your feelings, express them- If there is something grating on your nerves, communicate your concerns.
  • Learn to compromise- When you ask others to change, you must be willing to make a few changes too- this will help you to find that middle ground.
  • Learn to be assertive- Deal with any issues head on- don’t take a backseat.
  • Learn time management skills- Improperly managing time can cause you to experience lots of stress- plan ahead and don’t overextend yourself.

#3- Adapt

If you’re not able to change the thing that is causing you stress, then make some changes yourself. Adapt to the situation and regain control.

  • Change your perspective- Learn to view the stressful situations from a perspective that is much more positive.
  • Focus on the big picture- Again, change your perspective- will the situation be an issue in the long run?
  • Change your standards- When you’re a perfectionist, you are setting yourself up for failure- set reasonable expectations for yourself.

#4- Accept What You Can’t Change

There are some things that will cause you stress that you just can’t avoid. In these cases, you should learn to accept things as they come. Though acceptance of a situation can be difficult, in the long run it’s much easier than fighting against a situation.

  • Don’t be controlling- There are some things in life that are going to be beyond your control- focus on what you can control, such as your response to these things.
  • Look on the bright side- Know that every cloud truly does have a silver lining- look for it and use the challenges of life to grow.
  • Talk about things- Find someone you trust or see a professional therapist- sometimes just talking about what is stressing you out can help.
  • Forgive- Accept that no one is perfect and everyone makes mistakes at some point- let go of the resentment and anger.

#5- Have Fun and Relax

Go beyond taking charge and having a positive attitude. Know that you can reduce the stress in your life by simply taking time to take care of yourself. If you allow yourself time to have fun and relax, you’ll be better able to deal with the stressors as they come.

  • Set aside time to relax- Allow some time in your day to include some rest and relaxation- don’t let other things encroach on this time.
  • Make some connections- Spend time with those who make your life better- having a strong support system around you will help to reduce the effects of stress
  • Have a sense of humor- Learn how to laugh (even at yourself) – this can help your body fight off the negative effects of stress.

#6- Be Healthy

A great way to reduce the effects of stress on your life and body is to be healthy.

  • Get regular exercise- Take at least 30 minutes three times a week to exercise- aerobics are great for releasing that tension and stress you have bottled up inside.
  • Eat healthy- Bodies that are getting proper nourishment are better able to handle stress.
  • Reduce sugar/caffeine intake- The “highs” that come with sugar and caffeine are short lived and usually end up with a crash in your energy and mood- by reducing these, you’ll feel much better and get better sleep.
  • Avoid drugs, alcohol, and even cigarettes- These may be an “easy” escape from the stress, but this relief is only temporary.
  • Get proper sleep- Getting the proper amount and quality of sleep will refresh your mind and body- when you feel tired, your stress is increased.

Reports Indicate That Facebook Is Planning To Enter The Healthcare Field

Chuck Bednar for redOrbit.com – Your Universe Online

Facebook is currently considering entering the healthcare field by creating online support communities to connect social media users dealing with the same medical conditions, as well as launching new preventative care applications to help people improve their overall well-being, according to media reports published late last week.

Reuters reporters Christina Farr and Alexei Oreskovic, who broke the story on Friday, cited three sources familiar with the matter who requested anonymity since Facebook’s plans are still in development. Should those plans come to fruition, the social network would join fellow tech giants Apple and Google in the growing field of healthcare-related content.

The sources told Farr and Oreskovic that Facebook officials have been meeting with medical industry experts and entrepreneurs over the past several months, and that the company is in the process of establishing an R&D team to test new health apps. While healthcare has long been an interest for the social media website, it has only recently come to realize that health content might help draw people to the network, though Reuters noted Facebook is still “in the idea-gathering stage.”

Details about Facebook’s plans “are limited,” added Mashable’s Samantha Murphy Kelly, but reports also suggest the company might launch its health app under a different name in order to ease potential privacy concerns among users. She added that the website has been scrutinized for various privacy-related issues, including its role in a recent research study that manipulated users’ emotions.

“The news comes at a time when other companies like Google and Apple are dabbling with health and fitness data,” Kelly explained. “Apple recently launched its Health app, a hub for iOS 8 users to keep track of fitness goals and health-related information, and Google is working on Google Fit for Android that tracks similar data across various platforms and wearables.”

While the Mashable reporter pointed out that it might seem unusual for Facebook to test the waters of the healthcare content field, the Reuters reporters wrote that the unexpected success of the website’s organ donor status initiative that was launched two years ago could be a factor.

“Separately, Facebook product teams noticed that people with chronic ailments such as diabetes would search the social networking site for advice,” one former Facebook insider told the news organization. “In addition, the proliferation of patient networks such as PatientsLikeMe demonstrate that people are increasingly comfortable sharing symptoms and treatment experiences online,” Farr and Oreskovic added.

If and when Facebook launches its new healthcare app and/or online support communities, the advertising built around those programs will be vastly different than those commonly seen in print or broadcast forms of media, Reuters explained. For example, pharmaceutical companies are barred from using social media to promote the sale of prescription drugs, largely due to privacy concerns related to medical disclosures.

“It remains unclear whether Facebook will moderate or curate the content shared in the support communities, or bring in outside medical experts to provide context,” said Farr and Oreskovic. The reporters said that Facebook officials declined to comment on any potential healthcare-related plans for the social network.

—–


Screenshots Indicate PayPal-Style Payments May Be Coming To Facebook Messenger

Chuck Bednar for redOrbit.com – Your Universe Online
Facebook Messenger reportedly includes a hidden person-to-person money transfer feature which only needs activation by engineers at the social network to be usable, a Stanford University computer science student has discovered.
Screenshots and video obtained using the iOS development tool Cycript and posted to Twitter by the student, Andrew Aude, depict a page listing saved debit cards (but not credit cards), a payment history and a setting to switch PIN protection on or off in the Messenger app.
Once activated, Messenger’s payment option would allow users of the service to send money to their fellow users in much the same way they send photos, explained Josh Constine of TechCrunch. He added that it was unclear if Facebook would charge a fee for the service or offer it at no cost in order to encourage increased usage of the Messenger app.
That decision, Constine said, “will be up to David Marcus, the new head of Messenger who was formerly the president of PayPal. Why Facebook chose to poach Marcus is now obvious: Facebook Messenger payments could compete with Venmo, PayPal, Square Cash, and other peer-to-peer money transfer apps.”
Aude, who according to his Twitter bio is also a security researcher and iOS developer, told VentureBeat’s Barry Levine that he was inspired to use Cycript to investigate Messenger after another security researcher first reported finding payment-related code in the app last month. He added that it did not even appear to be necessary to link a bank account in order to use the service, and that while PayPal was listed in the code, he found no option to use that service.
According to Levine, Aude also said the payments program currently allows only person-to-person transactions, though money transfers among multiple participants were apparently mentioned as a future option. Transactions would be made via an ACH [Automated Clearing House] electronic transfer to the recipient’s checking account, though it is currently unclear how that account information would be entered into the app. Transactions would be kept private.
As Constine noted, Facebook CEO Mark Zuckerberg said during the social network’s second-quarter earnings call that there would eventually be “some overlap between [Messenger] and payments,” and that “the payments piece will be a part of what will help drive the overall success and help people share with each other and interact with businesses.” However, he suggested at the time that the feature would not be available any time soon.
“There’s so much groundwork for us to do,” Zuckerberg said, according to TechCrunch. He urged investors and analysts to change their estimates of the website’s revenue if they expected the payment feature to be launched in the near future, “because we’re not going to. We’re going to take the time to do this in the way that is going to be right over multiple years.”
“There’s no word of when Messenger may get updated with support for payments,” added The Verge reporter Dante D’Orazio. “Even though the code for the service appears to already be built into the app, it’s possible that it’s only there to facilitate limited testing, and it may not indicate that a launch is imminent.”
—–

New Research Uncovers One-Fifth Of The Genes Responsible For Height In Humans

Chuck Bednar for redOrbit.com – Your Universe Online
Subtle changes in our DNA could help explain why some people are taller than others, the international team of researchers behind the largest genome-wide association study ever conducted report in the October 5 edition of the journal Nature Genetics.
Scientists from more than 300 institutions representing the Genetic Investigation of Anthropometric Traits (GIANT) Consortium reviewed data from more than 250,000 subjects and identified nearly 700 variants of genome-wide significance that together explained 20 percent of the heritability of adult height. Their findings nearly double the number of known gene regions influencing height to over 400 and provide new insight into the biology of this particular trait.
“Height is almost completely determined by genetics, but our earlier studies were only able to explain about 10 percent of this genetic influence,” GIANT Consortium Leader and co-senior investigator Dr. Joel Hirschhorn from Boston Children’s Hospital and the Broad Institute of MIT and Harvard said in a statement.
“Now, by doubling the number of people in our study, we have a much more complete picture of how common genetic variants affect height – how many of them there are and how much they contribute,” he added.
Dr. Hirschhorn and his colleagues reviewed nearly two million common genetic variants (i.e. those that showed up in at least 5 percent of their subjects) and identified 697 variants in 424 genetic regions associated with height.
Co-first author Dr. Tonu Esko of Boston Children’s Hospital, the Broad Institute and the University of Tartu (Estonia) added that, in the wake of their research, his team can now explain approximately 20 percent of the heritability of height – an eight percent increase from where scientists were prior to the Consortium’s work.
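The “percent of heritability explained” figure cited above comes from summing each variant’s additive contribution to the trait’s variance. A minimal sketch of that bookkeeping, using invented allele frequencies and effect sizes rather than the study’s actual data:

```python
# How "variance explained" is tallied under an additive model: each
# variant contributes 2*p*(1-p)*beta^2 to the variance of a trait
# standardized to variance 1. The numbers below are invented for
# illustration and are NOT values from the GIANT study.
variants = [
    {"p": 0.30, "beta": 0.05},  # allele frequency, effect size in SD units
    {"p": 0.12, "beta": 0.08},
    {"p": 0.45, "beta": 0.03},
]

explained = sum(2 * v["p"] * (1 - v["p"]) * v["beta"] ** 2 for v in variants)
print(f"fraction of trait variance explained: {explained:.5f}")
```

With hundreds of real variants, these small per-variant contributions accumulate toward the roughly 20 percent figure the consortium reports.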
According to Reuters reporter Will Dunham, many of the genes identified in the study are believed to be key regulators of skeletal growth that had not been previously linked to height. Some were related to the bone component collagen, some to the cartilage component chondroitin sulfate and others to growth plates (tissues near the ends of the body’s long bones).
Studies have suggested that up to 80 percent of the factors responsible for determining height lies in our genetic code, but it was only seven years ago that the first genes associated with the characteristic were discovered, according to BBC News. Much work must still be done, but experts believe this study could ultimately lead to a simple test to quell parents’ fears about their children’s growth.
“We study height for two main reasons,” Dr. Hirschhorn, a geneticist and pediatric endocrinologist, told Dunham on Sunday. “For over 100 years, it’s been a great model for studying the genetics of diseases like obesity, diabetes, asthma that are also caused by the combined influence of many genes acting together. So by understanding how the genetics of height works, we can understand how the genetics of human disease works.”
“It’s common knowledge that people born to tall parents are more likely to be tall themselves. Most of this is down to the variations in our DNA sequence that we inherit from our parents – the different versions of all our genes,” Professor Tim Frayling of the University of Exeter Medical School said in a statement. “In 2007 we published the first paper that identified the first common height gene, and since then the research has come on leaps and bounds.”
He added that their new study “goes a long way towards fulfilling a scientific curiosity that could have real impact in the treatment of diseases that can be influenced by height, such as osteoporosis, cancer or heart disease. It is also a step forward towards a test that may reassure parents worried that their child is not growing as well as they’d hoped – most of these children have probably simply inherited a big batch of ‘short genes.’”
—–

Google Reportedly Working On Displays Which Could Be Assembled Into One Massive Screen

Chuck Bednar for redOrbit.com – Your Universe Online
A top-secret project rumored to be in the works at Google would allow people to piece together small, modular screens into a larger television or monitor capable of displaying a single, seamless image.
As Rolfe Winkler and Alistair Barr of the Wall Street Journal reported on Friday, the individual screen pieces could be assembled like building blocks. While the reporters note that little is known about the project, in theory it would allow customers to create TVs or monitors in a variety of different shapes and sizes.
While Google declined their request for comment, Winkler and Barr report that One Laptop Per Child co-founder and former MIT professor Mary Lou Jepsen is heading up the project. Jepsen, they noted, has co-founded three display-related tech startups, including Pixel Qi, which specializes in low-power screens viewable in direct sunlight. She currently runs the display division of Google’s advanced product laboratory, Google X.
“Among the problems that the group is trying to solve, the people familiar with the project said, is how to make display modules that are ‘seamless’ so that people looking at a giant screen wouldn’t see the borders between the modules,” Winkler and Barr reported. “The project remains at an early stage and has been kept secret, even within Google, partly because the technical challenges are as large as the planned screens, one of the people said.”
One of the Wall Street Journal’s sources said the developers are attempting to “do the stitching between the seams” both electronically and through software, and that Google X is looking to recruit more experts to address the problem. Gecko Design, a mechanical engineering and product design company acquired by the Mountain View, California-based firm in August, could be brought in to assist with the project.
In August, Gecko Chief Executive Jacques Gagné said that his company, which was also a part of One Laptop Per Child, had been working with Google X on an undisclosed project since 2013. He said that the firm was working with Google X on multiple high-tech projects, but declined to confirm the modular display was one of them.
“The displays are said to be able to show different information per module, or via a sub-sect of monitors. If you had 20 monitors linked, you could have 20 different screens showing 20 different things,” said Slashgear‘s Nate Swanner. “You could also have 20 screens showing 5 things via 4 groups of monitors, or however you’d like to split it up.”
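The grouping behavior Swanner describes amounts to a mapping from individual display modules to content groups. A hypothetical sketch of that idea (module counts and feed names invented; nothing is known about Google’s actual software design):

```python
# Hypothetical sketch: assign 20 modular screens to 4 groups of 5,
# with each group displaying its own content feed. All names and
# numbers here are invented for illustration.
modules = list(range(20))
group_size = 5
groups = [modules[i:i + group_size] for i in range(0, len(modules), group_size)]

for group_id, screens in enumerate(groups):
    feed = f"feed_{group_id}"  # placeholder for whatever content a group shows
    print(f"group {group_id}: screens {screens} show {feed}")
```

Changing `group_size` to 1 or 20 recovers the two extremes Swanner mentions: every module showing something different, or the whole wall acting as one screen.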
“The reason or scope of the project isn’t known, so it’s not clear why Google is interested in the display-linking tech,” he added, humorously suggesting that such a display could “come in handy on Sundays in the Fall and Winter,” during football season. Whatever the motivation, the project comes out of Google X, the laboratory that has given the world Google Glass and the company’s driverless car.
However, as BGR’s Jacob Siegal noted, TVs that could be assembled like LEGO blocks could prove useful in a variety of locations, including schools, office conference rooms and homes. It’s a novel idea, but as NPD DisplaySearch research director Riddhi Patel pointed out, the device would probably need to be affordable and easy to install in order to attract the attention of the average consumer.
—–

Redbox And Verizon Pulling The Plug On Their Joint Video Streaming Service

Chuck Bednar for redOrbit.com – Your Universe Online
Redbox Instant, a streaming video service launched in March 2013 and jointly operated by the DVD rental kiosk firm’s parent company, Outerwall Inc., and Verizon Communications, will be permanently shut down on Tuesday, the two companies announced in a statement Saturday.
According to the announcement, Redbox Instant will cease operations at 11:59 p.m. Pacific time on October 7, with information on applicable refunds to be emailed to customers and posted online on October 10. Users of the service will be able to continue streaming movies and using their Redbox kiosk credits until that time.
Redbox and Verizon initially launched the service in order to compete against online streaming video companies such as Netflix, Amazon Prime Video and Hulu Plus, but according to Reuters, it simply never caught on with the public. It marked Verizon’s first foray into the streaming video market outside of its own network operating region.
Subscribers to Redbox Instant paid $6 to $8 per month for access to a rotating library of about 4,500 movies, with the more expensive plan also providing them with coupons which could be redeemed for DVD rentals at Redbox kiosks all over the country, explained Adam Westlake of Slashgear. Unlike many of its rivals, however, it did not include TV shows.
“This is certainly the nail in the coffin for Redbox Instant, but the end comes as no surprise to many, as the service has had new customer sign-ups disabled for the last three months,” Westlake said. “This was because of an ongoing issue with credit card fraud, but it also resulted in existing customers with expiring cards being unable to update payment information and continue their subscription.”
Redbox rental kiosks are expected to continue operating normally, he added.
In an FAQ posted to their website, Redbox said that no action will be required to receive a refund for services paid for but not received, and the amount due would be automatically paid to the credit or debit card affiliated with the user’s account. Subscribers should monitor their statements for a credit marked either “REDBOXINSTNT*MONTHLY” or “REDBOXINSTNT*RENTBUY” which should appear no later than Friday, October 24.
Redbox Instant was initially announced by Verizon in February 2012, with the initial plan calling for a release as early as that summer. It would be unveiled to the public a few months later, with alpha testing getting underway in July of that year. The service officially launched on March 14, 2013 and is now set to cease operations less than 20 months later.
Outerwall Chief Executive Officer J. Scott Di Valerio told investors on July 31 that subscriber numbers had been disappointing, and the company would “have some decisions to make in March,” when the service would have celebrated its second anniversary, according to Jim Polson of Bloomberg News.
“I’m sure a handful of dedicated fans will be crushed by this news,” said Michael Crider of the website Android Police. While he noted that “RedBox’s physical kiosk service seems to be doing alright for the admittedly underserved markets that it targets,” Crider added that he would be “very surprised if the company doesn’t try to tackle digital distribution again at some point in the future, either on its own or with another partner.”
—–

Free Birth Control Reduces Teen Pregnancies, Abortions

Provided By Diane Duke Williams, Washington University in St. Louis

Teens who received free contraception and were educated about the pros and cons of various birth control methods were dramatically less likely to get pregnant, give birth or get an abortion compared with other sexually active teens, according to a new study.

The research, by investigators at Washington University School of Medicine in St. Louis, appears Oct. 2 in The New England Journal of Medicine.

The study promoted the use of long-acting forms of birth control, such as intrauterine devices (IUDs) and implants, because of their superior effectiveness in preventing unintended pregnancies. Among the 1,404 teens enrolled in the Contraceptive CHOICE Project, 72 percent opted for IUDs or implants. This compares with an estimated 5 percent of US teens who choose long-acting birth control.

In the United States, most teens opt for less-reliable contraceptives such as birth control pills or condoms or no method at all because of cost and other factors.

“When we removed barriers to contraception for teens such as lack of knowledge, limited access and cost in a group of teens, we were able to lower pregnancy, birth and abortion rates,” said Gina Secura, PhD, the study’s first author and director of the CHOICE Project. “This study demonstrates there is a lot more we can do to reduce the teen pregnancy rate.”

From 2008-13, the annual pregnancy rate of teens ages 15-19 in the study averaged 34 per 1,000, compared with 158.5 per 1,000 in 2008 for sexually active US teens. During the five-year span, the average annual birth rate among teens in the study was 19.4 per 1,000, compared with 94 per 1,000 in 2008 for sexually active US teens.

The abortion rate among teens in the study also dropped dramatically. From 2008-2013, their average annual abortion rate was 9.7 per 1,000, compared with 41.5 per 1,000 in 2008 for sexually active US teens.
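
The per-1,000 rates quoted above can be turned directly into relative reductions; a quick check of the arithmetic:

```python
# Average annual rates per 1,000 teens in the CHOICE study versus the
# 2008 rates for sexually active US teens, as reported above.
rates = {
    "pregnancy": (34.0, 158.5),
    "birth":     (19.4, 94.0),
    "abortion":  (9.7, 41.5),
}

for outcome, (study_rate, national_rate) in rates.items():
    reduction = (1 - study_rate / national_rate) * 100
    print(f"{outcome}: about {reduction:.0f}% lower than the national rate")
```

All three outcomes come out roughly 75 to 80 percent lower, consistent with the 62-78 percent reduction range cited below for the earlier CHOICE analysis.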

“The difference in pregnancy, birth and abortion rates between teens enrolled in the Contraceptive CHOICE Project and U.S. teens is remarkable,” said Jeffrey Peipert, MD, PhD, principal investigator of the CHOICE Project and the Robert J. Terry Professor of Obstetrics and Gynecology.

Among teens in the study, almost 500 were minors ages 14-17 when they enrolled. Half of these minors reported having had a prior unintended pregnancy, and 18 percent had had at least one abortion.

Although the teen pregnancy rate in the United States has declined in the past two decades, it remains far higher than in other industrialized countries. Each year, more than 600,000 teens in the United States become pregnant, with three in 10 teens becoming pregnant before they turn 20.

In addition to the negative health and social consequences suffered by teen mothers and their children, US teen births cost almost $10 billion annually in public assistance, health care and lost income, according to The National Campaign to Prevent Teen and Unplanned Pregnancy in 2010.

The researchers analyzed data on teens enrolled in the Contraceptive CHOICE Project, a study of more than 9,000 St. Louis women and adolescents at high risk for unintended pregnancy and willing to start a new contraceptive method. Participants had their choice of a full array of birth control options, ranging from long-acting contraceptives such as IUDs and implants to shorter-acting forms like birth control pills, patches, rings, condoms or natural family planning. The teens then were followed for two to three years.

This study supports results from a previous study of the Contraceptive CHOICE Project that determined that providing birth control at no cost substantially reduced unplanned pregnancies and cut abortion rates by a range of 62-78 percent compared with the national rate.

Teens in the current study who chose IUDs or implants continued to use them longer than those who opted for shorter-acting methods such as the pill. Two-thirds of teens in the study still were using IUDs and implants at 24 months after beginning their use compared with only a third of teens still using shorter-acting methods such as birth control pills.

“We were pleasantly surprised by the number of teens choosing IUDs and implants and continuing to use them,” Peipert said. “It’s exciting that this study could provide insight into how to tackle this major health problem that greatly affects teens, their children and their communities.”

Teen pregnancy has been designated by the US Centers for Disease Control and Prevention as one of the six Winnable Battles because of the magnitude of the problem and the belief that it can be addressed by known, effective strategies. The Winnable Battle target is to reduce the teen birth rate by 20 percent, from 37.9 per 1,000 teens in 2009 to 30.3 per 1,000 teens by 2015.

Sierra Nevada Challenges NASA Decision On Crew Transport Contracts

Chuck Bednar for redOrbit.com – Your Universe Online
—–
UPDATE: October 6, 2014 (3:50 a.m.)
According to a brief announcement by Steven Siceloff of NASA’s Kennedy Space Center, “While NASA has awarded this contract,” referring to the contracts recently given to SpaceX and Boeing, “NASA has instructed Boeing and SpaceX to stop performance on the contract while the GAO resolves a protest.”
redOrbit will continue to provide updates as they become available.
—–
ORIGINAL: September 29, 2014 (4:15 a.m.)
Sierra Nevada Corp (SNC) has filed a formal protest over NASA’s decision to grant a total of $6.8 billion in contracts to Boeing and SpaceX for the construction of next-generation vehicles to transport American astronauts to and from the International Space Station (ISS).
Those agreements, which were announced earlier this month, could pay Boeing up to $4.2 billion for use of their commercially owned and operated CST-100, and another $2.6 billion to SpaceX for use of their Dragon spacecraft. The goal is to have domestically-made vehicles available for use in manned missions by 2017.
SNC, which was also under consideration for those contracts, said that their bid could have saved NASA up to $900 million and that statements made by officials at the US space agency “indicate that there are serious questions and inconsistencies in the source selection process,” said Reuters reporters Andrea Shalal and Mohammad Zargham.
“With the current awards, the U.S. government would spend up to $900 million more at the publicly announced contracted level for a space program equivalent to the program that SNC proposed,” Sierra Nevada said in a statement. “SNC, therefore, feels that there is no alternative but to institute a legal challenge,” it continued, noting that a “thorough review” of NASA’s decision to award the contracts “must be conducted.”
Furthermore, Andy Pasztor of the Wall Street Journal pointed out that SNC said the US space agency’s selection of Boeing and CST-100 “would result in a substantial increased cost to the public despite near equivalent technical and past performance scores,” and that NASA’s own selection records and debrief indicate the presence of “serious questions and inconsistencies in the source selection process.”
A NASA spokeswoman told Pasztor that the agency would have no comment while the legal challenge is pending with the US Government Accountability Office (GAO), which must determine whether or not the complaint is valid. That process can take several months, the Wall Street Journal reporter added. NASA has not publicly released the selection criteria, or how each of the three firms ranked in terms of technical, management and cost issues.
While Pasztor said that SNC was “the only bidder to propose a winged vehicle able to land on a runway during its return trip from the international space station,” he added that sources had informed him the company “lagged behind the other two bidders in some technical rankings.” Sierra Nevada said in their statement that its proposal was the second lowest priced and that it had “achieved mission suitability scores comparable” to its rivals.
Under the terms of the agreements, both Boeing and SpaceX will provide at least one crewed flight test with at least one NASA astronaut on board. Those test flights will verify that the respective rocket and spacecraft systems perform as expected and are capable of launching, maneuvering in orbit and docking at the space station.
Once both companies successfully complete those trials, they will be awarded NASA certification and will each go on to conduct at least two, and as many as six, crewed missions to the ISS, according to officials at the space agency. Those spacecraft will also serve as a lifeboat for astronauts stationed on board the ISS.

—–

Study On Invasive Species Shows Darwin Had It Right All Along

Brown University

Dov Sax of Brown University and Jason Fridley of Syracuse University aren’t proposing a novel idea to explain species invasiveness. In fact, Charles Darwin articulated it first. What’s new about Sax and Fridley’s “Evolutionary Imbalance Hypothesis” (EIH) is that they’ve tested it using quantifiable evidence and report in Global Ecology and Biogeography that the EIH works well.

The EIH idea is this: Species from regions with deep and diverse evolutionary histories are more likely to become successful invaders in regions with less deep, less diverse evolutionary histories. To predict the probability of invasiveness, ecologists can quantify the imbalance between the evolutionary histories of “donor” and “recipient” regions as Sax and Fridley demonstrate in several examples.

Darwin’s original insight was that the more challenges a region’s species have faced in their evolution, the more robust they’ll be in new environments.

“As natural selection acts by competition, it adapts the inhabitants of each country only in relation to the degree of perfection of their associates,” Darwin wrote in 1859. Better tested species, such as those from larger regions, he reasoned, have “consequently been advanced through natural selection and competition to a higher stage of perfection or dominating power.”

To Sax and Fridley the explanatory power of EIH suggests that when analyzing invasiveness, ecologists should add historical evolutionary imbalance to the other factors they consider.

“Invasion biology is well-studied now, but this is never listed there even though Darwin basically spelled it out,” said Sax, associate professor of ecology and evolutionary biology. “It certainly hasn’t been tested before. We think this is a really important part of the story.”

Evidence for EIH

Advancing Darwin’s insight from idea to hypothesis required determining a way to test it against measurable evidence. The ideal data would encapsulate a region’s population size and diversity, relative environmental stability and habitat age, and the intensity of competition. Sax and Fridley found a suitable proxy: “phylogenetic diversity” (PD), an index of how many unique lineages have developed in a region over the time of their evolution.

“All else equal, our expectation is that biotas represented by lineages of greater number or longer evolutionary history should be more likely to have produced a more optimal solution to a given environmental problem, and it is this regional disparity, approximated by PD, that allows predictions of global invasion patterns,” they wrote.

With a candidate measure, they put EIH to the test.

Using detailed databases on plant species in 35 regions of the world, they looked at the relative success of those species’ invasiveness in three well-documented destinations: Eastern North America, the Czech Republic, and New Zealand.

They found that in all three regions, the higher the PD of a species’ native region, the more likely it was to become invasive in its new home. The size of the effect varied among the three regions, which have different evolutionary histories, but it was statistically clear that plants forged in rough neighborhoods were better able to bully their way into a new region than those from evolutionarily more “naive” areas.
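
The core prediction can be caricatured in a few lines: rank donor regions by phylogenetic diversity and by observed invasion success, and the EIH expects the two orderings to agree. The PD scores and invader counts below are invented for illustration, not the paper’s data:

```python
# Toy illustration of the EIH prediction: donor regions with higher
# phylogenetic diversity (PD) should export more successful invaders.
# All numbers here are invented, not from Sax and Fridley's datasets.
donors = {
    "region_A": {"pd": 120.0, "invaders": 42},
    "region_B": {"pd": 60.0, "invaders": 18},
    "region_C": {"pd": 30.0, "invaders": 7},
}

# Rank regions by PD and by observed invader count; under the EIH
# the two orderings should coincide.
by_pd = sorted(donors, key=lambda r: donors[r]["pd"], reverse=True)
by_success = sorted(donors, key=lambda r: donors[r]["invaders"], reverse=True)
print("orderings agree:", by_pd == by_success)
```

The actual analysis is of course statistical rather than a strict ranking, but this is the direction of effect the paper reports for all three recipient regions.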

Sax and Fridley conducted another test of the EIH in animals by looking at cases where marine animals were suddenly able to mix after being united by canals. The EIH predicts that an imbalance of evolutionary robustness between the two sides would allow a species-rich region to dominate a less diverse one on the other side of the canal, by even more than mere random mixing would suggest.

The idea has a paleontological precedent. When the Bering land bridge became the Bering Strait, it offered marine mollusks a new polar path between the Atlantic and Pacific Oceans. Previous research has shown that more kinds of mollusks successfully migrated from the diverse Pacific to the less diverse Atlantic than vice versa, and to a greater degree than their relative abundances alone would predict.

In the new paper, Sax and Fridley examined what has happened since the openings of the Suez Canal in Egypt, the Erie Canal in New York, and the Panama Canal. The vastly greater evolutionary diversity in the Red Sea and Indian Ocean compared to the Mediterranean Sea and the Atlantic led to an overwhelming flow of species north through the Suez.

But evolutionary imbalances across the Erie and Panama Canals were fairly small (the Panama Canal connects freshwater drainages of the Atlantic and Pacific that were much more ecologically similar than the oceans), so, as the EIH again predicts, there was a more even balance of cross-canal species invasions.

Applicable predictions

Sax and Fridley acknowledge in the paper that the EIH does not singlehandedly predict the success of individual species in specific invasions. Instead, it lets ecosystem managers assess relative invasion risk based on the evolutionary history of their own ecosystem and those of other regions. Take, for instance, a wildlife official in a historically isolated ecosystem such as an island.

“They already know to be worried, but this would suggest they should be more worried about imports from some parts of the world than others,” Sax said.

Not all invasions are bad, Sax noted. Newcomers can provide some ecosystem services — such as erosion control — more capably if they can become established. The EIH can help in assessments of whether a new wave of potential invasion is likely to change the way an ecosystem will provide its services, for better or worse.

“It might help to explain why non-natives in some cases might improve ecosystem functioning,” Sax said.

But perhaps Darwin already knew all that.

Homeopathic Remedies For Drug-averse Fibromyalgia Patients

Rejecting Conventional Therapies

Slowly but surely, the health community is reviving its interest in alternative medicine. In response to this, patients are now scurrying in search of alternative remedies. Overall, patients and disease sufferers have transformed their mode of thinking and their perception of disease as a whole.

While the body was initially believed to be the source of disease, it is now looked upon as an infinite source of healing in many respects. Western medicine contends that through external aid, a patient can be restored to health. However, this is a limiting belief system that constrains natural healing processes.

Fibromyalgia patients can improve their wellbeing by combining the most effective therapies. Furthermore, many patients prefer to combine western medication with homeopathic remedies. Every person is different, and different bodies will respond uniquely to the same treatment regimen.

Western medications are not the sole means of recovery in the context of this condition. Just as homeopathic remedies are designated as an alternative, conventional meds should remain an option as well (instead of a health requirement).

Depending on the severity of one’s condition, conventional forms of medication may be compulsory, especially if that person’s livelihood depends on it. However, if a patient feels that their symptoms are manageable without Western medicine, they reserve the right to seek alternative treatments elsewhere.

Why People Choose Alternative Medicine

Alternative medicine is a divisive subject. Many people object to the chemically induced side effects of conventional medications and opt for alternative treatments instead.

While fibromyalgia medication has a proven track record of efficacy, it can produce side effects such as severe fatigue, suicidal thoughts, cardiovascular issues, and more. These effects are often triggered by antidepressant medications.

For this reason, many people have chosen to harness their body’s natural healing potential. While some disregard the effectiveness of homeopathic methods, others invest their hopes in these all-natural therapies.

In terms of nondrug, alternative treatments for fibromyalgia, the most popular methods include trigger point therapy, deep tissue massage, shiatsu massage, biofeedback, acupuncture, meditation, reiki, herbal remedies, healthy dieting, and more. As it turns out, antidepressants are not the only prescribed mode of treatment for such patients.

Nondrug Trigger Point and Alternative Fibromyalgia Treatment

Reiki Energy Therapy

Reiki energy therapy is a Japanese healing technique designed to harness and focus bodily energy. This treatment is used to alleviate pain and depression, as well.

As one may observe in Chinese medicine, reiki energy therapy utilizes qi, a universal life force that flows throughout the entire human body. When the energy flow of qi is obstructed, it gives rise to painful blockages that yield disease.

Many patients turn to spiritual, energetic therapies to remove blockages from their energy pathways, and experience pain relief. Some people, however, have mastered reiki therapy without professional aid.

Herbal and Food Based Therapies

Some people with fibromyalgia rely on nature to treat their illness. They recognize the bounty of healing herbs and foods in nature, and utilize these herbs accordingly. For example, capsaicin, a natural pain relief chemical, is found in pepper plants.

Capsaicin is thought to deplete substance P, a neurotransmitter involved in transmitting pain signals, thereby reducing painful sensations. Those who rely on herbal remedies often report short-term benefits. Vitamin D and magnesium are also thought to counteract pain and improve muscle function. The most plentiful source of vitamin D is sunlight.

Food, when consumed properly, can be the most powerful therapy of all in some cases. As noted, magnesium based foods are highly recommended for people who suffer from this condition. Magnesium rich foods include leafy green vegetables, lentils and beans, almonds, fish, avocados, bananas, and dried fruit.

Of course, not every food source embodies healing potential. In fact, many foods exacerbate circumstances for fibromyalgia patients. These patients are advised to exclude sugar, caffeine, nightshade vegetables and processed fats and dairy from their diet.

Acupuncture

Acupuncture provides fibromyalgia patients with immediate relief. When small acupuncture needles are inserted into the body, painful neural pathways are suppressed and redirected. These needles are twisted and moved gently to counteract the pain associated with this condition. Every site in the body has a corresponding acupuncture point that, when activated, produces the intended effect. This produces rapid, short-term results, and it modifies how the brain perceives pain.

Melatonin Therapy

Most fibromyalgia patients suffer from sleep disturbances that intensify their symptoms. In a cyclical turn of events, their symptoms then produce more sleeplessness. Melatonin, a naturally synthesized chemical, regulates sleep cycles, so a melatonin deficiency may leave one sleepless and agitated. Melatonin treatments are an alternative mode of therapy specifically designed to enhance melatonin production and promote sleep and recovery.

Biofeedback Therapy

Biofeedback is an interesting form of alternative medicine. Like all the therapies listed in this article, it requires no pharmaceuticals whatsoever. Instead, it relies on the power of the mind and perception to guide pain reduction and other forms of symptomatic relief.

During sessions, patients become more acquainted with their bodies while modifying how pain is perceived. Once electrodes are attached to the skin, bodily sensations are presented as visuals or sounds. With the aid of biofeedback, many patients have learned to exercise mild control over temperature, blood pressure, pain, headaches, and more. In other words, the mind can learn to influence functions that are widely regarded as purely bodily phenomena.

Trigger Point Massage

Trigger points are tightly bound knots of muscle fiber. When compressed, these nodules are often sore and tender, and in many cases trigger point pain radiates to other areas of the body.

Trigger point massage therapists employ pressure techniques to smooth and relax these tense muscle nodules. This is an interactive form of massage therapy, as the patient must breathe deeply at certain intervals of time and identify the location of the trigger points. Trigger point massage therapy may be performed with a small, soft, massage ball in the comfort of one’s home, as well.

Overdose Deaths From Heroin On The Rise In The US

April Flowers for redOrbit.com – Your Universe Online
A new report from the Centers for Disease Control and Prevention (CDC) reveals that heroin deaths have risen sharply in many US states. The findings, published in a recent Morbidity and Mortality Weekly Report, were drawn from death certificate data from 28 states. Despite this rise, twice as many people died from prescription opioid overdoses as from heroin overdoses in these states in 2012.
According to John Tozzi of Bloomberg Businessweek, there is a link between prescription opioid abuse and heroin use. Previous research has shown that nearly 3 out of 4 new heroin users identify as having used prescription opioids before starting heroin.
Although the question was not part of the study, the CDC identifies two factors driving the increase in heroin overdoses. The first is widespread prescription opioid exposure and increasing rates of opioid addiction. The second is an increased heroin supply. The researchers do not find this relationship surprising: heroin is a type of opioid, so both drugs act on the same brain receptors to produce similar effects.
“Reducing inappropriate opioid prescribing remains a crucial public health strategy to address both prescription opioid and heroin overdoses,” CDC Director Tom Frieden, M.D., M.P.H., said in a CDC statement. “Addressing prescription opioid abuse by changing prescribing is likely to prevent heroin use in the long term.”
The study looked at changes in heroin and prescription opioid death rates in 28 states — representing 56 percent of the total US population — between 2010 and 2012. During the study period, overall heroin deaths doubled across all 28 states. BusinessWeek reports that the researchers found 3,635 overdoses in the 28 states in 2012, up from 1,779 in 2010. In Kentucky alone, heroin deaths tripled during the study period.
As for prescription opioid rates, five states had increases, seven states had decreases and the remainder had no change. No state showed a decrease in heroin rates during the study period.
“This study is another reminder of the seriousness of the prescription opioid overdose epidemic and the connection to heroin overdoses,” said Grant Baldwin, Ph.D., M.P.H., Director of the Division of Unintentional Injury Prevention. “CDC and other federal agencies are working to promote a smart, coordinated approach to reduce inappropriate prescribing and help people addicted to these drugs.”
The researchers suggest that helping those already addicted to prescription opioids and heroin is as important as addressing issues with prescribing habits to prevent future addiction.
—–

Invasive Species Of Bullfrog Spreading Along Montana’s Yellowstone River

Chuck Bednar for redOrbit.com – Your Universe Online
Native species living along the Yellowstone River in Montana are being threatened by a growing invasion of voracious American bullfrogs known to eat just about anything (including other bullfrogs), a team of researchers led by US Geological Survey biologist Adam Sepulveda claims in a new study.
Writing in the journal Aquatic Invasions, Sepulveda and colleagues from the USGS’s Northern Rocky Mountain Science Center, the Montana Natural Heritage Program, the Bureau of Land Management and the Montana Fish Wildlife and Parks concluded that this invasive species is now thriving and rapidly spreading in the river’s floodplain.
According to BBC News reports, the study authors found that the number of American bullfrogs living in the region has nearly quadrupled over the past four years. The frogs, which can grow to be up to 12 inches long, are native to the eastern US but have spread to nearly every state – and now they have been found along a 66-mile stretch of the Yellowstone River.
The bullfrogs are believed to be one of the reasons that multiple amphibian and reptile species all over the world have been declining in numbers, the USGS explained in a statement Thursday. The creatures’ size, combined with their mobility, their appetite, their ability to reproduce at a rapid rate and the fact that they are carriers of amphibian diseases, makes them a threat to biodiversity.
“The impacts of bullfrogs on native amphibians in the Yellowstone River are not yet known, but native Northern leopard frogs are likely to be most vulnerable to bullfrog invasion and spread because their habitats overlap,” Sepulveda said, adding that the invasive species (which had not been observed in the region prior to 1999) were likely introduced to Montana “for food, recreational hunting, bait and pest control, and as released pets.”
In order to get a grasp on the degree to which the bullfrogs have spread, the USGS said that scientists conducted field surveys in 2010, 2012 and 2013. Those researchers used visual encounter surveys to search for adults, egg masses and larvae, and calling surveys to listen for calls of breeding males.
Their work was performed while walking or slowly driving down roads adjacent to wetlands and large bodies of water, or while on the river itself. The investigation found that bullfrogs had expanded from about 37 miles of river in 2010 to about 66 miles in 2013, and that the number of breeding sites had increased from 12 to 45 in that time.
The results, the agency said, “indicate that bullfrogs are firmly established in the Yellowstone River floodplain and can rapidly spread to new habitats.” While some spread was found upstream, most of it was downstream, which suggests that the spread could be accelerated by river flow. The bullfrogs were also found in publicly accessible areas with deeper waters and emergent vegetation, indicating that they were likely introduced by humans.
The bullfrogs have caused problems by “preying on native frogs, out-competing other animals for food, and spreading a fungus that’s suspected as a cause of a widespread decline in amphibians,” according to Associated Press (AP) reporter Matthew Brown. While state and federal agencies initially tried to halt the species’ spread by killing them off, they were forced to give up after “the number of bullfrogs overwhelmed the effort,” he added.
—–

Source Code For Bad USB Malware Released

Chuck Bednar for redOrbit.com – Your Universe Online
BadUSB, the critical security flaw that could allow hackers to smuggle malware onto devices undetected, has been reverse engineered and a version of its source code has been released.
The malware, which was revealed by SR Labs security consultants Karsten Nohl and Jakob Lell at the Black Hat security conference in Las Vegas in August, cannot be detected by scans because it targets the minuscule controller chips that operate USB equipment such as mice, keyboards and flash drives.
Nohl and Lell demonstrated a proof-of-concept of the malware at the Black Hat conference, showing how it could be installed on a USB device to completely take over a computer, secretly change files installed from a memory stick or even redirect Internet traffic – and the attack would be undetectable by computer security software and would be difficult to fix, according to Wired’s Andy Greenberg.
Since it resides in the firmware and not the flash memory, he said, the attack could be hidden long after the content of a flash drive would appear to have been deleted, highlighting the potential dangers of sharing USB devices. Due to the destructive nature of the threat and the inability to detect potentially harmful USB devices, Nohl and Lell opted against publicly releasing the source code for the malware.
Last week, however, independent security researchers Adam Caudill and Brandon Wilson demonstrated during a joint presentation at the Derbycon hacker conference in Louisville, Kentucky that they had reverse-engineered the same USB firmware as Nohl and Lell, and had successfully reproduced some of the properties of BadUSB.
Unlike the SR Labs researchers, however, Caudill and Wilson have published their code and demonstrated potential uses for it on the distributed revision control and source code management hosting service Github, “raising the stakes for USB makers to either fix the problem or leave hundreds of millions of users vulnerable,” Greenberg wrote on Thursday.
“The belief we have is that all of this should be public. It shouldn’t be held back. So we’re releasing everything we’ve got,” Caudill told the Derbycon audience, according to Greenberg. He added that the decision was “largely inspired” by Nohl and Lell’s decision not to release their material, and that if a security research team was “going to prove that there’s a flaw,” he felt that they needed “to release the material so people can defend against it.”
Caudill and Wilson, who declined to name their employer, said that publicly releasing the USB attack code would allow penetration testers to experiment with the technique, the Wired reporter said. They also said that releasing the exploit was the only way to prove that USB devices are nearly impossible to secure in their current form, and to pressure USB makers to change their current, apparently flawed security structure.
“If this is going to get fixed, it needs to be more than just a talk at Black Hat,” Caudill told Greenberg during a follow-up interview. He claimed that the USB trick is most likely already available to NSA officials and other government agencies, and that those organizations could already be secretly using it.
“If the only people who can do this are those with significant budgets, the manufacturers will never do anything about it. You have to prove to the world that it’s practical, that anyone can do it… That puts pressure on the manufacturers to fix the real issue,” he added. “People look at these things and see them as nothing more than storage devices. They don’t realize there’s a reprogrammable computer in their hands.”
Russell Brandom of The Verge also explained that fixing the problem will not be easy. Since the vulnerability allows attackers to reprogram USB firmware, preventing it would require a new layer of security around that firmware, which in turn would require a massive update to the USB standard itself.
“However the industry responds, we’re likely to be living with it for a long, long time. In the meantime, any time you plug a USB drive into your computer, you’ll be opening up a huge vector of attack,” he added. “Unless you can track a device’s provenance from the factory to your computer, the only real protection [is] avoiding USB drives and devices at every turn. … It’s an extreme response, but not an unreasonable one.”
—–

World Space Week 2014: 15th Anniversary Event To Honor Satellite Contributions

Chuck Bednar for redOrbit.com – Your Universe Online
The contribution of satellite navigation to society is serving as the theme of World Space Week 2014, the annual event where astronomy enthusiasts from across the globe pay tribute to how the international space industry has helped to make our lives better.
According to the United Nations (UN), World Space Week – which is marking its 15th anniversary this year – is “an annual global celebration of the contributions of space science and technology to humanity” which has been commemorated from October 4 through October 10 every year since 1999.
The week-long event “aims to provide unique leverage in space outreach and education; educate people around the world about the benefits that they receive from space; encourage greater use of space for sustainable economic development; demonstrate public support for space programs; excite young people about science, technology, engineering, and math; and foster international cooperation in space outreach and education,” the UN added.
The start and end dates for World Space Week were chosen to commemorate two significant events in the history of space, according to NASA History Web Curator Stephen J. Garber. On October 4, 1957, Sputnik I became the first human-made satellite to be launched into outer space, and on October 10, 1967, the UN Treaty on Principles Governing the Activities of States in the Exploration and Peaceful Uses of Outer Space, including the Moon and Other Celestial Bodies went into effect.
This year’s theme, ‘Space: Guiding Your Way,’ emphasizes how much people have come to rely on satellites in their day-to-day lives, the official World Space Week website explained. Satellite networks that provide positioning and timing data are an integral part of just about everyone’s lives, and this year marks the addition of Europe’s Galileo, Russia’s Glonass and China’s Beidou systems alongside the American GPS network.
“With these new systems we are seeing a tremendous growth of downstream applications being developed for all imaginable professional and personal uses,” the organizers said. Navigation systems have long been used for planes, trains, ships and cars, as well as smartphones and handheld GPS units. However, they are also used in the agricultural and finance industries, and even Hollywood uses GPS to synchronize cameras, they added.
The event organizers said that there were more than 1,400 events organized in 80 countries on all continents, including Antarctica, during last year’s World Space Week, and this year’s event is expected to top that. If you’re looking for a good way to mark the occasion, AFP Relaxnews is reporting that popular ideas “include geocaching events, real-world outdoor treasure hunts using GPS-enabled devices.”
Some of the events that have been announced include a series of programs at the London Science Museum and an evening of space trivia at the Space Foundation in Denver, Colorado. In addition, Missouri’s Springfield Greene County Library District will allow individuals to participate in a Skype Q&A session with astronaut Michael Hopkins. A complete list of events is available at the World Space Week’s official website.
In addition, a new iOS video game based on the Space Racers animated public television series for preschoolers will be released on October 6 as part of World Space Week. The TV program was produced in collaboration with NASA experts, and the game was designed so kids between the ages of four and six can select from one of four characters and soar through 32 levels of space-flying action.
CNN.com’s Lauren Said-Moorhouse marked the occasion with a countdown of the nine coolest Twitter accounts for space enthusiasts to follow. Among the astronauts, vehicles and organizations featured on the list are current ISS crew members Alexander Gerst of the ESA and Reid Wiseman from NASA, the Rosetta mission’s Philae lander, space transportation company SpaceX and NASA’s Voyager probe.
Speaking of Wiseman and Gerst, they will be spending part of World Space Week on the first of two scheduled spacewalks to replace a failed power regulator and relocate a failed cooling pump. The two ISS Expedition 41 crew members will exit the orbiting laboratory’s Quest airlock at approximately 8:10am Eastern on Tuesday. That spacewalk is expected to last roughly 6.5 hours, and will be followed by a second on October 15, according to NASA.
—–
FOR THE KINDLE: Space Technologies on Earth: redOrbit Press
—–

6 Best Fibromyalgia Pain Killers


Of primary concern for sufferers of fibromyalgia, of course, is the question of which pain killers will be most effective. The syndrome has many symptoms, but pain is the most pertinent. Effective relief can come from over-the-counter drugs, prescription drugs, or a combination of the two. We present six of the best fibromyalgia pain killers.

6 Fibromyalgia Painkillers

Advil

Advil is a brand of ibuprofen, a drug first developed in 1962. It is a type of non-steroidal anti-inflammatory drug, or NSAID. There are different brands of ibuprofen, but Advil is one of the best known. Like Tylenol, Advil is used for pain or fever relief. However, unlike Tylenol, Advil also soothes inflammation, making it one of the most effective fibromyalgia painkillers.

NSAIDs are among the most common pain relief medicines in the world. Over 30 million Americans use them every day to soothe headaches, sprains, arthritis symptoms, and other aches and pains. And, because of the anti-inflammatory abilities that NSAIDs have, they can also lower fever and reduce swelling.

NSAIDs work by inhibiting cyclooxygenase-1 (COX-1), the enzyme responsible for the production and release of prostaglandins, which are, in turn, responsible for pain and fever. They also inhibit cyclooxygenase-2 (COX-2), which is responsible for the inflammatory response.

These drugs are very effective for relief of pain caused by inflammation. Unfortunately, while the causes of fibromyalgia are still a bit of a mystery, it is known that it is not caused by inflammation. While these drugs have been prescribed often for fibromyalgia pain, they haven’t actually been all that effective on their own. However, NSAIDs have seen success in combination with other pain relievers.

Aspirin

Plain old Aspirin is also an NSAID like Advil, and it’s also one of our favorite fibromyalgia painkillers. The chemical name of Aspirin is acetylsalicylic acid, or ASA for short. The use of willow and other salicylate-rich plants dates back to the ancient Sumerians; ASA in its pure form was first synthesized in 1897 by a chemist working for Bayer, which brought it to market in 1899. Other brands are also available, such as Bufferin, Entrophen and house brands. Like Advil, it has anti-inflammatory properties and can provide fever and pain relief. However, unlike either Tylenol or Advil, Aspirin also thins the blood and is therefore often used to prevent stroke and heart disease.

As part of the NSAID class of drugs, Aspirin inhibits COX-1 and COX-2 to provide relief from pain, fever, and inflammation. While most NSAIDs also inhibit platelets in the blood, Aspirin does so irreversibly for eight to ten days, the full lifespan of the platelet. It is this which gives it the ability to act as a blood-thinner, but can also make you more prone to bleeding.

Aspirin is generally safe, but children under 18 years of age should avoid it. While the reaction that can cause Reye’s syndrome is rare and occurs only in very specific circumstances, it is also very serious. Since other types of painkillers are available, it is easier for children simply to avoid the drug altogether. It should also be noted that products called “baby aspirin” or “low-dose ASA” should, in spite of their names, also be kept away from children: these products are actually blood-thinners for adults to take to prevent heart disease and stroke.

Tylenol

The generic name of Tylenol is acetaminophen, and it’s also on our list of fibromyalgia painkillers. The drug was first marketed in 1956. It is called acetaminophen in the US, Canada and Japan, and paracetamol elsewhere. Unlike Advil and Aspirin, Tylenol is not an NSAID. Because it works on the nervous system instead of on inflammation, it is more effective than NSAIDs for fibromyalgia pain.

Tylenol is used for pain or fever relief, but it has no anti-inflammatory action. Therefore, it won’t affect any underlying inflammation that can cause pain. Conversely, it can have a greater effect on pain like fibromyalgia where the pain is brought on by conditions in the nervous system rather than inflammation.

Tylenol is a safe drug to take for pain during pregnancy or lactation: although it can be detected in breast milk, no adverse effects on either mothers or infants have been reported. It is therefore considered the first-choice painkiller in pregnancy and lactation.

Acetaminophen is generally considered a very safe drug because it has few interactions with other drugs. It has also been around for a long time, so healthcare professionals are very familiar with it. That very reputation, however, carries a significant risk of overdose. Because the drug is perceived as harmless, people are not always as cautious with it as they need to be. And because fibromyalgia pain is ongoing, a person might be tempted, in an attempt to relieve that pain, to take more of the drug than is recommended.

Acetaminophen overdose can also be quite serious. The symptoms include nausea, diarrhea, abdominal pain and jaundice. Several of these may easily blend into the symptoms a patient is already suffering because of fibromyalgia, and an overdose can therefore be easily missed.

Finally, it can be very easy to overdose on acetaminophen accidentally. This is because the drug is often included in other kinds of drugs. For example, you might be taking Tylenol for your fibromyalgia pain, and then catch a cold. Cold medication often also includes acetaminophen.

Acetaminophen is also sold in combination with opioid and other drugs.

Like ibuprofen and ASA, acetaminophen is available as generics and other brands. They are virtually the same as Tylenol, so it’s up to you which brand to use. If cost is an issue, you may want to buy a house-brand acetaminophen, which is generally cheaper than brand-name Tylenol.

Amitriptyline

Unfortunately, fibromyalgia has a long history of being treated as a psychosomatic condition, so naturally you might be suspicious if your doctor prescribes an antidepressant as one of your fibromyalgia painkillers. However, tricyclic antidepressants like Amitriptyline have had a great deal of success in treating pain.

It is believed that this is because the mechanism for fibromyalgia pain is chemically based on the neurotransmitters that carry pain signals to the brain. Many of these same neurotransmitters are involved in depression, which helps explain why antidepressants can help with fibromyalgia pain.

Unlike Tylenol, Aspirin, and Advil, Amitriptyline is not available over-the-counter. It is only available through a prescription. The drug can also help with sleep issues that come with fibromyalgia. So it can not only help with pain, but help you get more and better sleep.

It’s also not much of a surprise if the chronic pain of fibromyalgia has made you depressed. Because physical and emotional health are generally linked, it is helpful to elevate your mood where possible. So in another way, an antidepressant can ease fibromyalgia suffering.

Lyrica

Lyrica is an oral medication classified as an anti-seizure, or anticonvulsant, drug. Lyrica is the trade name of the drug pregabalin, and it’s one of the more effective fibromyalgia painkillers. It was the first drug approved by the FDA for the treatment of fibromyalgia. Like Amitriptyline, Lyrica is not an over-the-counter drug and must be prescribed.

Lyrica binds to a subunit of calcium channels on nerve cells, and this is thought to reduce the nerves’ ability to send pain messages to one another; it slows down the impulses in the brain that cause seizures and affects the chemicals that carry pain signals across the nervous system.

Again, like Amitriptyline, the anticonvulsants work on the nervous system to relieve pain. The drug will reduce the number of pain signals that are sent to the brain. Fibromyalgia is believed to be caused, at least in part, by nerves being effectively “hyperactive” and so sending too many signals. What anticonvulsants do is calm the nerves and cause them to send fewer signals. This helps them prevent seizures in other conditions and also helps relieve pain in fibromyalgia.

Opioids

There is little doubt that opioids do the job; they are among the most effective fibromyalgia painkillers. However, they also carry a very significant risk of dependence. For this reason, despite their effectiveness, opioids are generally seen as a measure to be taken only after other treatments have been tried.

However, because antidepressants and anticonvulsants can take several days to take effect, opioids are sometimes prescribed in conjunction with other drugs. Also, where other options aren’t effective at relieving pain, long-acting opioids are often prescribed to stay on top of the pain while other solutions are explored.

While some drugs have higher success rates than others, it is also important to note that what works for one person may not work for another. No single drug works for everyone, and conversely, a drug that works for you won’t necessarily work for other people. To a certain extent, finding the right one will require some investigation and some trial and error.

HIV Pandemic Spread By The ‘Perfect Storm’

Chuck Bednar for redOrbit.com – Your Universe Online
The origin of the AIDS pandemic can be traced back to a city in what is now the Democratic Republic of Congo, an international team led by scientists from Oxford University and the University of Leuven report in the latest edition of the journal Science.
According to first author Dr. Nuno Faria of Oxford’s Department of Zoology and his colleagues, a reconstruction of the genetic history of the HIV-1 group M pandemic concluded the pathogen originated in the city of Kinshasa, and that the common ancestor of the group is highly likely to have emerged there sometime around 1920.
Furthermore, the researchers reported that while HIV is known to have been transmitted from apes and primates to people at least 13 times, only one of those strains, HIV-1 group M, resulted in a human pandemic. The team’s analysis concluded there was a 95 percent chance the HIV-1 group M transmission originated between 1909 and 1930. The spread of HIV-1 group M from virus to pandemic is attributed to a “perfect storm” of factors, including urban growth, advanced railway and river transportation, and changes to the sex trade.
While it is quite likely the virus crossed from chimpanzees to humans in southern Cameroon several years before the current pandemic began, the pathogen which causes AIDS remained a regional infection until it reached Kinshasa, according to The Guardian. At the time, Kinshasa was the largest and fastest growing city in the region, and records show that by the 1940s, more than one million people had passed through the city on the railways alone.

Image Above: Kinshasa’s railways helped to make it one of Africa’s best connected cities. Credit: Atlas du Congo Belge et du Ruanda-Urundi, Gaston Derkinderen, Les Transport, Elsevier, Bruxelles, 1955
By 1960, the rate of new pandemic HIV infections outpaced the growth of the regional population, the researchers reported. While trains and other forms of transportation helped spread the disease, The Guardian notes that other factors were in play as well. The UK newspaper said that records suggest Kinshasa’s predominantly male population had a high demand for sex workers, and that doctors may have helped spread the virus by using unsterilized needles at sexual health clinics in the region.
“Our genetic data tells us that HIV very quickly spread across the Democratic Republic of the Congo, travelling with people along railways and waterways to reach Mbuji-Mayi and Lubumbashi in the extreme South and Kisangani in the far North by the end of the 1930s and early 1950s,” Dr. Faria explained in a statement Friday.
“This helped establishing early secondary foci of HIV-1 transmission in regions that were well connected to southern and eastern African countries,” he added. “We think it is likely that the social changes around the independence in 1960 saw the virus ‘break out’ from small groups of infected people to infect the wider population and eventually the world.”
Dr. Faria and experts from Belgium, Canada, France, Portugal, Spain, the UK and the US reconstructed the history of the HIV pandemic, which to date has infected nearly 75 million worldwide, using historical records and DNA samples of the virus dating back to the 1950s. The genetic material allowed them to construct a family tree that traced the ancestry of the AIDS-causing pathogen, and statistical models helped them delve back into its origins.
“You can see the footprints of history in today’s genomes, it has left a record, a mutation mark in the HIV genome that can’t be eradicated,” study co-author and Oxford professor Oliver Pybus told BBC News online health editor James Gallagher. It was those mutational marks which allowed Pybus, Dr. Faria and their colleagues to reconstruct HIV’s family trees, Gallagher added.
University of Nottingham professor Jonathan Ball told BBC’s Gallagher that the study was “a fascinating insight into the early phases of the HIV-1 pandemic. It’s the usual suspects that are most likely to have helped the virus get a foothold in humans – travel, population increases and human practices such as unsafe healthcare interventions and prostitution.” He added that the suggestion the spread “had more to do with the conditions being right” than with the disease adapting for growth and transmission in humans would “prompt interesting and lively debate.”
—–

After Decades Of Searching, Scientists Find Elusive Particle That Is Both Matter And Antimatter

Chuck Bednar for redOrbit.com – Your Universe Online
Scientists from Princeton University have discovered an unusual new type of particle that is essentially its own antiparticle – behaving simultaneously like matter and antimatter, according to a new study currently appearing in the online edition of the journal Science.
The particle, which is known as a Majorana fermion, was detected and imaged using a two-story-tall microscope floating in an ultralow-vibration lab, the researchers explained. Not only is the discovery “an exciting step forward for particle physics,” explained Macrina Cooper-White of The Huffington Post, but it could also impact quantum computer development.
“This is the most direct way of looking for the Majorana fermion since it is expected to emerge at the edge of certain materials,” Princeton physics professor and lead investigator Ali Yazdani said in a statement Thursday. “If you want to find this particle within a material you have to use such a microscope, which allows you to see where it actually is.”
Using the massive microscope, Yazdani and his colleagues were able to capture a glowing image of the Majorana fermion perched at the end of an atomically thin wire – exactly where scientists have long predicted it would be. In fact, Cooper-White said the existence of a particle that could serve as its own antimatter counterpart was first hypothesized by Italian physicist Ettore Majorana in 1937, and experts have been searching for it ever since.

Image Above: Video screenshot. This video shows how the researchers first deposited iron atoms onto a lead surface to create an atomically thin wire. They then used their microscope to create a magnetic field and to map a signal that indicates the presence of the particle, called a Majorana fermion. The signal appeared at the ends of the wire. Video courtesy of Ilya Dorzdov, Ali Yazdani Lab
In addition to the implications this has in the realm of fundamental physics, the researchers said the discovery could lead to a major advance in the development of computers based on quantum mechanics. In quantum computing, electrons are coaxed into representing both the ones and zeros of conventional computers, as well as a unique state in which they exist as both a one and a zero (a property known as quantum superposition).
Quantum superposition “offers vast opportunities for solving previously incalculable systems, but is notoriously prone to collapsing into conventional behavior due to interactions with nearby material,” the university explained. Since the Majorana fermion is surprisingly stable, even though it contains qualities of both matter and antimatter, scientists believe it could be engineered into materials that provide a more stable way to encode quantum information.
As part of their research, Yazdani’s team placed a long chain of magnetic iron atoms on top of a superconductor made out of lead, said Scientific American writer Clara Moskowitz. Typically, magnetism disrupts superconductors, which rely on the absence of magnetic fields to allow electrons to flow unimpeded. However, in this instance, they had a different effect.
During the experiment, the magnetic chain turned into a special type of superconductor in which neighboring electrons coordinated their spins so as to simultaneously satisfy the requirements of magnetism and superconductivity. Each such pair could be viewed as an electron and an antielectron, possessing a negative and a positive charge, respectively. The arrangement left one electron at each end of the chain without a partner.
As a result, the electrons at the end had an electrically neutral signal and assumed the properties of both electrons and antielectrons, making them Majorana particles, Moskowitz said. The researchers explained that their experiment allowed them to directly visualize how the signal changed along the wire, essentially mapping the quantum probability of finding the Majorana fermion along the wire and ultimately pinpointing its locations at the ends of the wire.
Yazdani said the research was “exciting” and could be “practically beneficial, because it allows scientists to manipulate exotic particles for potential applications, such as quantum computing.” He added that, even though the setup for the experiment was complex in nature, the new approach did not require the use of exotic materials (using only lead and iron) and would be easy for other scientists to reproduce and build upon.
California Institute of Technology physicist Jason Alicea, who did not participate in the research, told Moskowitz that while the Princeton paper offered “compelling evidence” for the Majorana fermion, it was important to consider “alternative explanations – even if there are no immediately obvious candidates.” He also praised the experimental setup, and in particular the way in which it made it possible to easily produce the new particle.
—–

Energy Expenditure Of Cheetahs Suggests Humans Are Responsible For Their Decline

Chuck Bednar for redOrbit.com – Your Universe Online
Like housecats, cheetahs spend much of their time relaxing, opting to conserve their energy for the bursts of activity required to take down their prey, according to new research appearing Friday in the journal Science.
However, lead author Dr. David M. Scantlebury of the Queen’s University Belfast School of Biological Sciences and his colleagues report that their analysis of the creatures’ daily energy expenditure revealed that human activity forces them to expend more effort than larger predators, and could be the primary cause of their plummeting population.
In what the researchers are calling the first study of its kind, the authors wrote that cheetahs typically do not expend significantly more energy than other, similar mammals. Furthermore, they found that the cats typically incur more energy loss while searching for food than they do during their spectacular outbursts of running – a discovery which suggests that human-caused reductions or redistribution of their prey has impacted them tremendously.
Dr. Scantlebury explained that he and his fellow researchers looked at 19 free-roaming cheetahs at one of two different sites in southern Africa – one in the Kalahari desert and the other in a wetter region of the continent. Each cheetah was studied for a period of two weeks, and the scientists injected heavy water into the creatures before tracking them and collecting their feces so that they could calculate how much of the liquid they were losing each day.
With that information, they were able to calculate the cheetahs’ energy expenditures, and Dr. Scantlebury found the cats were not expending significantly more energy than mammals of similar size. He added that their research showed the major energy costs appeared to be incurred more by traveling than by securing prey.
“If you can imagine walking up and down sand dunes in high temperatures day in, day out, with no water to drink you start to get a feel for how challenging these cats’ daily lives are, and yet they remain remarkably adapted and resilient,” the professor said in a statement. “They can even withstand other species, such as lions and hyenas, stealing their prey.”
“The reality may be that human activities – for example erecting fences that inhibit free travel or over-hunting cheetah prey – are forcing cheetahs to travel ever-increasing distances and that this may be compromising their energy more than any other single factor,” he continued, adding that their study “seriously questions previously held assumptions about the factors affecting population viability in large predators threatened by extinction.”
“Research of this type helps improve our understanding of the challenges facing cheetahs as they strive to survive and helps inform future decisions on conservation strategies for cheetahs and other threatened animals,” said co-researcher Dr. Nikki Marks, also of Queen’s University. Likewise, co-author Dr. John Wilson of North Carolina State University said the study demonstrated that people, not lions or hyenas, were driving the creatures’ decline.

Image Above Credit: Thinkstock.com
A similar study, also published Friday in the journal Science, tracked energy usage for another type of big cat – the mountain lion (or puma) – living in Santa Cruz, California, and found that the creatures used between 10 and 20 percent of their total daily energy taking down prey that can be as much as four times larger than they are.
As Christine Dell’Amore of National Geographic explained, the authors of this study used specially designed radio collars to track the movement and speed of four wild mountain lions in the Santa Cruz Mountains. While the researchers have calculated how much effort the cats put into hunting, they have not yet calculated the average kilojoules a mountain lion expends during a typical day.
Lead investigator and University of California, Santa Cruz ecologist Terrie Williams told Dell’Amore that the data collected by her team are especially valuable, as puma attacks are rarely witnessed. Williams added that the work of both research teams has provided information that will prove vital to scientists working to protect big cats.
—–

Choosing A Fibromyalgia Treatment That Works

About Fibromyalgia Treatments

Fibromyalgia treatments require a lifelong investment in both short-term and long-term therapies. Accordingly, quick-acting remedies are often just as important as long-term treatment options. In some cases, only short-term remedies can suppress acute, painful flare-ups when they arrive unexpectedly.

In this article, you will learn about short-term remedies known to enhance pain management among patients with this chronic disorder.

The Importance of Short-term Treatments

If you have an established, long-term regimen, then what is the practical use of short-term pain relief? There may be occasions when acute, symptomatic flares leave you incapacitated or immobile. These can be triggered by stress, poor diet, lack of exercise, or poor sleeping habits. A quick pain-relief method can also improve concentration, sleep and comfort.

Exercise

One of the most potent short-term methods of symptomatic relief is aerobic exercise. Its efficacy has been confirmed by study after study: exercise consistently leads to improved well-being, elevated mood, increased energy and pain relief. In essence, doctors and patients alike regard regular workouts as essential.

Some patients are under the mistaken impression that exercise is dangerous. Fibromyalgia patients often lack the muscular strength and robustness that exercise demands. But in spite of the pain and weakness one may experience, exercise should not be dismissed.

Exercise is so powerful that its effects rival both traditional medicine and complementary therapies. Aerobic exercise confers a plethora of benefits, including stress reduction, increased flexibility, muscle recovery and healing, hypertrophy, endorphin release and pain reduction. At the end of the day, exercise can mean the difference between a self-sufficient patient and a debilitated sufferer.

If you fear incurring an injury during exercise, start gently and increase the intensity gradually over time, without overwhelming yourself.

Quick Fibromyalgia Treatment

Acupressure

Acupressure is a popular short-term remedy, in part because no licensed professional is required to enjoy its benefits. This rapid treatment applies pressure along the nerve pathways that transmit pain signals throughout the body; different pressure points correspond to different areas of pain. By activating these points, you can impede the transmission of pain signals and trigger the release of endorphins as well. When medications fail to alleviate symptoms, acupressure may therefore be of benefit.

Capsaicin

Capsaicin is a chemical compound found in pepper plants, including cayenne pepper, and in a number of commercial foods. Applied topically, it helps diminish painful symptoms by depleting substance P, a neurotransmitter involved in sending pain signals. In the heat of a painful flare-up, capsaicin may therefore be a feasible option.

Muscle Relaxer

Muscle relaxers are designed to reduce pain and tension in various muscle groups.

Steroid Injection

A steroid injection is a quick means of relieving both tension and pain in localized muscle groups. If you suffer from “tender points” or concentrated pain, you may benefit from this method.

Shiatsu Massage and Deep Tissue Massage

Massage therapy is yet another popular pain management method. This complementary therapy yields immediate benefits in most cases, and many fibromyalgia patients tout massage as one of the most potent remedies for their condition.

For some massage techniques, an extended wait time is needed to yield noticeable effects. For example, if you are seeking immediate relief, avoid deep tissue massage, as stiffness and pain can last afterward for 1-2 days. Of course, deep tissue massage does present a host of benefits in the long run, including improved connective tissue, relaxed muscles, muscular recovery and healing, as well as improved circulation.

The recommended option for quick relief is a shiatsu massage instead. This method combines stretches, massage strokes and pressure to generate an immediate effect. Shiatsu is a long-established technique that has thrived in Japanese culture for centuries; it interacts directly with nerve pathways and can modulate pain signals for the better.

Meditation

Fibromyalgia patients experience central sensitization, a phenomenon of the central nervous system in which neurons in the spinal cord become hypersensitive, amplifying pain signals and distorting pain perception. This accounts for the chronic, widespread nature of fibromyalgia pain.

Meditation, however, may provide a modest decrease in these painful symptoms. It is known both to enact long-term changes in neural pathways and to calm the central nervous system.

Acupuncture

As one of the oldest forms of pain therapy, acupuncture is a popular complementary option for sufferers of this condition. During acupuncture, small needles are placed in specific regions of the body and adjusted; in traditional practice, this is said to activate energy pathways and remove energy blockages.

Cognitive Behavioral Therapy

Pain has a latent psychological influence on its sufferers; the phenomenon is not entirely physical. The psyche and the emotional centers of the brain associate pain with “hurting” or “suffering”, hence its unpleasant effects. Without those emotions and labels, the presence of pain would cease to induce suffering in human beings.

Many patients undergo cognitive behavioral therapy (CBT) to supplement their traditional modes of treatment. One might ask, “How could therapy possibly help a chronic pain patient like myself? I deal with stabbing muscular pains, stiffness, and sleep disturbances.”

Quite often, stress and anxiety complicate pain mechanisms by increasing cortisol and blood pressure. Furthermore, the mind influences outlook, expectations and perception. And when you anticipate suffering, your brain finds evidence to substantiate your initial expectations.

Cognitive behavioral therapy is a proven method of shaping one’s perception and responses to unpleasant circumstances. While CBT is not an apt replacement for opiates or muscle relaxers, it can help one shift awareness away from suffering. Furthermore, you can easily do it on your own. Here is a thought-replacement, CBT technique that people use for a number of conditions:

Example Negative Thought: “My life is filled with discomfort and pain, there’s no cure for this, and every moment of my life is excruciating. Time to endure another day.”

Example Positive Replacement Thought: “I accept the reality of my condition. And until a cure is introduced, I will successfully manage and improve my symptoms with proper medication, healthy diet, exercise, massage, and physical therapy. If I combine each of these measures, my pain will subside and permit a healthy sleep cycle and life. I choose laughter, health, and exercise over suffering.”

Ginseng

Ginseng is another natural remedy that is often used by fibromyalgia patients. Ginseng stimulates the release of the body’s endorphins, which are natural painkillers with an opiate-like effect on pain.

Loss Of Smell May Be A Predictor Of Death

April Flowers for redOrbit.com – Your Universe Online
Older adults are a constant subject of study for scientists seeking to understand why some of us live such long lives and some of us die relatively early. Studies have found many predictors of death in the elderly, including broken hips, love handles and nocturia (frequent nighttime urination). A new study from the University of Chicago has added loss of smell (olfaction) to that list.
The findings, published in PLOS ONE, reveal that not being able to identify scents is a strong predictor of death within five years. In fact, 39 percent of those who failed the simple smelling test died during the five-year period. Of those with moderate smell loss, 19 percent died, while among those with no smell loss the death rate was only 10 percent.
The study, part of the National Social Life, Health and Aging Project (NSHAP), found that the link between smell loss and mortality is “robust,” stronger than for most chronic diseases. For example, olfactory dysfunction was found to be a better predictor of death than a diagnosis of heart failure, cancer or lung disease. Only severe liver damage was a more powerful predictor, and for those already at high risk, losing their sense of smell more than doubled the probability of death.
“We think loss of the sense of smell is like the canary in the coal mine,” Jayant M. Pinto, MD, an associate professor of surgery at the University of Chicago who specializes in the genetics and treatment of olfactory and sinus disease, said in a recent statement. “It doesn’t directly cause death, but it’s a harbinger, an early warning that something has gone badly wrong, that damage has been done. Our findings could provide a useful clinical test, a quick and inexpensive way to identify patients most at risk.”
NSHAP, the first in-home study of social relationships and health in a large, nationally representative sample of men and women between ages 57 and 85, provided the data for the study. Between 2005 and 2006, professional survey teams from the National Opinion Research Center at the University of Chicago conducted the first wave of NSHAP. They used a well-validated test which measures the ability to identify five distinct common odors. The test was adapted for the study by Martha McClintock, PhD, of the Institute for Mind and Biology at the University of Chicago.
According to Smitha Mundasad of BBC News, the five scents were peppermint, fish, orange, rose and leather, encased in the tips of felt-tip pens called Sniffin’ Sticks. Each participant was asked to identify each smell from a set of four choices.

Image Above: Jayant Pinto, M.D., is shown with one of the Sniffin’ Sticks used to test a patient’s ability to identify scents for his research on olfactory dysfunction and aging. Credit: Robert Kozloff/The University of Chicago
The team discovered that 75 percent of the participants were classified as “normosmic,” or having a normal sense of smell — of those, 45.5 percent correctly identified five out of five, while 29 percent identified four out of five. Almost 20 percent of the total cohort were classified as “hyposmic,” getting three out of five correct.
The final 3.5 percent, classified as “anosmic,” could identify at most one scent.
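The scoring rubric described in the article can be sketched in code. This is an illustrative sketch only: the thresholds for normosmic (4-5 correct), hyposmic (3 correct) and anosmic (0-1 correct) come from the article, but it does not report how a score of 2 was binned, so treating it as moderate loss is an assumption, as are the function and variable names.

```python
def classify_olfaction(correct: int) -> str:
    """Classify a Sniffin' Sticks score (0-5 odors correctly identified)."""
    if correct >= 4:
        return "normosmic"   # normal sense of smell (4 or 5 correct)
    if correct >= 2:
        return "hyposmic"    # moderate loss; a score of 2 here is an assumption
    return "anosmic"         # severe loss (at most 1 correct)

# Five-year mortality reported for each group in the study
FIVE_YEAR_MORTALITY = {"normosmic": 0.10, "hyposmic": 0.19, "anosmic": 0.39}
```

For example, a participant who identified all five odors would be classified normosmic, with the lowest reported five-year mortality of the three groups.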
The researchers adjusted their findings for demographic variables such as age, gender, socioeconomic status (as measured by education or assets), overall health and race. Despite this, those with the poorest sense of smell were still at the highest risk.
Prof Pinto added: “The sense of smell is a little underappreciated – it plays a very important part in everyday life. But we don’t want people to panic. A bad cold, allergies, and sinus problems, can all affect your sense of smell.”
“People shouldn’t be too worried, but if problems persist they should speak to their physicians. And perhaps this study shows we need to start paying more attention to sensory health overall.”
—–

The Facts About Ebola: How The Disease Can (And Can’t) Be Spread

Chuck Bednar for redOrbit.com – Your Universe Online
In the wake of recent developments involving the ongoing outbreak of Ebola Hemorrhagic Fever, officials from the US Department of Health and Human Services (HHS) and the Centers for Disease Control and Prevention (CDC) are reaching out to make certain that people have the facts about the dangerous viral disease.
On September 30, CDC tests confirmed that a Liberian man, later identified as Thomas Eric Duncan, had become the first person to be diagnosed with Ebola while in the US. Duncan reportedly started showing symptoms approximately five days after arriving from West Africa and is currently being treated for the disease in the Dallas, Texas area.
In a statement, the CDC said that it “recognizes that even a single case of Ebola diagnosed in the United States raises concerns.” The agency added that “medical and public health professionals across the country have been preparing to respond” and that experts “have been reminded to use meticulous infection control at all times.”
On Friday, reports surfaced that a freelance cameraman who had been working with NBC News on their coverage of the outbreak had been diagnosed with Ebola. That individual, 33-year-old Ashoka Mukpo, first reported symptoms on Wednesday. He will be flown back to the US for treatment this weekend, and his colleagues have decided to voluntarily quarantine themselves for 21 days, according to William M. Welch of USA Today.
With stories like these dominating news broadcasts and print headlines, and given the fact that the CDC has declared that this is the largest Ebola epidemic in history, US health officials have been making a concerted effort to quell concerns about a domestic outbreak of the disease. The CDC explained that the risk of a US Ebola outbreak was “very low,” and that it and its partners were “taking precautions to prevent this from happening.”
Likewise, HHS officials sent out an email statement (see below) to the press Friday morning asserting that the country was “prepared” to deal with the disease, and that, unlike the regions of Africa hardest hit by Ebola, the US “has a strong health care system and public health professionals who will make sure this case does not threaten our communities.”
According to the agency, as of Thursday the virus responsible for the viral hemorrhagic fever disease had caused more than 3,300 deaths worldwide. The symptoms include fever, headache, joint and muscle aches, weakness, diarrhea, vomiting, stomach pain, a lack of appetite, and abnormal bleeding, and can appear anywhere from two to 21 days after exposure.
The HHS went on to explain that while Ebola was a potentially deadly disease, it is “not a highly contagious disease.” It can only be transmitted through direct contact with either the blood or bodily fluids of a person who has been infected and is showing symptoms of the disease, or through exposure to objects (such as needles) that have been contaminated with infected secretions.
Ebola is not a respiratory disease and cannot be transmitted through the air, the HHS said. It is not transmitted through water, and a person cannot catch the viral hemorrhagic fever by eating contaminated food in the US, the agency noted. Nor can it be caught from a person who has been infected but is not showing any symptoms of the disease – a person would have to have “direct contact with an individual who is experiencing symptoms or has died of the disease,” according to the HHS statement.
“Ebola can be scary. But… the United States has a strong health care system and public health professionals who will make sure this case does not threaten our communities,” CDC Director Dr. Tom Frieden explained in a statement announcing the details of the Duncan diagnosis on Tuesday. “While it is not impossible that there could be additional cases associated with this patient in the coming weeks, I have no doubt that we will contain this.”
The CDC also said it had taken multiple steps to prepare for the potential domestic spread of Ebola, including enhancing surveillance and laboratory testing capacity to better detect cases at the state level, providing recommendations to help prevent infection and control the spread of the disease, and informing flight crews, emergency medical responders and other officials on how to report ill travelers to the CDC.
—–
HHS Statement:
This week, the Centers for Disease Control and Prevention (CDC) announced the first confirmed case of Ebola diagnosed in the United States in a person who traveled from West Africa.
There’s all the difference in the world between the U.S. and parts of Africa where Ebola is spreading. The United States is prepared, and has a strong health care system and public health professionals who will make sure this case does not threaten our communities. As CDC Director Dr. Frieden has said, “I have no doubt that we will control this case of Ebola, so that it does not spread widely in this country.”
Although Ebola is a highly destructive disease, it is not a highly contagious disease.
Here are the facts you should know about Ebola:
What is Ebola? Ebola virus is the cause of a viral hemorrhagic fever disease. Symptoms include: fever, headache, joint and muscle aches, weakness, diarrhea, vomiting, stomach pain, lack of appetite, and abnormal bleeding. Symptoms may appear anywhere from 2 to 21 days after exposure to Ebola virus though 8-10 days is most common.
How is Ebola transmitted? Ebola is transmitted through direct contact with the blood or bodily fluids of an infected symptomatic person or through exposure to objects (such as needles) that have been contaminated with infected secretions.
Can Ebola be transmitted through the air? No. Ebola is not a respiratory disease like the flu, so it is not transmitted through the air.
Can I get Ebola from contaminated food or water? No. Ebola is not transmitted through food in the United States. It is not transmitted through water.
Can I get Ebola from a person who is infected but doesn’t have any symptoms? No. Individuals who are not symptomatic are not contagious. In order for the virus to be transmitted, an individual would have to have direct contact with an individual who is experiencing symptoms or has died of the disease.

Social Media Users Prefer Positivity, Unless They’re In A Bad Mood

Chuck Bednar for redOrbit.com – Your Universe Online

People generally use social networks to connect with individuals who tend to post positive or success-oriented updates – unless they are in a bad mood, according to research scheduled for publication in the December edition of Computers in Human Behavior.

When social media users are in a negative mood, they are more likely to seek out people who appear to be doing even worse than they are: “the less attractive, less successful people on their social media sites,” study co-author and Ohio State University communications professor Silvia Knobloch-Westerwick said in a statement.

The findings, Knobloch-Westerwick and fellow co-author Benjamin Johnson explained, add context to recent research concluding that men and women who spend a lot of time on Facebook often feel more frustrated, angry and lonely. That effect had been attributed to happy, upbeat updates from friends making users feel inadequate by comparison, but the new study indicates that the mood of the user also plays a major role.

Knobloch-Westerwick and Johnson, a former doctoral student in communication at Ohio State and now an assistant professor at VU University Amsterdam, recruited 168 college students and made sure they were in either a good mood or a bad one by having them complete a test on facial emotion recognition. Regardless of how they did, all of the participants were randomly praised or criticized for their performance on the examination.

Next, each of the individuals was asked to review what they were told was a brand new social media website known as SocialLink. The overview page of this supposed social network featured profiles of eight individuals, designed to make the subjects of the profiles appear either successful and attractive, or unsuccessful and unattractive.

Each profiled individual was ranked on a scale of 0 to 5 on both career success (represented by a number of dollar signs next to their profile) and attractiveness, or “hotness” (represented by the number of hearts next to the profile). Each profile had either half of a dollar sign (low career success) or 4 1/2 dollar signs (high career success), and likewise they each had either one-half heart (low attractiveness) or 4 1/2 hearts (high attractiveness).

The profile images of the supposed members were blurred out so that the study participants could not tell what the men and women actually looked like. The study participants were able to click on the profiles to learn more, and when they did, they discovered that all of the status updates were pretty much the same – relatively mundane and devoid of discussion of career success, academic achievement, physical appearance or romantic relationships.
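The profile setup described above amounts to a two-by-two design: low or high career success crossed with low or high attractiveness, with the numeric ratings fixed at half a symbol or 4 1/2 symbols. A minimal sketch follows, assuming the eight profiles were two copies of each of the four combinations (the article only says there were eight profiles spanning both dimensions; the names and structure here are illustrative).

```python
from itertools import product

# Two levels on each dimension, as described in the study: half a symbol
# (low) or four and a half symbols (high) for dollar signs and hearts.
LEVELS = {"low": 0.5, "high": 4.5}

# Eight profiles: each of the four career x attractiveness combinations,
# shown twice (an assumption about how the eight were allocated).
profiles = [
    {"career": LEVELS[c], "attractiveness": LEVELS[a]}
    for c, a in product(LEVELS, LEVELS)
] * 2
```

Keeping the status updates identical across profiles, as the researchers did, ensures that only these two rated dimensions can drive differences in browsing time.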

“So the only real difference between the profiles was the ratings of career success and attractiveness signified by the dollar signs and hearts,” explained Johnson. The experiment revealed that people typically spent more time on the profiles of people who were rated as successful and attractive, but that those who had been put in a negative mood spent significantly more time than others browsing the profiles of unsuccessful and unattractive people.

“If you need a self-esteem boost, you’re going to look at people worse off than you. You’re probably not going to be looking at the people who just got a great new job or just got married,” added Knobloch-Westerwick. “One of the great appeals of social network sites is that they allow people to manage their moods by choosing who they want to compare themselves to.”

——


Does Growing Older Cause Your Sense Of Humor To Change?

Chuck Bednar for redOrbit.com – Your Universe Online
If you’re watching a TV sitcom and find that you’re the only one in the room laughing at the jokes, it may be because you’re the only one young enough (or old enough) to appreciate the humor, according to research appearing in the September edition of the journal Psychology and Aging.
Jennifer Tehan Stanley, an assistant professor of psychology at the University of Akron, and colleagues from Brandeis University and Northeastern University set out to determine whether or not young, middle-aged and older adults found video clips depicting inappropriate social behavior to be funny. They did this by showing footage of the shows The Office, Golden Girls, Mr. Bean and Curb Your Enthusiasm to adults of various ages.
According to Olga Khazan of The Atlantic, Stanley and her co-authors recruited 30 young adults, 22 middle-aged individuals, and 29 senior citizens to watch various comedy segments, then rate how funny and how socially appropriate they felt each one was. The researchers also used facial electromyography to measure the degree to which each of the clips caused the facial muscles responsible for forming a smile to move in each of the subjects.
The authors discovered that older adults were less likely to enjoy what is known as the aggressive style of humor, which involves laughing at the expense of others (a trademark of the Michael Scott character in The Office). People between the ages of 64 and 84 were about 23 percent less likely than middle-aged people to find a clip from that show funny, and 19 percent less likely than those in the 17- to 21-year-old category, Khazan said.
“Young adults were also more likely to smirk at the clips that showed self-deprecating humor, as exemplified in an episode of Curb Your Enthusiasm in which Larry pumps his waiter for information about how much his friend left as a tip,” she added. “The older participants, meanwhile, liked affiliative humor – the kind of jokes that bring people together through a funny or awkward situation. Stanley says a Golden Girls clip in which the women try to buy condoms and suffer an embarrassing price check is a good example.”
As Michael K. McIntyre, a reporter with The Cleveland Plain Dealer, pointed out, the age-related differences in humor do not suggest that certain age groups don’t “get” certain types of jokes or fail to understand why something is supposed to be funny – just that older people tend to differ from younger adults in what they find humorous.
The study wasn’t just about determining what makes people of different ages laugh, however. Stanley, who conducted the research at Brandeis University before moving to Akron, told McIntyre that she hopes the research “helps us better understand how we perceive social and emotional events in young, middle-age and older adulthood.” She noted that it was the first study to demonstrate differences in the way generations perceive humor.
Stanley’s research, which could also help advertisers who are attempting to reach specific audiences, was inspired by a UK study that found that older adults had difficulty distinguishing between socially appropriate and inappropriate behavior in video clips from the British version of The Office. She said she believed that study was misguided, and that perhaps older adults’ failure to find humor in some material influenced their ratings of its appropriateness.
“The study raises some intriguing questions about our concept of what is funny,” the University of Akron said in a statement. “Is that concept based on factors peculiar to generations, or does it evolve over time as we age and, perhaps, mellow? Those possibilities will need to be explored in a future episode of humor research. Stay tuned.”

Google Threatened With Lawsuit Over Hacked Nude Celebrity Photographs

Chuck Bednar for redOrbit.com – Your Universe Online
An attorney representing some of the female celebrities whose private photos were stolen and posted online by hackers a little over one month ago is threatening to sue Google for failing to do enough to remove the images from its website, various media outlets reported on Thursday.
According to Reuters reporter Brendan Pierson, entertainment industry lawyer Martin Singer wrote in a letter Wednesday that Google was “making millions and profiting from the victimization of women.” Singer, who did not reveal which celebrities he is representing, said that the tech giant could be liable for more than $100 million in damages.
The attorney, who said that he sent the Mountain View, California-based company multiple notices of violation of the Digital Millennium Copyright Act in an attempt to get the pictures removed from the website, also wrote that Google knew the photos were “hacked stolen property, private and confidential photos and videos unlawfully obtained and posted by pervert predators who are violating the victims’ privacy rights and basic human decency.”
The incident, which occurred in early September, saw hackers allegedly gain access to cloud storage accounts and steal private (and, in some cases, compromising) photos reportedly belonging to several celebrities, including Jennifer Lawrence, Kate Upton, Victoria Justice, Mary Elizabeth Winstead and Ariana Grande. Shortly thereafter, Apple launched an investigation into iCloud security issues believed to have played a role in the incident.
According to Eriq Gardner of the Hollywood Reporter, Singer said that “Google’s ‘Don’t be evil’ motto is a sham,” and demanded that the search engine immediately remove all offending images. He also asked the website to preserve all records related to the pictures “pending subpoenas to be issued in the upcoming/pending litigation.”
Following the scandal, Google began conducting an investigation on matters such as fair use and who owns the copyright for a selfie before removing website URLs from its search results, Gardner explained. In the letter, Singer said the company had a responsibility to the celebrity victims that goes beyond its typical business operations, and also accused it of accommodating YouTube and Blogspot users who posted the offending images.
“If your wives, daughters or relatives were the victims of such blatant violations of basic human rights, surely you would take appropriate action,” the letter said. “But because the victims are celebrities with valuable publicity rights, you do nothing – nothing but collect millions of dollars in advertising revenue from your co-conspirator advertising partners as you seek to capitalize on this scandal rather than quash it… Google has turned a blind eye while its sites repeatedly exploit and victimize these women.”
A Google spokesperson told CNET’s Don Reisinger that the company believed it has acted responsibly, removing tens of thousands of images within hours of receiving the requests. However, Reisinger said that as of Thursday afternoon, CNET was still able to locate some of the nude images using Google’s Image search, though he noted they were “few and far between.”
Singer, who has previously represented actors John Travolta and Charlie Sheen and X-Men director Bryan Singer in high-profile cases, addressed his letter to Google founders Larry Page and Sergey Brin, as well as executive chairman Eric Schmidt and other high-ranking executives with the company, said Philip Sherwell of The Telegraph. The attorney also went on to praise Twitter for its quick action in removing the offending images and suspending the accounts of users that posted them.
“The letter to Google appears at this point to simply be a warning shot, but the attorney left open the possibility of the celebrities suing Google,” Reisinger said. Singer added that “the seriousness of this matter cannot be overstated. If Google continues to thumb its nose at my clients’ rights – and continues to both allow and facilitate the further victimization of these women – and disregards the demands of this letter, it does so at its own peril.”

Medical Discovery First Step On The Path To New Painkillers

Emma Thorne, The University of Nottingham

A major medical discovery by scientists at The University of Nottingham could lead to the development of an entirely new type of painkiller.

A drug resulting from the research, published in the journal Neurobiology of Disease, would offer new hope to sufferers of chronic pain conditions such as traumatic nerve injury, for which few effective painkillers are currently available.

The work, led by Dr. Lucy Donaldson in the University’s School of Life Sciences, in collaboration with David Bates, Professor of Oncology in the University’s Cancer Biology Unit, focuses on a signal protein called vascular endothelial growth factor (VEGF).

VEGF controls the re-growth of blood vessels in tissues which have been damaged by injury. It is a widely targeted compound for cancer, eye disease and other illnesses in which abnormal blood vessel growth occurs.

In cancer treatment, drugs are used to inhibit VEGF, which can otherwise lead to the formation of new blood vessels that provide oxygen and nutrients to tumors.

Professor Bates and colleagues had previously discovered in 2002 that VEGF comes in two forms and acts like a switch — one which turns on the growth of blood vessels and another that blocks growth.

Pain prevention

However, this latest research has shown for the first time that these two forms of VEGF not only act on blood vessels but also differently affect the sensory nerves that control pain.

The academics discovered that the VEGF that promotes blood vessel growth causes pain, while the other, which inhibits blood vessel growth, prevents pain.

The study has centered on understanding how these two types of VEGF work and why the body makes one form rather than the other.

The academics have been able to switch from the pain stimulating form to the pain inhibiting VEGF in animal models in the laboratory and are now investigating compounds to replicate this in humans. It is thought these compounds could form the basis for new drugs to be tested in humans in clinical trials.

The research has been funded by the Wellcome Trust, Diabetes UK, the British Heart Foundation and the Richard Bright VEGF Research Fund.


FCC Hires Firm To Help Convince TV Stations To Participate In Spectrum Auction

Chuck Bednar for redOrbit.com – Your Universe Online
In an attempt to win over TV stations reluctant to participate in next year’s spectrum incentive auction, the US Federal Communications Commission (FCC) has recruited an investment bank to draft a pitch document designed to highlight the benefits of participating, the Wall Street Journal reported on Tuesday.
According to WSJ writer Gautham Nagesh, the document will be sent to every US broadcaster eligible to take part in the 2015 auction, and will include dollar-value estimates in each market. “The broadcasters have thrown up roadblocks throughout the process and are currently challenging aspects of the FCC’s order in court,” he explained, “though they say they are fine with the auction as long as it remains voluntary.”
In order to help win over the skeptics, FCC Chairman Tom Wheeler has hired investment bank Greenhill & Co. to prepare the document with the goal of convincing stations that the auction – during which they will take bids to sell airwaves and either cease operations or relocate to a new channel – will provide them with a unique opportunity to earn extra money by parting with unwanted or unused spectrum.
“The auction represents in many ways an existential threat to the broadcasting industry, as it could result in dozens of stations going off the air,” Nagesh said. “The latest information from the FCC indicates the auction could affect more stations and markets than initially thought, partly because of the high projected prices stations can expect to receive for their spectrum in large urban and suburban markets.”
The incentive auction, which is set for mid-2015, will give wireless carriers their first opportunity to purchase low-frequency spectrum – but doing so will require television stations to be willing to part with it first, according to Rishika Sadam of Reuters.
Many station owners have been reluctant to commit to either going off the air or exchanging airwaves with one another, Sadam added, and Wheeler believes much of that centers around uncertainty that they would be properly compensated for their airwaves. An FCC official also said that the document will make it clear that the auction is not mandatory, and that TV stations are free to back out of the auction at any time.
“The success of the auction hinges on TV stations first volunteering to relinquish airwaves, for example going off air or sharing frequencies with another station,” explained Alina Selyukh, also of Reuters. “As broadcasters bid to sell their spectrum, wireless companies would bid to buy it, determining how much TV stations get paid.”
“Many TV station owners have been leery, unsure of the value of the sale or its risks,” Selyukh added.
The new packet, however, will show FCC projections which indicate that small and medium market TV stations could receive compensation comparable to those in Top 10 markets, where spectrum is considered to be the most valuable for wireless carriers. Representatives from Greenhill and the FCC also plan to meet with television executives in person over the next few months in order to explain the auction benefits, officials told Reuters.
“Senior FCC officials concede the presented estimates are high-end, based on an ‘optimal’ scenario where broadcasters give up 126 megahertz (MHz) of spectrum, which would mean 100 MHz ultimately would be sold to wireless companies to raise $45 billion,” Selyukh explained. “Other evaluations have estimated that 84 MHz of spectrum would be cleared for the auction, though FCC officials argue that may not necessarily mean less money raised depending on the size of the bids to buy that spectrum.”
The Commission’s estimates suggest that a station in the Wilkes Barre-Scranton, Pennsylvania region, which is the 54th largest television market in the US, could receive up to $150 million or a median of $140 million – more than a network in Washington, Boston or San Francisco, Reuters said. A station based in New York, the largest TV market in America, could get a maximum of $490 million or a median of $410 million, the news organization added.
“The FCC official cautioned that the figures are approximations, and said in many cases the final price would be lower depending on the demand for spectrum in specific markets,” said Nagesh. “The prices represent the value of a station’s spectrum, not the business as a whole, and in many cases appear to exceed the value of the stations themselves particularly with respect to smaller stations in large cities and suburbs.”
Like New York, larger markets such as Los Angeles ($340 million) and Philadelphia ($230 million) were among those where spectrum is expected to net the highest prices. However, smaller markets like Wilkes Barre-Scranton; Providence, Rhode Island; and Palm Springs, California were also expected to have spectrum valued at more than $100 million because of their proximity to larger urban areas, and even full power stations in secondary markets like Youngstown, Ohio, and West Palm Beach, Florida were expected to have a median value in excess of $90 million.
“The FCC’s model assumed the auction would produce revenue of $45 billion and free up 100 MHz of spectrum for the wireless carriers. Those figures can be adjusted depending on auction participation,” the Wall Street Journal reporter added. “The final formula used by the FCC to price stations must be approved by a commission vote, but the model uses a formula that should produce a similar result.”

Cold Cloud Of Toxic Hydrogen Cyanide Discovered Above South Pole Of Titan

Chuck Bednar for redOrbit.com – Your Universe Online
Analysis of data collected by NASA’s Cassini mission has revealed the presence of a large, cold, toxic cloud swirling above the south pole of Saturn’s moon Titan, according to research published in the October 2 edition of the journal Nature.
According to the US space agency, scientists from the Leiden Observatory in the Netherlands, the SRON Netherlands Institute for Space Research, the University of Bristol’s School of Earth Sciences, the LESIA-Observatoire de Paris and the Oxford University Department of Physics found that the giant polar vortex contained frozen particles of the toxic compound hydrogen cyanide (HCN).
“The discovery suggests that the atmosphere of Titan’s southern hemisphere is cooling much faster than we expected,” lead author Dr. Remco de Kok of Leiden Observatory and SRON Netherlands Institute explained in a statement Wednesday.
Titan is the only moon in the solar system that has a dense atmosphere, and like Earth it experiences seasons, the researchers explained. Titan takes 29 years to travel around the sun along with Saturn, and each of its seasons lasts approximately seven Earth years. The most recent season change took place in 2009, when spring took over for winter in the northern hemisphere and summer became autumn in the southern hemisphere.
In addition, the ESA explained that Titan’s atmosphere is dominated by nitrogen, and also contains small amounts of methane and other trace gases. Since the moon is roughly 10 times further away from the Sun than Earth, it is extremely cold, which allows methane and other hydrocarbons to rain onto the surface and form lakes and rivers.
Cassini first detected the swirling cloud in May 2012, when the southern hemisphere was experiencing autumn, NASA said. The cloud was several hundred miles across, and appeared to be an effect of the season change, the agency noted. However, scientists were puzzled by the altitude of the polar vortex – it is roughly 200 miles above Titan’s surface, and experts had believed that conditions there were too cold for clouds to form.
“We really didn’t expect to see such a massive cloud so high in the atmosphere,” Dr. de Kok said. To learn more about the cloud, the study authors pored over Cassini’s observations and discovered a key clue in the spectrum of sunlight reflected by the moon’s atmosphere.
“A spectrum splits the light from a celestial body into its constituent colors, revealing signatures of the elements and molecules present,” NASA explained. “Cassini’s visual and infrared mapping spectrometer (VIMS) maps the distribution of chemical compounds in Titan’s atmosphere and on its surface.”
Dr. de Kok explained that the light originating from the polar vortex was considerably different to other parts of Titan’s atmosphere, and noted that the researchers were able to clearly detect signs of frozen HCN molecules.
While HCN in gas form is present in small amounts in the moon’s atmosphere, the discovery of such molecules in the form of ice was surprising because HCN can only condense to form frozen particles if atmospheric temperatures reach levels of about minus 234 degrees Fahrenheit (minus 148 degrees Celsius) – about 200 degrees Fahrenheit (about 100 degrees Celsius) colder than theoretical models of Titan’s upper atmosphere currently predict.
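For readers who want to check the figures above, the two numbers are mutually consistent under the standard Fahrenheit-Celsius conversion (this worked arithmetic is an illustrative aside, not part of the original report):

```latex
% Reported HCN condensation threshold in Titan's upper atmosphere:
C = \tfrac{5}{9}(F - 32) = \tfrac{5}{9}(-234 - 32) \approx -148\ ^{\circ}\mathrm{C}

% A temperature *difference* converts without the 32-degree offset,
% so the quoted gap of about 200 F degrees corresponds to:
\Delta C = \tfrac{5}{9}\,\Delta F = \tfrac{5}{9}(200) \approx 111\ ^{\circ}\mathrm{C}
```

The roughly 111-Celsius-degree difference matches the article’s rounded figure of “about 100 degrees Celsius.”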
To verify their results, the study authors looked at observations from Cassini’s composite infrared spectrometer (CIRS), which measures atmospheric temperature at different altitudes. That data revealed rapid cooling in Titan’s southern hemisphere, demonstrating that it would be possible to reach temperatures cold enough to cause the formation of the giant toxic cloud seen on the south pole.
“Atmospheric circulation has been drawing large masses of gas towards the south since the change of season in 2009,” NASA explained. “As HCN gas becomes more concentrated there, its molecules shine brightly at infrared wavelengths, cooling the surrounding air in the process. Another factor contributing to this cooling is the reduced exposure to sunlight in Titan’s southern hemisphere as winter approaches there.”
“These fascinating results from a body whose seasons are measured in years rather than months provide yet another example of the longevity of the remarkable Cassini spacecraft and its instruments,” added Earl Maize, Cassini project manager at the Jet Propulsion Laboratory (JPL) in Pasadena, California. “We look forward to further revelations as we approach summer solstice for the Saturn system in 2017.”

Newly Discovered Rectangular Structure Sheds New Light On Moon Mystery

Chuck Bednar for redOrbit.com – Your Universe Online
Analysis of a massive rectangular feature buried just below the lunar surface by NASA’s Gravity Recovery and Interior Laboratory (GRAIL) spacecraft has revealed that the giant basin on the moon’s near side was likely created by ancient lava flows and not a massive asteroid collision, according to a new study.
Writing in the October 1 online edition of the journal Nature, lead author Colorado School of Mines Geophysics Associate Professor Jeff Andrews-Hanna and his colleagues explain that the rectangular structure is roughly 1,600 miles across (nearly as wide as the US) and could have been formed by magma-flooded rift valleys unlike anything else ever discovered on the moon.
The GRAIL scientists believe that the outline of the moon’s Procellarum region (also known as the Ocean of Storms) may have at one time resembled the rift zones found on the Earth, Mars and Venus. Furthermore, the gravity data being collected by GRAIL is allowing them to look beneath the surface at structures that are not visible by using the subtle gravitational pulls on the orbiting spacecraft, Andrews-Hanna and his colleagues noted.

Image Above: Earth’s moon as observed in visible light (left), topography (center, where red is high and blue is low), and the GRAIL gravity gradients (right). The Procellarum region is a broad region of low topography covered in dark mare basalt. The gravity gradients reveal a giant rectangular pattern of structures surrounding the region. Credit: NASA/GSFC/JPL/Colorado School of Mines/MIT
According to BBC News science correspondent Jonathan Amos, now that the scientists have learned of the feature’s existence, they are able to trace its outline, even in ordinary photographs. Andrews-Hanna told him that it covered approximately 17 percent of the lunar surface, and that in terms of area, it was roughly the same size as North America, Europe and Asia combined.
“It’s really amazing how big this feature is,” the professor told Amos on Wednesday. “When we first saw it in the Grail data, we were struck by how big it was, how clear it was, but also by how unexpected it was. No one ever thought you’d see a square or a rectangle on this scale on any planet.”
The study authors report that the Ocean of Storms region contains an abundance of naturally occurring radioactive elements, including uranium, thorium and potassium. During the earliest days of the Moon, these elements would have heated the crust, which would then have contracted as it cooled down.
“The angles seen in the edges of this valley are consistent with rift valleys on Earth,” said Rachel Feltman of the Washington Post, adding that the authors “believe that as the moon was cooling in the early days of its development, a rogue plume of magma shot up in this region.”
“Because the lava made this area so much hotter than the mostly-cooled rock around it, the surface cracked and shrank away from the cool surrounding crust,” Feltman added. Eventually, additional lava seeped out of this rectangular frame and filled the valley, forming Procellarum in the process.
However, the research team is not certain exactly how this magma plume would have reached the surface to begin with. Study co-author and MIT professor of geophysics and vice president for research Maria Zuber told the Washington Post that while some may continue to speculate that the moon’s surface was heated by an asteroid impact, which would have caused the plume to rise, there is no direct evidence to support that notion.
The rectangular shape was revealed by the ultra-precise lunar gravity map created by the GRAIL mission probes, said Space.com contributor Charles Q. Choi. He added that the newly discovered pattern of structures was similar in nature to those found on Enceladus, the icy moon of Saturn, suggesting that both moons had experienced the same type of geological history.
The discovery of these rift zones “reveals a much more dynamic early moon than we had previously envisioned. I think we are only just beginning to understand the earliest history of the moon,” Andrews-Hanna told Choi. He added that since previous research had not predicted the existence of these structures on either the moon or Enceladus, it reveals that there is “much left to learn in order to understand the full spectrum of planetary evolution.”
“Our gravity data is opening up a new chapter of lunar history, during which the Moon was a more dynamic place than suggested by the cratered landscape that is visible to the naked eye,” the professor added in a statement. However, he noted that additional research was needed to better understand the causes of this newly-discovered pattern of gravity anomalies, as well as its implications for the history of the Moon.

Smithsonian Scientists Discover Coral’s Best Defender Against An Army Of Sea Stars

Kathryn Sabella, Smithsonian

Coral reefs face a suite of perilous threats in today’s ocean. From overfishing and pollution to coastal development and climate change, fragile coral ecosystems are disappearing at unprecedented rates around the world. Despite this trend, some species of corals surrounding the island of Moorea in French Polynesia have a natural protector in their tropical environment: coral guard-crabs. New research from the National Museum of Natural History’s Smithsonian Marine Station scientist Seabird McKeon and the museum’s predoctoral fellow Jenna Moore of the Florida Museum of Natural History has helped unravel the complex symbiotic relationship between these crabs and the coral reefs they live in and defend. Details from this study are published in the Sept. 30 issue of the open-access journal, PeerJ.

The new research highlights the role of diversity in the healthy functioning of coral reef ecosystems and shows that guard-crab species and size classes offer different kinds of effective protection against various threats to coral reefs.

“We found that diversity in both species and size of coral guard-crabs is needed to adequately fend off coral predators,” said McKeon. “It is an example of how biodiversity is crucial to conserving reef environments and the essential resources they provide for thousands of species, including humans.”

Coral guard-crabs belong to the genus Trapezia and defend their habitat in coral reefs from predators called corallivores in exchange for shelter and nutrition. In 2008 and 2009, four species of coral guard-crabs known to protect coral from predator sea stars were studied in a series of experiments to examine the effectiveness of different species and various sizes of crabs at repelling multiple corallivores. At the time, crown-of-thorns sea stars, which can grow to the size of a trashcan lid and decimate coral reefs, experienced a population boom on Moorea, threatening the entire reef community found there.

In one trial, the research team selectively removed the largest species of coral guard-crab, T. flavopunctata, from corals in the path of the army of sea stars and observed the effects. The results were dramatic; corals without guard-crabs, or with other species of guard-crab, were eaten—usually overnight.

“Seemingly small differences among crabs guarding their coral homes can have big effects on coral survival,” said Moore. “Not only does the level of protection provided vary by species, but the smallest crabs were defending the coral from coral-eating snails, a threat that larger crabs ignored.”

The team concluded that multiple lines of defense are a direct result of guard-crab diversity and will be necessary to keep coral reefs safe long-term. Additional surveys revealed that the host corals were not the only species that benefitted from the diverse guard-crab ecosystem. Small corals of other species sheltering in the shadow of the crab-coral symbiosis were also shielded from corallivore predation.

Coral reefs have long protected humans living in coastal areas, acting as barriers against storms, wave damage and erosion. They are also a rich source of food for people around the globe, providing habitat and nurseries for as much as one-quarter of all ocean species. As waters continue to warm and ocean acidification changes the chemistry of the Earth’s marine systems, corals, and the incredible diversity of life they support, are at risk of vanishing.

The Smithsonian Marine Station in Ft. Pierce, Fla., is a research center specializing in marine biodiversity and ecosystems. The Marine Station, a facility of the National Museum of Natural History, draws scientists and students from the Smithsonian and collaborating institutions to investigate the plants, animals and physical processes in the ocean and Indian River Lagoon.


Improving Babies’ Language Skills Before They’re Even Old Enough To Speak

Rob Forman, Rutgers University

A Rutgers researcher focuses infants on noticing the sounds that are most important

In the first months of life, when babies begin to distinguish sounds that make up language from all the other sounds in the world, they can be trained to more effectively recognize which sounds “might” be language, accelerating the development of the brain maps which are critical to language acquisition and processing, according to new Rutgers research.

The study by April Benasich and colleagues of Rutgers University-Newark is published in the October 1 issue of the Journal of Neuroscience.

The researchers trained 4-month-old babies to pay attention to increasingly complex non-language audio patterns, rewarding them with a brief video when they correctly shifted their eyes as the sound changed slightly. Brain scans at 7 months old showed these infants were faster and more accurate at detecting other sounds important to language than babies who had not been exposed to the sound patterns.

“Young babies are constantly scanning the environment to identify sounds that might be language,” says Benasich, who directs the Infancy Studies Laboratory at the University’s Center for Molecular and Behavioral Neuroscience. “This is one of their key jobs – as between 4 and 7 months of age they are setting up their pre-linguistic acoustic maps. We gently guided the babies’ brains to focus on the sensory inputs which are most meaningful to the formation of these maps.”

[ Watch Video: Helping Babies Learn Language Skills ]

Acoustic maps are pools of interconnected brain cells that an infant brain constructs to allow it to decode language both quickly and automatically – and well-formed maps allow faster and more accurate processing of language, a function that is critical to optimal cognitive functioning. Benasich says babies of this particular age may be ideal for this kind of training.

“If you shape something while the baby is actually building it,” she says, “it allows each infant to build the best possible auditory network for his or her particular brain. This provides a stronger foundation for any language (or languages) the infant will be learning. Compare the baby’s reactions to language cues to an adult driving a car. You don’t think about specifics like stepping on the gas or using the turn signal. You just perform them. We want the babies’ recognition of any language-specific sounds they hear to be just that automatic.”

Benasich says she was able to accelerate and optimize the construction of babies’ acoustic maps, as compared to those of infants who either passively listened or received no training, by rewarding the babies with a brief colorful video when they responded to changes in the rapidly varying sound patterns. The sound changes could take just tens of milliseconds, and became more complex as the training progressed.

Looking for lasting improvement in language skills

“While playing this fun game we can convey to the baby, ‘Pay attention to this. This is important. Now pay attention to this. This is important,’” says Benasich, “This process helps the baby to focus tightly on sounds in the environment that ‘may’ have critical information about the language they are learning. Previous research has shown that accurate processing of these tens-of-milliseconds differences in infancy is highly predictive of the child’s language skills at 3, 4 and 5 years.”

The experiment has the potential to provide lasting benefits. The EEG (electroencephalogram) scans showed the babies’ brains processed sound patterns with increasing efficiency at 7 months of age after six weekly training sessions. The research team will follow these infants through 18 months of age to see whether they retain and build upon these abilities with no further training. That outcome would suggest to Benasich that once the child’s earliest acoustic maps are formed in the most optimal way, the benefits will endure.

Benasich says this training has the potential to advance the development of typically developing babies as well as children at higher risk for developmental language difficulties. For parents who think this might turn their babies into geniuses, the answer is – not necessarily. Benasich compares the process of enhancing acoustic maps to some people’s wishes to be taller. “There’s a genetic range to how tall you become – perhaps you have the capacity to be 5’6” to 5’9”,” she explains. “If you get the right amounts and types of food, the right environment, the right exercise, you might get to 5’9” but you wouldn’t be 6 feet. The same principle applies here.”

Benasich says it’s very likely that one day parents at home will be able to use an interactive toy-like device – now under development – to mirror what she accomplished in the baby lab and maximize their babies’ potential. For the 8 to 15 percent of infants at highest risk for poor acoustic processing and subsequent delayed language, this baby-friendly behavioral intervention could have far-reaching implications and may offer the promise of improving or perhaps preventing language difficulties.


Urban vs Rural Homes: Understanding The Health Benefits

Families and individuals often find themselves questioning where they should be living, in the city or in the country. The question of whether urban living is better or worse than rural living involves a number of issues and factors, but the bottom line is choosing a place to live that provides the best quality of life for that family or individual.

When considering whether to live in an urban or rural area, health and wellness is always a top concern. It is a medical fact that living in certain environments and under certain circumstances can directly and indirectly affect one’s health and wellbeing, so trying to determine which environment would be the best is not only about finding a home that makes one happy but also what makes one healthy. This article will take a look at the various benefits of living in both urban homes and rural homes.

Urban home advantages

Living in an urban environment means living with high levels of noise, activity and environmental pollution. Urban environments can have higher incidence of crime, large class sizes in schools and a lower ratio of available jobs to unemployed job seekers. However, there are a number of positive factors that make living in an urban home advantageous.

Urban living offers its residents diversity and choice, two factors that can greatly contribute to improved quality of life for an individual or family living there. For example, urban areas are characterized by a great diversity in cultures and ethnicities, with unique neighborhood enclaves featuring clothing, food and cultural events. The opportunity to consume a variety of natural foods, unique dishes, herbs and spices will encourage healthier eating and healthier lifestyles, while enjoying the many ethnic fairs and cultural events that occur in these neighborhoods offers local residents the opportunity to learn new things and meet new and interesting people.

Urban living is frequently communal, though social interactions may be limited to some extent. Urban communities center on the neighborhood, whether it is a block of apartment buildings or a high-rise complex of flats, and the residents and businesses in that neighborhood are a part of that community. It is not unusual for residents to be on a first-name basis with a local store owner or pharmacist. Many of these living communities will offer amenities to promote health and wellness among the residents such as fitness programs, organized excursions and even clubs and organizations for residents to enjoy each other’s company and put the day’s stress aside.

Urban areas also offer almost unlimited access to social events, entertainment and cultural activities. Residents have the opportunity to interact with people from different cultures and classes in a variety of circumstances, broadening their cultural horizons and improving their quality of life. Urban centers also offer a wider range of educational and employment options, providing residents with opportunities and alleviating stress by making it possible to achieve goals more easily. Urban areas offer greater access to healthcare services, giving residents greater ability to manage their health with the assistance of medical staff, and they place greater emphasis on walking as a way to get around, encouraging basic fitness and consequent improvement in general health and wellbeing.

Rural home advantages

Living in a rural area may not offer the diversity and choice that living in an urban center does, but rural living definitely has its own noteworthy advantages and health benefits. Stress can be one of the greatest strains on the quality of life for an individual, bringing on illness, depression and other health issues. The absence of urban stressors such as traffic jams, high crime rates, and long lines is one of the most appealing aspects of rural living.

Another important aspect of rural living involves the development and fostering of personal relationships. While urban living provides access to a greater variety of social networks and networking opportunities, relationships in rural areas tend to be deeper and longer lasting; it is not unusual for rural residents to begin school together and continue to be friends until graduation from high school and beyond.

Personalized education is another positive aspect of rural living. Rural schools usually have smaller class sizes, allowing teachers to focus more on individual students, and allowing those students to focus more on their own personal potential rather than simply competing with the masses.

Nutrition is another advantage of living in rural areas. Rural areas have more immediate access to fresh fruits and vegetables as well as more direct access to organic products. While fine dining options may be fewer, restaurants and grocers may offer produce picked that same day, and meat and dairy products are also available freshly butchered and even unprocessed.

The greatest health advantage of rural living is environmental. Rural areas are usually far from the smog and pollution of urban areas, offering clean, fresh air and plenty of open green spaces. Being outdoors in the fresh air benefits not only physical health but also mental health. Fresh air and open spaces have a positive effect on an individual’s mental state, reducing the stress that, left unchecked, can damage the physical body and lead to weight gain and even the development of cardiac illnesses and ailments. Spending time outdoors allows one to decompress, clearing the head and refreshing the body.

Urban living vs. rural living

After comparing the pros and cons of the two modes of living, it is easy to see that both urban and rural homes have health benefits and health disadvantages. While urban living offers superior access to employment, healthcare, education options and cultural diversity, rural living offers residents a greater opportunity to connect with other people and to be more directly connected with nature. Both settings offer high degrees of socialization, though such interactions tend to be less formal and more intimate in a rural environment.

US Authorities Arrest, Charge Four Members Of Xbox Underground Hacking Ring

Chuck Bednar for redOrbit.com – Your Universe Online
Four men who were part of an alleged international computer hacking operation have been charged with stealing over $100 million worth of intellectual property from Microsoft, the US Army and various top video game publishers, officials from the US Justice Department revealed on Tuesday.
According to Reuters reporter Aruna Viswanatha, Sanadodeh Nesheiwat, 28, of New Jersey and David Pokora, 22, of Ontario, Canada pleaded guilty to charges in Delaware federal court. Nathan Leroux, 20, of Maryland, and Austin Alcala, 18, of Indiana, were also charged in an 18-count superseding indictment that had been unsealed earlier in the day.
A fifth suspect is an Australian citizen who has been charged in his native country for his role in the activity, the Justice Department said. Prosecutors said the ring accessed and stole source code, technical specifications and other data from the computer networks of Microsoft and several of its partners, Viswanatha added.
Furthermore, the hackers are accused of stealing information about pre-release versions of the video game titles Gears of War 3 and Call of Duty: Modern Warfare 3, as well as logging into a US Army network to swipe simulator software for the Boeing Apache attack helicopter, Justice Department officials told Reuters on Tuesday.
“Those attacks occurred after the ring hacked into the network of Zombie Studios, a Seattle-based video game developer contracted by the Army to make the training software, according to the indictment,” said Viswanatha. “The men allegedly obtained access to the computer networks partly by using the stolen user names and passwords of employees at the partner firms.”
The indictment dates back to April, and the Justice Department has placed the value of the stolen technology between $100 million and $200 million, according to Nicky Woolf of The Guardian. Pokora and Nesheiwat each pleaded guilty to a single count of conspiracy to commit computer fraud and copyright infringement. Their sentencing is scheduled for January, and they face a maximum of five years in prison.
The hackers were members of a ring known as Xbox Underground, and in addition to stealing trade secrets and financial data, Gizmodo’s Adam Clark Estes reported that they even manufactured and sold a counterfeit Xbox One prior to the hardware’s release. In addition to those already charged, authorities believe there are six others involved, he added.
Assistant US attorney Ed McAndrew told Woolf that FBI officials in Delaware were first tipped off about Xbox Underground’s activity by an anonymous informant in January 2011, and that the gaming companies involved cooperated fully with the probe. Pokora, who was viewed by the other members of the group as a leader, was arrested back in March at a border crossing in Lewiston, New York.
The Justice Department also revealed that an Australian citizen has been charged under that country’s law for his alleged role in the hacking ring. The individual was not named, but The Guardian (citing Australian media reports) identified him as 19-year-old Perth native Dylan Wheeler. Wheeler, who was 17 at the time, was the one who listed the homemade Xbox One prototype on eBay at a time when the console was still under development.
“As the indictment charges, the members of this international hacking ring stole trade secret data used in high-tech American products, ranging from software that trains US soldiers to fly Apache helicopters to Xbox games that entertain millions around the world,” Assistant Attorney General Caldwell said, according to Woolf.

Google Triples Maximum Bounty For Discovery Of Chrome Vulnerabilities

Chuck Bednar for redOrbit.com – Your Universe Online
Citing the extra effort required to find vulnerabilities in Chrome, Google has announced that it would be tripling the maximum bounty that bug hunters could earn by finding flaws in its web browser from $5,000 to $15,000.
“Due in part to our collaboration with the research community, we’ve squashed more than 700 Chrome security bugs and have rewarded more than $1.25 million through our bug reward program. But as Chrome has become more secure, it’s gotten even harder to find and exploit security bugs,” Tim Willis of the Chrome Security Team wrote in a blog post Tuesday.
“We’ll pay at the higher end of the range when researchers can provide an exploit to demonstrate a specific attack path against our users. Researchers now have an option to submit the vulnerability first and follow up with an exploit later,” he added. “We believe that this is a win-win situation for security and researchers: we get to patch bugs earlier and our contributors get to lay claim to the bugs sooner, lowering the chances of submitting a duplicate report.”
Under the new policy, the Mountain View, California-based software giant is paying between $500 and $15,000 to white-hat hackers based on the severity of the security flaw they locate, explained ZDNet’s Charlie Osborne. However, Google went on to note that in special cases, hackers could be eligible for more, as was the case when one researcher earned $30,000 for detecting serious exploits that could be used to circumvent the Chrome sandbox, she added.
Willis also noted that anyone compensated under the Chrome bounty program would have their name listed in the Google Hall of Fame, so they would have “something to print out and hang on the fridge,” and that the company would be paying submissions dating back to July 1 under the new, higher-level rewards programs.
“Now at least a decade old, bug bounties have become a way for tech firms to pay security researchers for their efforts without hiring them as full-time employees,” said Seth Rosenblatt of CNET. “The bounty programs benefit companies by not only finding security holes early, but keeping those vulnerabilities from being sold on the black market.”
Rosenblatt added that when Google initially launched its Chrome bounty program in 2010, it was criticized by some security experts who argued that the company was not offering enough compensation to make the bug-hunt worthwhile. Since then, however, the company has become known for generously rewarding those who submit particularly hard-to-find vulnerabilities. Google has also vowed to be more transparent about its payment scale.
“Bug bounty programs have proven fruitful for large Web companies such as Google and Facebook, who can attract a greater number of eyes to their software without hiring more security analysts,” said PC World’s Jeremy Kirk. “But independent researchers have a lot of options for selling vulnerabilities through professional brokers such as Vupen and Netragard to cybercriminals looking for new vulnerabilities they can use to spread malware.”
“We understand that our cash reward amounts can be less than these alternatives, but we offer you public acknowledgement of your skills and how awesome you are, a quick fix and an opportunity to openly blog/talk/present on your amazing work,” Willis added. “Also, you’ll never have to be concerned that your bugs were used by shady people for unknown purposes.”

Mars Exploration, Earth Observation The Focus Of Joint US And India Collaboration

Chuck Bednar for redOrbit.com – Your Universe Online
Just days after their respective spacecraft successfully entered orbit around Mars, top officials from NASA and the Indian Space Research Organisation (ISRO) have signed an agreement vowing to work together on future exploration missions involving the Red Planet, officials at the US space agency have announced.
At the International Astronautical Congress in Toronto on Tuesday, NASA Administrator Charles Bolden and ISRO chairman K. Radhakrishnan signed a charter establishing a joint NASA-ISRO Mars Working Group in order to enhance cooperation between the two organizations, as well as an international agreement detailing how they will join forces on the forthcoming NASA-ISRO Synthetic Aperture Radar (NISAR) mission.
“The signing of these two documents reflects the strong commitment NASA and ISRO have to advancing science and improving life on Earth,” Bolden said in a statement Tuesday. “This partnership will yield tangible benefits to both our countries and the world.”
According to NASA, the Mars Working Group will look to identify and implement scientific and technological goals that NASA and ISRO share regarding Mars exploration. The group plans to meet once per year to plan cooperative activities and programs, including potential future joint missions to Mars.
On September 21, NASA’s Mars Atmosphere and Volatile EvolutioN (MAVEN) spacecraft successfully entered orbit around the Red Planet, making it the first spacecraft dedicated to studying the upper atmosphere of the planet. Two days later, it was joined by ISRO’s Mangalyaan orbiter, allowing India to become the first Asian nation to reach the Red Planet and the only space program to ever successfully accomplish the feat on its first attempt.
One of the objectives of the Mars Working Group will be to explore potential coordinated observations and science analysis between the two vehicles, as well as other current and future Mars missions, the US space agency explained. John Grunsfeld, NASA associate administrator for science, said that the two countries had “a long history of collaboration in space science,” and that the new agreements would “significantly strengthen our ties and the science that we will be able to produce as a result.”
In addition, the two agencies have agreed to join forces on the NISAR Earth-observation satellite. NISAR, which is currently scheduled to launch in 2020, will measure the causes and consequences of land surface changes throughout the world, with potential research targets including ecosystem disturbances, ice sheet collapses and natural disasters. The mission is designed to measure subtle changes of the planet’s surface and improve the scientific community’s understanding of the effects of climate change.
“NISAR will be the first satellite mission to use two different radar frequencies (L-band and S-band) to measure changes in our planet’s surface less than a centimeter across,” NASA said. “This allows the mission to observe a wide range of changes, from the flow rates of glaciers and ice sheets to the dynamics of earthquakes and volcanoes.”
Under the terms of the new agreement, the NISAR mission’s L-band synthetic aperture radar (SAR) will be provided by NASA, along with a high-rate communication subsystem designed for science data, GPS receivers, a solid state recorder, and a payload data subsystem. Their ISRO colleagues, on the other hand, will provide the spacecraft bus, an S-band SAR, the launch vehicle and associated launch services required for the project.
NASA explained that agency scientists had been analyzing SAR mission concepts since the National Academy of Science’s decadal survey of its Earth sciences program in 2007. NASA first developed a partnership with ISRO in 2008, and the partnership has been governed by a framework agreement signed by both agencies at that time.
“This cooperation includes a variety of activities in space sciences such as two NASA payloads – the Mini-Synthetic Aperture Radar (Mini-SAR) and the Moon Mineralogy Mapper – on ISRO’s Chandrayaan-1 mission to the moon in 2008,” the US space agency concluded. “During the operational phase of this mission, the Mini-SAR instrument detected ice deposits near the moon’s northern pole.”
Image 2 (below): NASA Administrator Charles Bolden (left) and Chairman K. Radhakrishnan of the Indian Space Research Organisation signing documents in Toronto on Sept. 30, 2014 to launch a joint Earth-observing satellite mission and establish a pathway for future joint missions to explore Mars. Credit: NASA

Antioxidant Found In Grapes Can Help Treat Acne

Rachel Champeau, University of California – Los Angeles Health Sciences

Got grapes? UCLA researchers have demonstrated how resveratrol, an antioxidant derived from grapes and found in wine, works to inhibit growth of the bacteria that causes acne.

The team also found that combining resveratrol with a common acne medication, benzoyl peroxide, may enhance the drug’s ability to kill the bacteria and could translate into new treatments.

Published in the current online edition of the journal Dermatology and Therapy, the early lab findings demonstrated that resveratrol and benzoyl peroxide attack the acne bacteria, called Propionibacterium acnes, in different ways.

Resveratrol is the same substance that has prompted some doctors to recommend that adults drink red wine for its heart-health properties. The antioxidant stops the formation of free radicals, which cause cell and tissue damage. Benzoyl peroxide is an oxidant that works by creating free radicals that kill the acne bacteria.

“We initially thought that since actions of the two compounds are opposing, the combination should cancel the other out, but they didn’t,” said Dr. Emma Taylor, the study’s first author and an assistant clinical professor of medicine in the division of dermatology at the David Geffen School of Medicine at UCLA. “This study demonstrates that combining an oxidant and an antioxidant may enhance each other and help sustain bacteria-fighting activity over a longer period of time.”

The team grew colonies of the bacteria that causes acne and then added various concentrations of resveratrol and benzoyl peroxide both alone and together. The researchers monitored the cultures for bacterial growth or killing for 10 days.

They found that benzoyl peroxide was able to initially kill the bacteria at all concentration levels, but the effect was short lived and didn’t last beyond the first 24 hours.

Resveratrol didn’t have a strong killing capability, but it inhibited bacterial growth for a longer period of time. Surprisingly, the two compounds together proved the most effective in reducing bacteria counts.

“It was like combining the best of both worlds and offering a two-pronged attack on the bacteria,” said senior author Dr. Jenny Kim, professor of clinical medicine in the division of dermatology at the Geffen School.

Scientists have understood for years how benzoyl peroxide works to treat acne, but less has been known about what makes resveratrol effective — even though it has been the subject of previous studies. Using a high-powered microscope, the UCLA researchers observed that bacterial cells lost some of the structure and definition of their outer membranes, which indicated that resveratrol may work by altering and possibly weakening the structure of the bacteria.

The researchers also cultured human skin cells and blood cells with the two compounds to test their toxicity. They found that benzoyl peroxide was much more toxic than resveratrol, which could help explain what causes skin to become red and irritated when it’s used as a topical treatment in high dose or concentration.

Taylor noted that combining the two compounds allowed for prolonged antibacterial effects on the acne bacteria while minimizing its toxicity to other skin cells. The finding could lead to a more effective and less irritating topical acne therapy.

“We hope that our findings lead to a new class of acne therapies that center on antioxidants such as resveratrol,” Taylor said.

The next stage of research will involve further laboratory testing to better understand the mechanism of the two compounds. Additional research will be needed to validate the findings in patients.

Millions suffer from acne, and it has a significant psychosocial effect on patients, but limited progress has been made in developing new strategies for treating it. According to researchers, antibiotic resistance and side effects limit the efficacy of the current treatments, which include benzoyl peroxide, retinoids, antibiotics and Accutane (isotretinoin).

Asthma May Be Linked To Lack Of Ventilation For Gas Stoves

Michelle Klampe, Oregon State University
Parents with children at home should use ventilation when cooking with a gas stove, researchers from Oregon State University are recommending, after a new study showed an association between gas kitchen stove ventilation and asthma, asthma symptoms and chronic bronchitis.
“In homes where a gas stove was used without venting, the prevalence of asthma and wheezing is higher than in homes where a gas stove was used with ventilation,” said Ellen Smit, an associate professor in the College of Public Health and Human Sciences at OSU and one of the study’s authors. “Parents of all children should use ventilation while using a gas stove.”
Researchers can’t say that gas stove use without ventilation causes respiratory issues, but the new study clearly shows an association between having asthma and use of ventilation, Smit said. More study is needed to understand that relationship, including whether emissions from gas stoves could cause or exacerbate asthma in children, the researchers said.
Asthma is a common chronic childhood disease, and an estimated 48 percent of American homes have a gas stove in use. Gas stoves are known to affect indoor air pollution levels, and researchers wanted to better understand the links between air pollution from gas stoves, parents’ behavior when operating gas stoves and respiratory issues, said Eric Coker, a doctoral student in public health and a co-author of the study.
The study showed that children who lived in homes where ventilation such as an exhaust fan was used when cooking with gas stoves were 32 percent less likely to have asthma than children who lived in homes where ventilation was not used. Children in homes where ventilation was used while cooking with a gas stove were 38 percent less likely to have bronchitis and 39 percent less likely to have wheezing. The study also showed that lung function, an important biological marker of asthma, was significantly better among girls from homes that used ventilation when operating their gas stove.
Many people in the study also reported using their gas stoves for heating, researchers found. That was also related to poorer respiratory health in children, particularly when ventilation was not used. In homes where the gas kitchen stove was used for heating, children were 44 percent less likely to have asthma and 43 percent less likely to have bronchitis if ventilation was used. The results did not change even when asthma risk factors such as pets or cigarette smoking inside the home were taken into account, Coker said.
“Asthma is one of the most common diseases in children living in the United States,” said Molly Kile, the study’s lead author. Kile is an environmental epidemiologist and assistant professor at OSU. “Reducing exposure to environmental factors that can exacerbate asthma can help improve the quality of life for people with this condition.”
The findings were published recently in the journal “Environmental Health.” Co-authors included John Molitor and Anna Harding of the College of Public Health and Human Sciences and Daniel Sudakin of the College of Agricultural Sciences. The research was supported by OSU.
Researchers used data from the Third National Health and Nutrition Examination Survey, or NHANES, conducted by the National Center for Health Statistics from 1988 to 1994. The data collected for NHANES represent a nationally representative sample of the U.S. population.
The third edition of the survey is the only one in which questions about use of gas stoves were asked, Coker said. Participants were interviewed in their homes and also underwent physical exams and lab tests.
Researchers examined data from about 7,300 children ages 2-16 who had asthma, wheezing or bronchitis and whose parents reported using a gas stove in the home. Of those who reported using no ventilation, 90 percent indicated they did not have an exhaust system or other ventilation in their homes, Coker said.
Even though the study relies on older data, the findings remain relevant because many people still use gas stoves for cooking, and in some cases, for heat in the winter, the researchers said.
“Lots of older homes lack exhaust or other ventilation,” Coker said. “We know this is still a problem. We don’t know if it is as prevalent as it was when the data was collected.”
Researchers suggest that future health surveys include questions about gas stove and ventilation use. That would allow them to see if there have been any changes in ventilation use since the original data was collected.
“More research is definitely needed,” Coker said. “But we know using an effective ventilation system will reduce air pollution levels in a home, so we can definitely recommend that.”

Windows 10: Microsoft Officially Unveils Newest Version Of Its Operating System

Chuck Bednar for redOrbit.com – Your Universe Online
As anticipated, Microsoft unveiled the next version of its venerable Windows operating system on Tuesday, opting against calling it “Threshold” (its internal codename), “Windows 9” (the logical numerical successor to its previous OS, Windows 8) or “Windows One” (following in the footsteps of its Xbox line) and instead dubbing it “Windows 10.”
Somewhat unorthodox naming practices aside, Microsoft clearly has big plans for the latest edition of its flagship product line, according to Reuters reporter Bill Rigby. In fact, Terry Myerson, the Washington-based company’s head of operating systems, vowed that it would be “our greatest enterprise platform ever” at the San Francisco-based event.
Windows 10 will be the successor to the “largely unpopular” Windows 8, Rigby said, and will mark the return of the traditional Start button menu, much to the delight of the “many PC users” which the Reuters reporter said had “demanded” its return after Microsoft omitted it in the previous, mobile-friendly version. The new OS is due out next year and will feature both touch-control and mouse-and-keyboard control methods for use with different devices.
“Windows 10 represents the first step of a whole new generation of Windows,” Myerson said at Tuesday’s event, according to Brier Dudley of the Seattle Times. In addition to offering variations based on the type of hardware it is running on (smartphones, tablets, laptops, desktops, etc.), Myerson said the top priority was that the new platform should seem familiar to business users.
Microsoft Vice President Joe Belfiore also took the stage during the presentation to provide a demo of the new system in action, Dudley said. His presentation revealed a Start menu that combines a traditional “most used” list of files or programs, as well as a search box and a panel containing Windows 8-style “live tiles.” Belfiore said that the tiles were popular among customers, and that their height and width can be customized, the Times reporter added.
“It gives the familiarity of Windows 7 with some of the new benefits that exist in Windows 8,” Belfiore said, adding that the menu combines traditional style Win32 apps with programs designed especially for Windows 8 and Windows Phone and distributed through the company’s app store. In addition, he revealed that launching an app will no longer force the client to switch to the modern interface from the traditional desktop or vice versa in Windows 10.
“Windows 10 will also allow users to work with multiple desktops,” added Alex Wilhelm and Frederic Lardinois of TechCrunch. “Thanks to Microsoft’s new ‘Snap Assist’ UI, the company is making it easier for these power users who need these multiple desktops to grab apps from multiple desktops and move them around… And yes – if you really love your keyboard, you can always drop back down into the command line, too, which has also been improved quite a bit.”
“While Microsoft focused mostly on the regular mouse and keyboard combo for interacting with the operating system, the company stressed that it is not giving up on touch,” they added. “Some of the gestures will change a bit in Windows 10 (swiping in from the left now gets you a task view, for example), but the overall feature set seems to be very similar to that in Windows 8 and even the Windows 8 Charms bar is still available.”
According to ZDNet’s Mary Jo Foley, Microsoft is expected to release a preview test build of Windows 10 within the next few days, and Wilhelm and Lardinois noted the company will launch a Windows Insider Program on Wednesday which will give PC and laptop users access to a very early beta version of the OS.
Sources report the final version of Windows 10 should be available by Spring 2015, according to Foley. The cost of the software has not been revealed, but earlier this week, Justin Haywald of Gamespot published a report suggesting that Windows 8 users would be able to upgrade to the next-gen version for free. Officials from Microsoft told him they would not comment on those rumors at that time.

Simulations Reveal An Unusual Death For Ancient Stars

Linda Vu, Berkeley Lab Computing Sciences
Findings made possible with NERSC resources and Berkeley Lab Code
Certain primordial stars—those between 55,000 and 56,000 times the mass of our Sun, or solar masses—may have died unusually. In death, these objects—among the Universe’s first generation of stars—would have exploded as supernovae and burned completely, leaving no remnant black hole behind.
Astrophysicists at the University of California, Santa Cruz (UCSC) and the University of Minnesota came to this conclusion after running a number of supercomputer simulations at the Department of Energy’s (DOE’s) National Energy Research Scientific Computing Center (NERSC) and Minnesota Supercomputing Institute at the University of Minnesota. They relied extensively on CASTRO, a compressible astrophysics code developed at DOE’s Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) Computational Research Division (CRD). Their findings were recently published in the Astrophysical Journal (ApJ).
First-generation stars are especially interesting because they produced the first heavy elements, or chemical elements other than hydrogen and helium. In death, they sent their chemical creations into outer space, paving the way for subsequent generations of stars, solar systems and galaxies. With a greater understanding of how these first stars died, scientists hope to glean some insights about how the Universe, as we know it today, came to be.
“We found that there is a narrow window where supermassive stars could explode completely instead of becoming a supermassive black hole—no one has ever found this mechanism before,” says Ke-Jung Chen, a postdoctoral researcher at UCSC and lead author of the ApJ paper. “Without NERSC resources, it would have taken us a lot longer to reach this result. From a user perspective, the facility is run very efficiently and it is an extremely convenient place to do science.”
The Simulations: What’s Going On?
To model the life of a primordial supermassive star, Chen and his colleagues used a one-dimensional stellar evolution code called KEPLER. This code takes into account key processes like nuclear burning and stellar convection, as well as processes relevant for massive stars: photo-disintegration of elements, electron-positron pair production and special relativistic effects. The team also included general relativistic effects, which are important for stars above 1,000 solar masses.
They found that primordial stars between 55,000 and 56,000 solar masses live about 1.69 million years before becoming unstable due to general relativistic effects and beginning to collapse. As the star collapses, it begins to rapidly synthesize heavy elements like oxygen, neon, magnesium and silicon, starting with the helium in its core. This process releases more energy than the binding energy of the star, halting the collapse and causing a massive explosion: a supernova.
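The energy argument above can be sketched as a back-of-the-envelope comparison. The numbers below are illustrative order-of-magnitude assumptions (the stellar radius, the specific energy release from explosive helium burning, and the burned mass fraction are not taken from the paper); the point is only that the nuclear energy released can exceed the star’s gravitational binding energy, unbinding the star completely:

```python
G = 6.674e-8          # gravitational constant, cgs units
MSUN = 1.989e33       # solar mass in grams

# Illustrative (assumed) parameters for a ~55,500 solar-mass star:
mass = 55_500 * MSUN  # stellar mass, g
radius = 1.0e14       # assumed pre-collapse radius, cm (order of magnitude)
q_burn = 7.0e17       # ~erg/g released burning helium toward heavier elements
f_burned = 0.5        # assumed fraction of the star that burns explosively

# Gravitational binding energy, rough uniform-sphere estimate: E ~ 3GM^2 / 5R
e_binding = 3.0 * G * mass**2 / (5.0 * radius)

# Nuclear energy released by explosive burning
e_nuclear = q_burn * f_burned * mass

print(f"binding energy  ~ {e_binding:.1e} erg")
print(f"nuclear release ~ {e_nuclear:.1e} erg")
print("explosion wins" if e_nuclear > e_binding else "collapse wins")
```

With these assumed values the nuclear release comes out roughly an order of magnitude above the binding energy, which is the regime in which the collapse is reversed and no black hole is left behind.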
To model the death mechanisms of these stars, Chen and his colleagues used CASTRO—a multidimensional compressible astrophysics code developed at Berkeley Lab by scientists Ann Almgren and John Bell. These simulations show that once collapse is reversed, Rayleigh-Taylor instabilities mix heavy elements produced in the star’s final moments throughout the star itself. The researchers say that this mixing should create a distinct observational signature that could be detected by upcoming near-infrared experiments such as the European Space Agency’s Euclid and NASA’s Wide-Field Infrared Survey Telescope.
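The Rayleigh-Taylor mixing the simulations capture can be illustrated with textbook linear theory, which is not specific to this paper: a small perturbation at an interface where a dense fluid decelerates into a lighter one grows at a rate set by the density contrast (the Atwood number), the effective gravity and the perturbation wavelength. All the numbers below are made-up sample values, used only to show the scaling:

```python
import math

# Illustrative (assumed) values, not taken from the CASTRO simulations:
rho_heavy = 1.0e2    # g/cm^3, denser burned material
rho_light = 1.0e1    # g/cm^3, lighter unburned material
g = 1.0e4            # cm/s^2, effective deceleration at the interface
wavelength = 1.0e10  # cm, perturbation wavelength

# Atwood number measures the density contrast across the interface
atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)

# Linear-theory growth rate of an incompressible RT mode: sigma = sqrt(A g k)
k = 2.0 * math.pi / wavelength
sigma = math.sqrt(atwood * g * k)

print(f"Atwood number: {atwood:.2f}")
print(f"e-folding time of the instability: {1.0 / sigma:.0f} s")
```

Short-wavelength modes grow fastest in this linear picture; resolving that multi-scale growth in three dimensions is what makes codes like CASTRO necessary.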
Depending on the intensity of the explosion, some supermassive stars could, when they explode, enrich their entire host galaxy and even some nearby galaxies with elements ranging from carbon to silicon. In some cases, a supernova may even trigger a burst of star formation in its host galaxy, which would make it visually distinct from other young galaxies.
“My work involves studying the supernovae of very massive stars with new physical processes beyond hydrodynamics, so I’ve collaborated with Ann Almgren to adapt CASTRO for many different projects over the years,” says Chen. “Before I run my simulations, I typically think about the physics I need to solve a particular problem. I then work with Ann to develop some code and incorporate it into CASTRO. It is a very efficient system.”
To visualize his data, Chen used an open source tool called VisIt, which was architected by Hank Childs, formerly a staff scientist at Berkeley Lab. “Most of the time I did my own visualizations, but when there were things that I needed to modify or customize I would shoot Hank an email and that was very helpful.”
Chen completed much of this work while he was a graduate student at the University of Minnesota. He completed his Ph.D. in physics in 2013.