Meditation may slow brain aging, study says

Provided by Mark Wheeler, University of California, Los Angeles (UCLA)
Since 1970, life expectancy around the world has risen dramatically, with people living more than 10 years longer. That’s the good news.
The bad news is that starting when people are in their mid-to-late-20s, the brain begins to wither — its volume and weight begin to decrease. As this occurs, the brain can begin to lose some of its functional abilities.
So although people might be living longer, the years they gain often come with increased risks for mental illness and neurodegenerative disease. Fortunately, a new study shows meditation could be one way to minimize those risks.
Building on their earlier work that suggested people who meditate have less age-related atrophy in the brain’s white matter, a new study by UCLA researchers found that meditation appeared to help preserve the brain’s gray matter, the tissue that contains neurons.
The scientists looked specifically at the association between age and gray matter. They compared 50 people who had meditated for years and 50 who hadn’t. People in both groups showed a loss of gray matter as they aged. But the researchers found that among those who meditated, the volume of gray matter did not decline as much as it did among those who didn’t.
The article appears in the current online edition of the journal Frontiers in Psychology.
Dr. Florian Kurth, a co-author of the study and postdoctoral fellow at the UCLA Brain Mapping Center, said the researchers were surprised by the magnitude of the difference.
“We expected rather small and distinct effects located in some of the regions that had previously been associated with meditating,” he said. “Instead, what we actually observed was a widespread effect of meditation that encompassed regions throughout the entire brain.”
As baby boomers have aged and the elderly population has grown, the incidence of age-related cognitive decline and dementia has increased substantially.
“In that light, it seems essential that longer life expectancies do not come at the cost of a reduced quality of life,” said Dr. Eileen Luders, first author and assistant professor of neurology at the David Geffen School of Medicine at UCLA. “While much research has focused on identifying factors that increase the risk of mental illness and neurodegenerative decline, relatively less attention has been turned to approaches aimed at enhancing cerebral health.”
Each group in the study was made up of 28 men and 22 women ranging in age from 24 to 77. Those who meditated had been doing so for four to 46 years, with an average of 20 years.
The participants’ brains were scanned using high-resolution magnetic resonance imaging. Although the researchers found a negative correlation between gray matter and age in both groups of people — suggesting a loss of brain tissue with increasing age — they also found that large parts of the gray matter in the brains of those who meditated seemed to be better preserved, Kurth said.
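To make the statistic concrete, here is a minimal sketch of the kind of age-versus-volume correlation the team describes, using invented numbers rather than the study’s data:

```python
import numpy as np

# Hypothetical ages (years) and gray matter volumes (arbitrary units).
# Invented for illustration; these are not the study's measurements.
ages = np.array([24, 31, 38, 45, 52, 60, 68, 77])
volumes = np.array([101.2, 99.8, 97.5, 96.1, 93.4, 91.0, 88.7, 85.9])

# Pearson's r near -1 indicates volume falling steadily with age;
# a shallower (less negative) trend in meditators would mirror the
# "better preserved" gray matter reported here.
r = np.corrcoef(ages, volumes)[0, 1]
print(f"r = {r:.2f}")
```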
The researchers cautioned that they cannot draw a direct, causal connection between meditation and preserving gray matter in the brain. Too many other factors may come into play, including lifestyle choices, personality traits, and genetic brain differences.
“Still, our results are promising,” Luders said. “Hopefully they will stimulate other studies exploring the potential of meditation to better preserve our aging brains and minds. Accumulating scientific evidence that meditation has brain-altering capabilities might ultimately allow for an effective translation from research to practice, not only in the framework of healthy aging but also pathological aging.”
The research was supported by the Brain Mapping Medical Research Organization, the Robson Family and Northstar Fund, the Brain Mapping Support Foundation, the Pierson‐Lovelace Foundation, the Ahmanson Foundation, the Tamkin Foundation, the William M. and Linda R. Dietel Philanthropic Fund at the Northern Piedmont Community Foundation, the Jennifer Jones‐Simon Foundation, the Capital Group Companies Foundation and an Australian Research Council fellowship (120100227). Nicolas Cherbuin of the Australian National University was also an author of the study.
—–

Stars you can only see in the Southern Hemisphere

Emily Bills for redOrbit.com – Your Universe Online

Take it from Danny and Sandy, stargazing is a must on summer nights. But it can also be worth it to brave the cold as you wait for a meteor shower with a cup of hot chocolate.

Stars represent the other galaxies, universes, and whatever else lies far beyond our night sky. And because of this, we will never get tired of looking for constellations.

But we in the Northern Hemisphere can’t see some of the beloved constellations viewed by our friends in the Southern Hemisphere. So, if you’re an avid stargazer, you may need to start planning a trip. Here’s a list we’ve compiled of Southern Hemisphere-specific constellations for your reading pleasure.


1. The Southern Cross (Crux)

For those who live south of the equator, it’s the Southern Cross, or Crux, that you look for to guide you whilst stargazing. Its four bright stars make up a cross formation (although some people say it looks more like a kite). Two of the four, Acrux and Becrux, are first-magnitude stars, meaning they are among the brightest in the sky. Along the eastern edge of Crux is the Coalsack Nebula, a dark cloud where stars are born. Just near the Coalsack is something called the Jewel Box, which holds a cluster of red, white, and blue stars. To see the Southern Cross, you have to be below about 25 degrees north latitude.
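A quick aside on what “first-magnitude” means: astronomers rank apparent brightness on a reversed logarithmic scale, in which a difference of five magnitudes corresponds to a factor of 100 in received flux:

$$ m_1 - m_2 = -2.5 \log_{10}\left(\frac{F_1}{F_2}\right) $$

So a first-magnitude star such as Acrux delivers roughly 100 times the flux of a sixth-magnitude star, which sits near the limit of naked-eye visibility.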

2. Carina (The Keel)

This constellation is known as the keel, the bottommost part of a ship. It makes a kind of “U” shape, and the circle at the end could be seen as the ship’s figurehead. It was originally part of Argo Navis, a huge constellation depicting a boat in the sky. The brightest star in Carina is Canopus, which is actually the second-brightest star in the sky after Sirius.

3. Centaurus (The Centaur)

Centaurus lies below Hydra and Scorpius and resembles a half-man, half-horse creature. Its brightest star, Alpha Centauri, is the third-brightest star in the sky and the closest star system to the sun. Centaurus is also home to the biggest and brightest globular cluster in our galaxy, Omega Centauri (NGC 5139), a spherical cluster of stars that is extremely dense in the middle. This cluster is visible to the naked eye as a faint smudge in the sky.

4. Sagittarius

While you can technically see this constellation in the Northern Hemisphere, it is best seen in the South. Like Centaurus, Sagittarius depicts a centaur, this one usually drawn as an archer with a bow. This constellation is very large and most of its stars are faint, so many people just recognize the brighter stars in the center as a little teapot. On very dark summer and fall nights, you may be able to see ‘steam’ rising out of the spout. This steam is actually the dense star clouds at the galactic center of our frothy Milky Way.


—–


Do matches actually burn off fart smells?

John Hopton for redOrbit.com – Your Universe Online

Let’s be honest, farting isn’t all bad. It can be (should be) a source of amusement, no matter what age you are. Or if you’re not in a humorous mood, you can use your partner’s farting as a source of righteous indignation and superiority. “How can you possibly be right about which mortgage to choose when you can’t even control your own sphincter?”

But on those occasions when we simply want the whole sorry incident to be over and the smell to be quickly banished, help is at hand. Happily, it is true that lighting a match is a quick and efficient way to get rid of fart smells. Assuming, of course, that you keep matches lying around for this or some other purpose – it would be a hell of a hefty release if it were still lingering after you had been to the store.

The smell in farts comes from hydrogen sulphide, a sulphur compound that can be burned away and broken down into other, less offensive products. Hydrogen sulphide ignites at around 250 degrees Celsius, which luckily is around the same as the ignition temperature of phosphorus, the key ingredient in a match head. Another stroke of good fortune is that when hydrogen sulphide burns, it breaks down into water vapor and sulphur dioxide, and the sulphur dioxide smells the same as a burnt match, leaving anyone else in the room believing that there is no trace of the bodily gas left at all.
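For the chemically inclined, the cleanup is ordinary combustion. Assuming the hydrogen sulphide burns completely, the reaction balances as:

$$ 2\,\mathrm{H_2S} + 3\,\mathrm{O_2} \rightarrow 2\,\mathrm{SO_2} + 2\,\mathrm{H_2O} $$

The sulphur dioxide on the right is the same compound a burning match head produces on its own, which is why the two smells blend so conveniently.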

What caused the offending item in the first place? Scientist Wendy Zukerman from ABC News Australia explains:

A fart (or flatus in the science terminology) is gas from our intestines. Gas gets into our intestine through several sources: when we swallow air, the chemical reactions in our guts (from when we break down food), and gas made from bacteria living in our intestine. How the gas entered our belly (and out of our bottom) will determine what type of chemicals our fart is made of, and how badly the fart smells. Flatus that came to our intestine from the air we swallow is mostly made of nitrogen (because the oxygen in air is absorbed by the body before it gets into the gut). Farts made from gas produced by bacteria are mostly made from hydrogen, methane and carbon dioxide.

Methane and hydrogen are flammable gases, and Wendy says that burning fart gas is just like burning gas on a stove – although presumably she does not advise using it to fry up bacon.

She also tells us that foods jam-packed with sulphur, like cauliflower, eggs and meat, are the prime suspects for a smelly fart, while beans, although notorious as fart-producers, don’t have a lot of sulphur in them, so they actually don’t really cause stinkiness.

It should be noted that there are two reasons why we would want to get rid of a fart. One of them is the fact that the smell is unpleasant, which the match takes care of, but there will also be times when we need to act quickly so that those around us never know there was a fart at all, for example on a date. A possible solution here, when your date wonders why you have suddenly struck a match, is to say “I hold a tiny vigil at this time each day for all the little baby seals killed by trawler fishing.”

Or you could confidently pull out the new phrase: “He who smelt it dealt with it.”

Or you could just stop farting so much, you animal.

—–


Humans are faithful lovers or promiscuous, study finds

Brett Smith for redOrbit.com – Your Universe Online

Is your ring finger longer than your index finger? If so, then you’re more likely to have a promiscuous love life, according to a new study.

Published in the journal Biology Letters, the new study also found that humans fall into one of two groups: promiscuous or faithful. This differs from most other sexually reproducing species, which are exclusively one or the other.

To reach their conclusion, the researchers compiled answers from almost 600 North American and British people on a standardized survey about social and sexual attitudes. They also looked at measurements of the right index finger compared to the right ring finger from over 1,300 British study volunteers. The team compared these fingers because the shorter the index finger is relative to the ring finger, the higher the levels of testosterone that person is likely to have been exposed to in the womb. Greater amounts of fetal testosterone, in turn, have been linked to greater sexual promiscuity in adulthood. While not predictive of any individual’s behavior, finger length can help identify groups of people who are more prone to promiscuity, the researchers said.
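As an illustration of the measurement involved (often called the 2D:4D digit ratio), here is a minimal sketch with hypothetical finger lengths; the interpretation in the comments is a simplification of the study’s group-level finding:

```python
# Minimal sketch of the 2D:4D digit ratio described above.
# Measurements are hypothetical and for illustration only.

def digit_ratio(index_mm: float, ring_mm: float) -> float:
    """Return index finger length divided by ring finger length."""
    return index_mm / ring_mm

# Hypothetical right-hand measurements in millimeters.
ratio = digit_ratio(index_mm=72.0, ring_mm=76.0)
print(f"2D:4D ratio: {ratio:.3f}")

# A ratio noticeably below 1.0 (ring finger longer) is the pattern the
# study links, on average, to higher prenatal testosterone exposure.
```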

The two groups

Survey responses showed that people fall firmly into two groups: those who are faithful and those who tend to be promiscuous. The researchers saw that this was true for both men and women, with slightly more men falling into the promiscuous than the faithful group.

“We need to collect more data to confirm the possibility that there are more men in the promiscuous group, as these results are still quite preliminary,” said study author Rafael Wlodarski, a psychology researcher at Oxford University in the United Kingdom.

The evaluation of the finger lengths likewise showed that men tended to separate into two groups. One group had a ring finger considerably longer than the index finger, indicating that they had been exposed to more fetal testosterone and may be more prone to seek multiple sexual partners. The other group had fingers of very similar length, a sign they are more prone to seek long-term relationships.

“This research suggests that there may be two distinct types of individuals within each sex, pursuing different mating strategies,” Wlodarski said. “We observed what appears to be a cluster of males and a cluster of females who are more inclined to ‘stay’, with a separate cluster of males and females being more inclined to ‘stray,’ when it comes to sexual relationships.”

“It is important to note that these differences are very subtle, and are only visible when we look at large groups of people: we cannot really predict who is going to be more or less faithful,” said study author Robin Dunbar, a professor of experimental psychology at Oxford. “Human behavior is influenced by many factors, such as the environment and life experience, and what happens in the womb might only have a very minor effect on something as complex as sexual relationships.”

—–


Jogging too much linked to higher mortality rate, study finds

Eric Hopton for redOrbit.com – Your Universe Online

For those of us who take our exercise with caution, this may come as something of a vindication. When it comes to physical activity, we at redOrbit think a riveting round of ping pong is enough exercise for the day.

Don’t get us wrong – keeping fit is a great thing and, in moderation, leads to better average longevity and general health. As this study from Denmark points out, people who were physically active had at least a 30% lower risk of death during follow-up compared with those who were inactive.

Too much of a good thing

The conclusions of the study were simple. There was a definite association between all-cause mortality and the “dose” of jogging undertaken. “Dose” was calibrated by pace, quantity, and frequency of jogging. Light and moderate joggers were found to have lower mortality than sedentary non-joggers. But, and here’s the catch, those strenuous super-joggers had a mortality rate “not statistically different from that of the sedentary group.”

The research was part of the Copenhagen City Heart Study and the results have been published in the Journal of the American College of Cardiology. The study has been following 1,098 healthy joggers and 3,950 healthy non-joggers since 2001.

It was the “moderate” joggers who came out best. The team behind the study found that, compared with sedentary non-joggers, 1 to 2.4 hours of jogging each week was associated with the lowest mortality rate and that the optimal frequency of jogging was 2 to 3 times per week. The optimal pace was slow – around 5 miles an hour. The joggers were divided into light, moderate, and strenuous joggers. The lowest “Hazard Ratio” (HR) for mortality was found in light joggers, followed by moderate joggers, then strenuous joggers. The participants who ran for more than four hours a week or did no exercise at all had the highest death rates.
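For readers unfamiliar with the metric, a hazard ratio compares the instantaneous risk of death in one group against a reference group. Writing each group’s hazard function as h(t), the figure reported for, say, light joggers versus sedentary non-joggers is:

$$ \mathrm{HR} = \frac{h_{\text{light}}(t)}{h_{\text{sedentary}}(t)} $$

An HR below 1 means lower mortality risk than the reference group, while an HR statistically indistinguishable from 1, as with the strenuous joggers here, means no detectable benefit.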

So, if the findings of this work are replicated elsewhere, it may be that there is, after all, an upper limit for safe exercise. The study concluded that “when prescribing exercise to improve longevity, strenuous exercise is not necessary, and might reduce the health benefits of light to moderate physical activity.” The scientists stress that further studies are needed “to explore the mechanisms by which excessively strenuous exercise adversely affects longevity before the pattern of association between exercise intensity and long-term mortality can be incorporated into physical activity recommendations for the general public.”

While the mechanism behind this fall-off of benefit from more strenuous exercise is not yet understood, the authors suspect that changes to the heart and arteries during the stress of exertion may be at least part of the answer.

It looks like the widely accepted recommendation that we all have 150 minutes of moderate-intensity activity a week is spot on. It’s not exactly a hare and tortoise situation, but it might just make the brisk walker or the plodding jogger feel better when the next racing runner flies past them.

—–


Insulin-decreasing hormone discovered in flies, humans

Chuck Bednar for redOrbit.com – Your Universe Online

Researchers from the Stanford University School of Medicine have for the first time discovered evidence of the existence of limostatin, a long-suspected hormone that decreases insulin in both flies and humans, according to research published Tuesday in the journal Cell Metabolism.

That hormone, named for Limos, the Greek goddess of starvation, lessens insulin levels during recovery from fasting or starvation, according to the study authors. By doing so, it ensures that essential nutrients remain in the bloodstream long enough to rebuild injured tissues, instead of being quickly ferried into fat cells where they are far less accessible.

Senior author Dr. Seung Kim, a professor of developmental biology at Stanford, lead author and graduate student Ronald Alfa, and their colleagues initially discovered limostatin in fruit flies. Shortly thereafter, however, they found a protein in humans which performed a similar function.

“Starvation or famine is an ancient, ever-present specter faced by all living organisms,” Dr. Kim explained. “The ways to deal with it metabolically are likely to be ancient and conserved. This research clearly connects the dots between flies and humans, and identifies a new potential way to regulate insulin output in humans.”

Specifically, a family whose members possess an inherited mutation in the human version of the hormone exhibited many of the same physiological characteristics as flies that had been modified to be unable to produce limostatin. Among those traits were elevated levels of circulating insulin, low levels of blood sugar and a tendency towards early onset obesity, the authors explained.

Limostatin

Limostatin is part of a class of hormones known as decretins, which scientists have speculated about for the past 150 years. In 1932, a class of hormones known as incretins (which are expressed in the gut following a meal and stimulate the secretion of insulin) was identified for the first time. Later metabolic studies suggested that a counterpart hormone might suppress insulin production during times of famine or starvation.

Furthermore, the discovery of the decretin limostatin helps validate the use of fruit flies as a model to study diabetes in humans, Dr. Kim added. It could also help explain a phenomenon that sometimes occurs following bariatric surgery, in which the removal of a portion of the stomach to enable weight loss rapidly reverses signs of diabetes before any weight loss is observed.

“Your body has multiple checks and balances to control insulin production,” explained Dr. Kim. “We believe it’s likely that the decretin pathway acts as a sensing and regulatory method in many other situations in addition to starvation. Perhaps this is one possible explanation for the effect of bariatric surgery.”

The hormone was identified by virtue of its response to fasting. The researchers withheld food from laboratory fruit flies for 24-28 hours, then checked to see which types of genes were highly expressed during this period. They then narrowed down the list to those that encoded proteins resembling hormones, and found one that caused characteristics of insulin deficiency when overexpressed in flies.

“This work has critical ramifications for our understanding of metabolism, and has the potential to transform our approach to treating diseases like diabetes,” said Dr. Domenico Accili, director of the Columbia University Diabetes and Endocrinology Research Center, who was not involved in the research.

“The discovery of limostatin, a new hormone that can act to decrease insulin release, is an important advance,” he added. “The notion that mammals express a related family of intestinal hormones that can affect insulin secretion may inform new efforts to find drugs that combat diabetes in humans.”

—–


Good emotions lower inflammation, study says

Brett Smith for redOrbit.com – Your Universe Online
Taking in culture or having a religious experience can do more than just make you feel good; it may actually make you physically healthier.
According to a new study published in the journal Emotion, these experiences and the emotions associated with them are linked to lower levels of pro-inflammatory cytokines, signals that make the immune system work harder.
“Our findings demonstrate that positive emotions are associated with the markers of good health,” said study author Jennifer Stellar, a postdoctoral researcher at the University of Toronto.
Cytokines serve an essential function in the immune system, rounding up cells to fight infection, disease, and trauma. However, persistently high levels of cytokines are linked to poorer health and, more specifically, to disorders such as type 2 diabetes, arthritis, and clinical depression.
It’s been well-established that eating well and getting enough sleep lead to healthy levels of cytokines, but the new study is one of the first to look at how emotions are linked to these inflammation-causing proteins.
In two independent trials, over 200 young adult volunteers were asked whether they had experienced positive emotions like amusement, awe, empathy, contentment, pleasure, love, and pride that day. Next, gum and cheek tissue specimens were taken from the participants. The researchers saw that those who felt more of these positive emotions, especially awe, marvel, and amazement, had the lowest levels of the pro-inflammatory cytokine interleukin-6.
Along with autoimmune diseases, raised cytokines have been linked with depression, with a recent study showing that depressed volunteers had greater amounts of the cytokine TNF-alpha than non-depressed participants. Researchers have theorized that cytokines can obstruct more ‘positive’ hormones and neurotransmitters, such as serotonin and dopamine.
“That awe, wonder, and beauty promote healthier levels of cytokines suggests that the things we do to experience these emotions – a walk in nature, losing oneself in music, beholding art – has a direct influence upon health and life expectancy,” said study author Dacher Keltner, a UC Berkeley psychologist.
Stellar added that the study findings could indicate that, “awe is associated with curiosity and a desire to explore” as opposed to behavioral responses found in those with inflammation, “where individuals typically withdraw from others in their environment.”
The researchers noted that the study did not find a cause-and-effect relationship, meaning the team could not conclude that positive emotions lead to lower cytokine levels.
“It is possible that having lower cytokines makes people feel more positive emotions, or that the relationship is bidirectional,” Stellar said.
A paper published back in September revealed that overweight individuals have higher levels of stress-induced inflammation than those within a healthy weight-range. The study was based on participants’ levels of Interleukin-6.
“We’ve known that overweight and obese individuals already have chronic, low grade inflammation,” said study author Nicolas Rohleder, a psychology professor from Brandeis University. “Now, it seems that when you add stress to the mix, it’s a double hit.”
—–

Manatees flock to spring (prepare for world domination?)

Chuck Bednar for redOrbit.com – Your Universe Online

A swarm of more than 300 manatees forced officials at the US Fish and Wildlife Service to temporarily close Three Sisters Springs in Florida, according to various media reports.

Manatees, also known as sea cows, are large marine mammals that are primarily herbivorous. They can be up to 13 feet (4 meters) in length and weigh up to 1,300 pounds (590 kilograms). We’ll refrain from any ‘Your Mom’ jokes.

An average of 65 manatees typically enter the Citrus County springs during cold weather.

However, according to WTSP News in Tampa, nearly five times that many moved into the interior of the springs sometime around noon on Monday. Volunteers on Tuesday counted a total of 293 manatees there as of 1:00 pm ET, and over 300 were there by 1:30 pm ET.

As a result, the USFWS declared that Three Sisters Springs would be temporarily closed, and the agency said that it would continue to monitor the situation. Officials reopened the area Tuesday, but said that it would close again if rising tides caused another mass influx of manatees, according to USA Today.

Laura Ruettiman, an environmental education guide at the Springs, told the newspaper that the manatees frequent the area during high tide and cold weather conditions. She noted that there had been “a record number” of manatees at the Springs this year, and that the increasing number may be due to greater protection in the area and habitat loss elsewhere in the state.

Last month, the manatees at Three Sisters Springs made headlines for a different reason, as regulators moved to place limits on human interactions with the endangered creatures, according to Reuters. Advocates had asked the USFWS to approve bans on canoes and paddle boards in the area, as well as the creation of human-free zones designed to protect the sea cows.

Taking over hu-manatee?

“It’s kind of a madhouse,” explained Kimberly Sykes, assistant manager of the Crystal River National Wildlife Refuge, which includes Three Sisters Springs. “People are just bumping into manatees, because they can’t see them.”

“Overcrowding, both human and animal, has become hard to ignore at Three Sisters Springs. The 1.5 acre (6,000 square meter) waters are drawing record numbers of manatees seeking to warm up in waters that are heated by springs and are constantly 72 degrees Fahrenheit,” Reuters added.

In fact, over 125,000 people went there to swim with the manatees in 2013, the news agency said, and on some days upwards of 100 tourists per hour are there swimming with the manatees. With the number of people and manatees both on the rise, Manatee EcoTourism Association of Citrus County president Michael Birns said that space is getting cramped.

“We’ve got more people. We’ve got more manatees. What we don’t have is more space,” Birns said. Refuge manager Andrew Gude added that manatees are “very unique as a mammal in that they are so tolerant of people in this area,” and said that he has not seen them come to any harm. That has not stopped the calls for limits on human-manatee interaction.

“Under other Fish and Wildlife Service protections being discussed, only visitors with disabilities would be allowed to take kayaks, canoes and paddle boards into the springs. It is not known how quickly the new restrictions may win approval,” Reuters said. “Some conservationists would like to go even further and see the springs closed all winter.”

—–


Engineers create tri-layered artificial blood vessel

Chuck Bednar for redOrbit.com – Your Universe Online

Researchers at the Shanghai University Rapid Manufacturing Engineering Center have for the first time developed an artificial blood vessel composed of three layers, according to a new study currently appearing in the American Institute of Physics journal AIP Advances.

The developers used a combination of micro-imprinting and electro-spinning techniques to make the tri-layered vascular graft. Their approach could allow scientists to use separate materials that respectively provide mechanical strength and promote new cell growth, something that has been a significant challenge for existing artificial blood vessels comprised of a single or double layer.

Vascular grafts are usually attached to an obstructed or unhealthy blood vessel by surgeons, and they allow blood flow to be permanently redirected around those damaged tissues. They work by repurposing existing vessels from a patient’s own body or from a matched donor.

However, such sources are often insufficient due to the limited supply in the individual’s body, and repurposed vessels can be affected by the same underlying conditions that necessitated the graft in the first place. As a result, there has been tremendous research into the development of synthetic blood vessels capable of mimicking natural ones.

“The composite vascular grafts could be better candidates for blood vessel repair,” Yuanyuan Liu, an associate professor at the Rapid Manufacturing Engineering Center, said in a statement on Monday. Liu’s team had previously worked with bone scaffolds, which are used to repair bone defects, prior to turning their focus to treating cardiovascular disease.

Typically, things like bone scaffolds and artificial blood vessels need to closely mimic the natural vasculature of the targeted tissue. For vascular grafts, this can be achieved by electrospinning, a process that uses an electrical charge to draw liquid inputs (in this case, a mixture of chitosan and polyvinyl alcohol) into extremely fine fibers.

The electrospinning process also yields nanofibers with a high surface-to-volume ratio, which provides plenty of space for host cells to grow and connect, the study authors explained. These components naturally decompose over a period of six months to one year, and once they are gone, a new, intact, naturally grown blood vessel remains in their place.

Three layers are better than two

However, the structure is not very rigid, and in order to solve this problem, Liu and her colleagues developed the three-layer model. In this model, the mixture was electrospun onto both sides of a microimprinted middle layer of poly-p-dioxanone, a biodegradable polymer commonly used in biomedical applications. The ends of the sheet were then folded and attached, producing a tube-like vessel that more accurately simulates a real blood vessel.

The tri-layered scaffold was then seeded using rat fibroblast cells, which are said to be ideal for such purposes due to their ease of cultivation and quick growth rate, in order to test the efficacy of the scaffold in promoting the expansion and integration of cells.

They found that the cells on these scaffolds proliferated quickly, likely due to the functional amino and hydroxyl groups introduced by the chitosan. While human trials are still far off, the scientists said that they are optimistic about their work, adding that they next plan to test the implants in an animal model to observe their efficacy with live vascular cells.

—–


FCC Chairman to propose making broadband a public utility

Chuck Bednar for redOrbit.com – Your Universe Online

US Federal Communications Commission chairman Tom Wheeler has revealed that he will be distributing proposed new net neutrality guidelines to members of the agency this week.

In a column the chairman penned for Wired on Wednesday, Wheeler explained that the rules would be designed “to preserve the internet as an open platform for innovation and free expression” and were “rooted in long-standing regulatory principles, marketplace experience, and public input received over the last several months.”

According to Engadget, Wheeler’s plan would turn broadband Internet access into a public utility, like telephone service, by invoking the FCC’s Title II authority. The rules would prohibit ISPs from charging for faster access to broadband networks, and would also ban the throttling of lawful content and services. The rules will likely also apply to mobile broadband.

“Broadband network operators have an understandable motivation to manage their network to maximize their business interests. But their actions may not always be optimal for network users,” the chairman wrote. “The Congress gave the FCC broad authority to update its rules to reflect changes in technology and marketplace behavior in a way that protects consumers. Over the years, the Commission has used this authority to the public’s great benefit.”

Wheeler explained that the internet would likely be quite different had the commission not mandated open access for network equipment in the 1960s. Prior to that, AT&T prohibited users from attaching non-AT&T equipment to the network, he said. Modems that helped make the modern internet possible and allowed companies like AOL to grow during the early days of the computer were only possible because of the FCC’s open-network policy.

Wheeler also relayed a story about a company he was involved with in the mid-1980s, NABU: The Home Computer Network. He said that the firm was using new technology to transmit high speed data to computers over cable TV lines, but failed because it was forced to rely on cable operators to give them access to their networks. AOL, on the other hand, succeeded (despite the fact that its speeds were slower) because it was using open telephone networks.

“The phone network’s openness did not happen by accident, but by FCC rule,” he wrote, adding that providing “that kind of openness for America’s broadband networks” has been at the heart of the net neutrality debate in recent months. While he was initially planning to accomplish this through a portion of the Telecommunications Act of 1996 focused on the principle of “commercial reasonableness,” he ultimately changed his mind because he felt doing so would benefit companies more than consumers.

Title II, hear me roar

By using Title II authority, Wheeler said that he is “submitting to my colleagues the strongest open internet protections ever proposed by the FCC… Under that authority my proposal includes a general conduct rule that can be used to stop new and novel threats to the internet. This means the action we take will be strong enough and flexible enough not only to deal with the realities of today, but also to establish ground rules for the as yet unimagined.”

“The internet must be fast, fair and open,” he concluded. “That is the message I’ve heard from consumers and innovators across this nation. That is the principle that has enabled the internet to become an unprecedented platform for innovation and human expression. And that is the lesson I learned heading a tech startup at the dawn of the internet age. The proposal I present to the commission will ensure the internet remains open, now and in the future, for all Americans.”

—–


Astronomers discover new features in spiral galaxy

Chuck Bednar for redOrbit.com – Your Universe Online

Astronomers from Case Western Reserve University have discovered new features in the first known spiral galaxy, a system that has been studied for more than 150 years, according to new research published in the latest edition of the Astrophysical Journal Letters.

The researchers reported finding faint plumes extending from the northeast and south of M51a, a nearby spiral galaxy also known as the “Whirlpool Galaxy.” Analysis of a new image made using a 20-hour exposure revealed new details of the nearly 120,000-light-year-long linear northwest plume, as well as the absence of stars in one part of the southeast tail, they explained.

“These features can be used in future modeling to understand the history of M51, when it and its companion galaxy first started to interact,” said Aaron Watkins, a PhD student in the university’s department of astronomy and lead author of the study. Working along with Watkins on the study were Case Western astronomy professor Chris Mihos and observatory manager Paul Harding.

M51a was the first galaxy recognized as a spiral, having been originally identified and sketched by William Parsons, the Earl of Rosse, in 1845. It and M51b, its smaller companion galaxy, are both located approximately 31 million light years away in the constellation Canes Venatici.

“No professional astronomer we know of has ever taken such a deep image of this galaxy,” said Watkins. The images used in the new study were captured using Case Western’s Burrell Schmidt telescope at the Kitt Peak National Observatory near Tucson in 2010 and 2012, the authors said.

Watkins and his colleagues aimed their telescope at M51 on moonless evenings in February, March, and April of those years, capturing the galaxy’s light in a series of 20-minute exposures and recalibrating the equipment between exposures. For a total of 10 hours, the light was filtered to reveal younger stars, and for another 10 hours, it was filtered to reveal older stars.

Those images were then combined to create the final picture.
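To give a flavor of that combination step, here is a generic co-adding sketch; the team’s actual pipeline is not described in the article, so the frames below are simulated:

```python
import numpy as np

# Simulated stand-ins for calibrated 20-minute exposures (2D arrays).
rng = np.random.default_rng(0)
frames = [rng.poisson(lam=5.0, size=(256, 256)).astype(float)
          for _ in range(30)]

# Median stacking suppresses cosmic rays and other one-frame artifacts,
# while faint persistent structure (like a tidal plume) survives the cut.
deep_image = np.median(np.stack(frames), axis=0)
print(deep_image.shape)
```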

The northwestern plume of M51a was first observed in the 1970s, but the technology of the era restricted the amount of detail that could be detected. Astronomers found that it was made up primarily of older, redder stars and had only small patches of gas. Based on the age of the stars and the length of the plume, it was likely formed by an interaction between an outer disk of M51 and another galaxy 200 million years ago or more.

On the other hand, the southern plume has no morphological similarities with the surrounding parts of M51, and is completely devoid of gas. The plume has comparatively few stars, and as a result, little mass and total light. The researchers suggest that the plume may be the remains of a third satellite or body in the M51 system.

Watkins said that the northeast plume has about the same total light as the southern one, and while it may be an extension of the north side of the galaxy, that cannot be confirmed at this time. Previous research suggested that the southeastern gas tail had been stretched out during an interaction with another galaxy, but this new analysis found no stars in it.

While that is an unusual trait for a tail such as this, the study authors said that it can serve as a test for future interaction models. They are now investigating different ways to study the galaxy, with the goal of gathering more detail from the faint plumes. Watkins believes that the northwest plume is bright enough to be studied further using the Hubble space telescope.

—–


Hospital report cards doing little to improve surgery quality (Great.)

Chuck Bednar for redOrbit.com – Your Universe Online

A program designed to give hospitals report cards on the quality of their overall performance is doing little to improve the quality of surgery at medical centers, according to research published online Tuesday in the Journal of the American Medical Association (JAMA).

The report cards, which are part of the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), are designed to let hospitals know which areas of their day-to-day operations need to be improved. However, in the new study, experts from the University of Michigan said that the reports have done little to make things better for patients.

The study found no difference in surgical safety among 263 hospitals participating in the ACS-NSQIP program and 526 similar hospitals that were not involved. Data from over 1.2 million seniors who were enrolled in Medicare and who had one of 11 major operations at those centers over the past decade were analyzed during the course of the study.

Even as adults, we apparently still don’t like report cards

Preparation of those report cards involves careful analysis of every operation conducted at the hospitals. That information is sent to a secure central database, where it is reviewed by the ACS and shared with medical centers and doctors so they can see how their performance compares to other hospitals and surgeons.

However, the authors of the new study found that this quality reporting alone has not been adequate to speed up the pace of improvement in surgical safety, nor has it helped lower the cost of such procedures. While they insist they are not suggesting that the report card initiative be halted, they believe the data needs to be put to better use.

“Although ACS-NSQIP hospitals are improving over time, so are other non-participating hospitals,” said lead author Dr. Nicholas Osborne, a vascular surgeon at the UM Health System’s Frankel Cardiovascular Center. “Our study suggests that the ACS-NSQIP is a good start, but that reporting data back to hospitals is not enough. The ‘drilling down’ that is needed to improve quality using these reports is better suited for regional collaboratives.”

Dr. Osborne explained that this analysis is the first to use a control group of hospitals to gauge the impact of participation in the ACS-NSQIP program, and that each participating center was matched with two control facilities. Patients treated at both types of hospitals were generally similar, though the ACS-NSQIP hospitals were larger and performed more operations.

Eleven different types of operations were analyzed: esophagectomy, pancreatic resection, colon resection, gastrectomy, liver resection, ventral hernia repair, cholecystectomy, appendectomy, abdominal aortic aneurysm repair, lower extremity bypass, and carotid endarterectomy.

Ok, but…why isn’t this system working??

The lack of benefit from participating in the report card program could be due to several different factors, the researchers said. The hospitals may not have used the information to improve care; they may lack the infrastructure needed to develop effective improvement strategies; or their efforts may not have had an impact on the four items evaluated by the study.

“Knowing where you perform poorly is the important first step, but the next leap from measuring outcomes to improving outcomes is much more difficult,” Dr. Osborne said.

“Better approaches for engaging surgeons, better systems for supporting them in change efforts, and better tools for helping them re-engineer care are clearly needed,” added senior author Justin Dimick. “Future national and regional quality improvement initiatives must be aimed at not only providing feedback to participants, but also providing an infrastructure for implementing change.”

—–


The bubonic plague strikes Madagascar

Brett Smith for redOrbit.com – Your Universe Online

It might seem ridiculous to worry about the bubonic plague in 2015, but that’s just what the World Health Organization is doing after 119 cases of the disease that ravaged Medieval Europe were identified in Madagascar last year.

“The outbreak that started last November has some disturbing dimensions,” the WHO said in a statement this week. “The fleas that transmit this ancient disease from rats to humans have developed resistance to the first-line insecticide.”

Bubonic plague is one thing; pneumonic plague is another

Black Death, or simply the plague, is caused by the bacterium Yersinia pestis, which is spread by rodent-loving fleas. When a human is bitten by an infected flea, they can develop bubonic plague. Symptoms include swollen, painful lymph nodes, fever and, in extreme cases, skin discoloration.

If the infection progresses to the point where bacteria invade the lungs, the disease then becomes known as the pneumonic plague and can be spread by coughing and inhalation.

“If diagnosed early, bubonic plague can be successfully treated with antibiotics,” the WHO statement said. “Pneumonic plague, on the other hand, is one of the most deadly infectious diseases; patients can die 24 hours after infection.”

In Madagascar, 40 people died from the plague last year and the disease was found spreading within the slums of the country’s capital, Antananarivo. Experts noted that recent flooding in the country has displaced thousands of people, and along with them disease-carrying rodents.

“Measures for the control and prevention of plague are being thoroughly implemented in the affected districts,” the WHO said. “Personal protective equipment, insecticides, spray materials, and antibiotics have been made available in those areas.”

The organization does not currently recommend any trade or travel ban based on the current status of the outbreak.

Leave the furry friends in the forest

Typically, for a person to survive the plague, it needs to be diagnosed and treated in its earliest stages. In 2012, a 7-year-old Colorado girl came down with the disease and survived after a two-week battle.

The girl is thought to have contracted the disease when she asked to bury a squirrel carcass she had found. Fleas most likely jumped off the dead squirrel and onto the girl. It was the first confirmed case of the plague in Colorado in over six years.

In response to the girl’s illness, Colorado health officials posted notices at the campground about avoiding animals and using bug spray. The notices also made it clear that a background level of plague in wild animals is common in Colorado. At the time, health officials said they had identified only two other cases of the plague across the US, which is typical.

That level of death is a far cry from the impact the plague had across Europe during the Middle Ages. From about 1340 to 1400, the plague took the lives of millions of Europeans; between 1347 and 1351 alone, it killed about one-third of the continent’s population. During this time, many other parts of the world were far less affected and saw robust population growth.

—–


Meet penta-graphene: The newest structural variant of carbon

Chuck Bednar for redOrbit.com – Your Universe Online

Drawing inspiration from a pentagonal pattern of tiles found on the streets of Cairo, researchers from the US, China and Japan have developed a new structural variant of carbon that appears to be dynamically, thermally and mechanically stable.

The newly discovered material is known as penta-graphene, and it is a very thin, single-layer sheet of carbon pentagons. Unlike most forms of carbon, which are comprised of hexagonal building blocks occasionally interspersed with pentagons, penta-graphene would be a unique two-dimensional carbon allotrope made up entirely of five-sided polygons.

“The three last important forms of carbon that have been discovered were fullerene, the nanotube and graphene,” said Dr. Puru Jena, distinguished professor in the Department of Physics at the Virginia Commonwealth University College of Humanities and Sciences and senior author of a new paper appearing in the journal Proceedings of the National Academy of Sciences. “Each one of them has unique structure. Penta-graphene will belong in that category.”

It started with a dinner in China

The research that led to the material was launched at VCU and Peking University, and inspired by one professor’s trip to a restaurant in Beijing. While dining with her husband, Dr. Qian Wang, a professor at Peking University and an adjunct professor at VCU, noticed a piece of artwork that depicted pentagon tiles from the streets of Cairo hanging on the wall.

“I told my husband, ‘Come, see! This is a pattern composed only of pentagons,’” Dr. Wang explained. “I took a picture and sent it to one of my students, and said, ‘I think we can make this. It might be stable. But you must check it carefully.’ He did, and it turned out that this structure is so beautiful yet also very simple.”

By using computer simulations to model the synthesis of penta-graphene, the researchers found that the material could potentially outperform graphene in some applications. It would be quite strong, mechanically stable and able to withstand temperatures of up to 1,000 kelvin. It is also a semiconductor, while graphene is a conductor of electricity, they noted.

“You know the saying, diamonds are forever? That’s because it takes a lot of energy to convert diamond back into graphite. This will be similar,” Jena said. “When you take graphene and roll it up, you make what is called a carbon nanotube which can be metallic or semiconducting. Penta-graphene, when you roll it up, will also make a nanotube, but it is always semiconducting.”

It also stretches in an unusual way, Wang said: “If you stretch graphene, it will expand along the direction it is stretched, but contract along the perpendicular direction. However, if you stretch penta-graphene, it will expand in both directions.”

Penta-graphene derives its mechanical strength from a rare property, a negative Poisson’s ratio, which could allow for unique technological applications, the researchers said. This property suggests that the material could be used in electronics, biomedicine, nanotechnology and more. First, however, the carbon variant must actually be synthesized.
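For reference, Poisson’s ratio relates the transverse strain to the axial strain when a material is stretched:

$$ \nu = -\frac{\varepsilon_{\mathrm{trans}}}{\varepsilon_{\mathrm{axial}}} $$

Most materials have a positive ratio (they thin out sideways when pulled), so a negative ratio means the sheet widens in both directions under tension, exactly the stretching behavior Wang describes above.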

“Once you make it, it [will be] very stable,” Jena said. “So the question becomes, how do you make it? In this paper, we have some ideas. Right now, the project is theoretical. It’s based on computer modeling, but we believe in this prediction quite strongly. And once you make it, it will open up an entirely new branch of carbon science.”

—–


Sensor suit could give humans a magnetic sixth-sense

Chuck Bednar for redOrbit.com – Your Universe Online

Engineers from Germany and Japan have reportedly developed a new type of sensor that has led to the creation of an electronic suit capable of giving humans a magnetic sixth sense.

The research, which was carried out at the Leibniz Institute for Solid State and Materials Research (IFW Dresden) and the TU Chemnitz in collaboration with colleagues at the University of Tokyo and Osaka University in Japan, led to the development of a magnetic sensor that is thin and pliable enough to be adapted to human skin, even the most flexible part of the palm.

Unlike many forms of bacteria, insects and some vertebrates, humans lack magnetoception (the innate ability to detect magnetic fields for orientation and navigation purposes). Using the new sensor, however, the engineers developed an electronic skin with a magneto-sensory system that gives the wearer the ability to perceive the presence of static or dynamic magnetic fields using a so-called “sixth sense.”

As the researchers explain in the journal Nature Communications, the suit was created using magnetoresistive sensor foils that are highly sensitive, flexible and durable, while being less than two micrometers thick and weighing just three grams per square meter.

The sensors can also withstand extreme bending, with radii of less than three micrometers, and survive crumpling like a piece of paper without sacrificing performance. They can also be stretched to over 270 percent of their original length on elastic supports like a rubber band, lasting for more than 1,000 cycles without fatigue, the authors added.

“We have demonstrated an on-skin touch-less human-machine interaction platform, motion and displacement sensorics applicable for soft robots or functional medical implants as well as magnetic functionalities for electronics on the skin,” said study author and PhD student Michael Melzer.

“These ultrathin magnetic sensors with extraordinary mechanical robustness are ideally suited to be wearable, yet unobtrusive and imperceptible for orientation and manipulation aids,” added Professor Oliver G. Schmidt, director of the Institute for Integrative Nanosciences at the IFW Dresden.

Currently, the sensors do not provide tactile feedback to the person wearing them, according to CNET. Instead, they are connected to an array of LEDs that light up when the wearer approaches a magnetic field. While this could make them somewhat inconvenient for day-to-day human use, it could be ideal for applications in the field of robotics, the website added.
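To make that feedback loop concrete, here is a toy sketch; the sensor and LED interfaces are hypothetical stand-ins, since the actual readout electronics are not described in this article:

```python
import random
import time

FIELD_THRESHOLD_MT = 1.0  # hypothetical trigger level, in millitesla

def read_sensor_mt() -> float:
    """Stand-in for sampling one magnetoresistive foil; returns field in mT.
    Simulated here, because the real hardware interface is an assumption."""
    return random.uniform(0.0, 2.0)

def set_led(on: bool) -> None:
    """Stand-in for driving one indicator LED in the array."""
    print("LED on" if on else "LED off")

# Poll the sensor and light the LED whenever the local field exceeds the
# threshold, mirroring the touchless indication described above.
for _ in range(10):
    set_led(read_sensor_mt() > FIELD_THRESHOLD_MT)
    time.sleep(0.05)
```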

The study authors said that their sensors enable “detection, navigation and touchless control,” and could be used for “soft robotics, safety and healthcare monitoring, consumer electronics and electronic skin devices.” They also said that they hope that their work will “inspire a diverse number of devices that will benefit from a ‘sixth sense’ magnetoception.”

“The integration of magnetoelectronics with ultrathin functional elements such as solar cells, light-emitting diodes, transistors, as well as temperature and tactile sensor arrays, will enable autonomous and versatile smart systems with a multitude of sensing and actuation features,” the team of engineers conclude in their study, which was published online on January 21.

—–


NASA most likely going to Europa, 2016 budget suggests

Chuck Bednar for redOrbit.com – Your Universe Online

The Obama administration is proposing an $18.5 billion budget for NASA in 2016, an increase of about $500 million in funding for the US space agency over this year.

NASA unveiled its 2016 fiscal year budget estimates during a Monday briefing, and according to the Washington Post, many key projects will continue to receive full funding – including the in-development Orion capsule and Space Launch System rocket, the James Webb Space Telescope (which is scheduled to launch in 2018) and the Commercial Crew Program.

The Commercial Crew Program, in which NASA awarded contracts to SpaceX and Boeing for the development of domestically built spacecraft that will transport American astronauts to the International Space Station, will receive $1.2 billion – a $400 million increase from fiscal year 2015, according to various media reports.

The budget also indicates that the agency plans to move forward with its controversial Asteroid Redirect Mission (ARM), in which a robotic satellite will capture and carry a small asteroid into orbit around the moon. Provided those efforts are successful, NASA astronauts would then travel to the asteroid by 2020 as a step towards the ultimate goal of reaching Mars.

We’re going to Europa (?)

Interestingly enough, Popular Science reports the budget also reveals a trip to Jupiter’s moon Europa is pretty much a done deal. The mission is listed among the $5.288 billion of the budget allocated to science research, and involves $1.361 billion for an unmanned mission to the moon to see if the vast ocean buried beneath its icy crust is home to organic lifeforms.

Casey Dreier, a blogger at The Planetary Society, called the commitment to a Europa mission “the most exciting feature of the president’s 2016 budget request for NASA,” and Sara Susca, payload systems engineer with the Europa Clipper mission, told UPI that her team was “really looking forward to next spring when, hopefully, we’ll become another flagship mission.”

The Europa Clipper concept, Discovery News explained, is something that has been in the works at NASA for quite a while. It will consist of a spacecraft that will orbit Jupiter and make an estimated 45 flybys of the moon’s surface over a period of three years – similar to the way that the Cassini spacecraft in orbit around Saturn has carried out flybys of its moon Titan.

The sub-surface ocean on Europa holds three times as much H2O as the oceans here on Earth, and astrobiologists believe that it might hold life within its 62-mile (100 kilometer) deep waters. It is comparable to the Mariana Trench, the deepest region of the Pacific Ocean, where complex biology has managed to evolve in depths of 6.8 miles (11 kilometers), the website said.

Understanding habitability

While the scientists report that Europa has the liquid water, heat source and possible nutrient cycling capable of supporting life, their mission is not designed specifically to find it. Rather, as Kevin Hand, JPL’s Deputy Chief Scientist for Solar System Exploration pointed out in a media briefing on Monday, their goal is “to understand habitability; the ingredients for life.”

A surface mission would be required to actually look for life on the Jovian moon, he added, and such an endeavor is currently beyond NASA’s technological capability. The Europa Clipper spacecraft could be ready to launch within the next decade, Discovery News said, and once it is ready, NASA’s Space Launch System (SLS) could carry it to the moon in less than three years.

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

UK legislators vote to legalize three-person babies

Chuck Bednar for redOrbit.com – Your Universe Online

Lawmakers in the UK have overwhelmingly voted in favor of a law that would allow for the creation of genetically-engineered babies containing the DNA of three people (two women and one man), various media outlets reported on Tuesday.

According to BBC News, the 382-128 vote in the House of Commons brings the UK one step closer to becoming the first country in the world to allow the creation of fetuses that contain DNA from three different people – an attempt to stop the passage of genetic diseases.

As Engadget explained, the technique is a twist on traditional in vitro fertilization methods, and is meant to prevent would-be mothers suffering from mitochondrial disease from passing the condition on to their children. Mitochondrial disease causes these so-called cellular power plants to stop functioning properly, leading to a variety of muscular, neurological and cardiovascular disorders.

Because the disease attacks the body’s own building blocks, there is no actual cure, the website added. Since mitochondrial DNA is passed on to a child exclusively from the mother, however, transferring healthy nuclear DNA from the would-be mother into the egg of a donor could help alleviate the risk that the baby will inherit the condition.

“When all is said and done, only a fraction of a fraction of the resulting baby’s DNA will come from that second woman (think on order of 0.1 percent),” Engadget said. “There’s no risk of the kid looking like a veritable stranger either, since the nuclear DNA from the primary mother (and not the mitochondrial DNA from the donor) is what helps define a child’s appearance.”

Good news for progressive medicine

The issue still needs to go before the House of Lords, and if the other branch of the legislature votes likewise, the first baby conceived from three parents could be born as early as next year. Supporters called the vote “good news for progressive medicine,” while critics argue the technique “raises too many ethical and safety concerns,” according to BBC News.

Approximately 40 scientists from 14 countries have urged the British legislature to approve laws allowing mitochondrial DNA transfer, The Guardian said. An estimated 100 children annually are affected by genetic defects in the mitochondria, and in 10 percent of those cases, the defects can result in blindness, brain damage, liver failure and other severe illnesses.

In this technique, two eggs are removed (one from the mother and another from a donor), the newspaper explained. The nucleus of the donor egg is removed, leaving the rest of the contents (including the mitochondria). That nucleus is then replaced with the one from the mother’s egg, and the process can be done either before or after the egg is fertilized with sperm.

“This is good news for progressive medicine,” said Alison Murdoch, head of Newcastle Fertility Centre at Life, where the IVF technique was pioneered. “In a challenging moral field, it has taken scientific advances into the clinic to meet a great clinical need and Britain has showed the world how it should be done.”

“Families who know what it is like to care for a child with a devastating disease are best placed to decide whether mitochondrial donation is the right option for them,” added Dr Jeremy Farrar, director of the Wellcome Trust. “We welcome this vote to give them that choice, and we hope that the House of Lords reaches a similar conclusion so that this procedure can be licensed under the UK’s internationally-admired regulatory system.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Blame men for political gridlock

Patrick Miller, a University of Kansas assistant professor of political science, discusses his recent study that found men were more uncomfortable with political ideas associated with the opposing political party, and therefore less willing to engage in bipartisan talks or compromise.
Credit: University of Kansas, KU News Service

600-year drought responsible for abandonment of ancient city

Chuck Bednar for redOrbit.com – Your Universe Online

Once a thriving metropolis, the city of Cantona was abandoned by its 90,000 residents due to a drought that lasted more than six centuries, according to a new study led by researchers from the University of California, Berkeley and published in the journal PNAS.

Cantona was located just east of modern-day Mexico City in the state of Puebla, Mexico. At its peak, it was said to be one of the largest cities in the New World, and was a major source of obsidian. The city was also likely an important trade route and a key military hub.

However, Cantona was deserted by its citizens during a 150-year span between 900 and 1050 AD. Researchers have long debated why this happened—though many suspected climate change as a factor.

In order to investigate the matter, the UC Berkeley-led team collected sediment cores from Aljojuca, a lake located approximately 20 miles south of the former city. They analyzed those samples and discovered evidence of a 650-year period of frequent droughts between 500 AD and 1150 AD.

Oxygen isotopes

According to reports published by Discovery News, the researchers examined the ratio of oxygen isotopes in the lake water, which offers insight into the cycle of precipitation and evaporation. That ratio was high, which, when combined with other sediment analyses, indicates that the area had drier summers.
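
That isotope ratio is conventionally reported in delta notation relative to a reference standard; a minimal sketch of the calculation (the VSMOW reference ratio is our assumption here, as the study’s exact standard isn’t stated in the article):

```python
# Delta notation for oxygen isotopes, expressed in per mil.
# The VSMOW standard ratio below is an assumption for illustration.
R_VSMOW = 0.0020052  # 18O/16O ratio of Vienna Standard Mean Ocean Water

def delta_18O(r_sample: float, r_standard: float = R_VSMOW) -> float:
    """Higher values indicate relative enrichment in 18O, which in
    closed-basin lakes points to more evaporation (drier conditions)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example: a sample slightly enriched in 18O relative to the standard
print(f"{delta_18O(0.0020152):.1f} per mil")  # ~5.0 per mil
```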

“Overall, Cantona still had wet summers and dry winters, but its regular monsoon season was disturbed by frequent long-term droughts, which likely harmed the area’s crops and water supply, the researchers said,” the website said. “Moreover, the droughts lasted hundreds of years.”

“The decline of Cantona occurred during this dry interval, and we conclude that climate change probably played a role, at least towards the end of the city’s existence,” added lead author Tripti Bhattacharya, a graduate student in the UC Berkeley Departments of Geography and Earth and Planetary Science.

“In a sense the area became important because of the increased frequency of drought,” explained Roger Byrne, an associate professor of geography at UC Berkeley. “But when the droughts continued on such a scale, the subsistence base for the whole area changed and people just had to leave. The city was abandoned.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Private firms exploring moon mining

Chuck Bednar for redOrbit.com – Your Universe Online

NASA may have called off plans to make a return trip to the moon, but that doesn’t necessarily mean that Americans are done exploring the lunar surface, according to a new report published in the February edition of the journal Physics World.

In that paper, science writer Richard Corfield explains that private sector companies and space agencies are looking at ways to make it back to the moon so they can tap into the 1.6 billion tons of water ice at its poles and the abundant rare-earth elements hidden beneath its surface.

The lunar landscape is attractive for possible mining activities, and the presence of the polar water ice has “more than anything else… kindled interest in mining the moon,” Corfield explained Monday in a statement, “for where there is ice, there is fuel.”

A gas station in space

Shackleton Energy Company (SEC) is one of the companies said to be developing plans to harness the reserves of water ice and to convert it into rocket propellant in the form of hydrogen and oxygen. The Texas-based firm would then market it to partners in low-Earth orbit.

Essentially, the plan is to build a “gas station in space,” SEC CEO Dale Tietz explained. The company would extract water ice by sending people and robots to mine the lunar poles, convert it to fuel, and use some of it to power its own activities. The rest of the propellant would be sold at orbital facilities for a fraction of the price of shipping fuel from Earth, the company added.

“There are billions of tons of water ice on the poles of the Moon. We are going to extract it, turn it into rocket fuel and create fuel stations in Earth’s orbit,” the company said on its website. “Our fuel stations will change how we do business in space and jump-start a multi-trillion dollar industry… Much like gold opened the West, lunar water will open space like never before.”

HTP

Another privately funded company, Moon Express, is also looking to use water ice as fuel, but its plan is somewhat different. It intends to mine lunar resources to make “high-test peroxide” (HTP), which would be used to fuel its spacecraft and other in-space operations.

“We believe it’s critical for humanity to become a multi-world species and that our sister world, the Moon, is an eighth continent holding vast resources that can help us enrich and secure our future,” the company explained on its website.

“Most of the elements that are rare on Earth are believed to have originated from space, and are largely on the surface of the Moon,” Moon Express added. “Reaching for the Moon in a new paradigm of commercial economic endeavor is key to unlocking knowledge and resources that will help propel us into our future as a space faring species.”

Diaspora to the stars (whatever that means)

With terrestrial rare-earth elements dwindling, and in light of the fact that most of those that remain have already been claimed, Corfield said that it is no surprise that companies are looking towards the moon for new sources of these essential elements, which are used in everything from smartphones and computers to car batteries.

“All interested parties agree that the Moon – one step from Earth – is the essential first toehold for humankind’s diaspora to the stars,” added Corfield, a research fellow at Oxford University and the author of Lives of the Planets: A Natural History of the Solar System and other books.

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Cold plasma can kill norovirus, researchers find

Chuck Bednar for redOrbit.com – Your Universe Online

The pathogen responsible for most cases of gastroenteritis worldwide could be killed off by cold atmospheric pressure plasma (CAPP), researchers from the University of Veterinary Medicine in Hannover report in a recent edition of the online, open-access journal mBio.

Treating surfaces with CAPP, also known as cold plasma, could reduce the risk of transmitting norovirus, a contagious virus leading to stomach pain, nausea and diarrhea and which has gained international notoriety by causing multiple outbreaks on cruise ships in recent years.

According to BBC News, tens of millions of cases of the illness are reported worldwide every year, and complicating matters is the fact that norovirus is highly resistant to several different types of chemical disinfectants. Chlorine-based bleach is currently the most effective treatment, though scientists have long been searching for a better alternative.

Cold plasma, the so-called “fourth state of matter” made up of ionized gas molecules at room temperature, could be the solution, the British media outlet added. These ions are capable of killing several types of microbes, though their effectiveness against viruses has been unknown.

A matter of fecal

However, a series of experiments led by Dr. Birte Ahlfeld and Professor Günter Klein of the University of Veterinary Medicine in Hannover investigated cold plasma’s effect on norovirus by preparing sterile petri dishes containing three dilutions of a stool sample collected from a German soldier who was infected with the disease-causing agent in 2011.

They treated the samples with CAPP for different lengths of time in a plasma chamber, and found that the samples treated for the longest time (15 minutes) had the lowest content of the virus. The cold plasma treatment reduced the amount of potentially infectious virus particles from 22,000 (similar to what would be found on an untreated surface touched by someone who had been infected with norovirus) to 1,400 in 10 minutes and to 500 after 15 minutes.
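
Expressed as log reductions – the metric disinfection studies typically use – the article’s numbers work out as follows (a quick illustrative calculation from the reported counts, not a figure from the paper itself):

```python
import math

# Virus particle counts reported in the article
initial = 22_000
treated = {"10 min": 1_400, "15 min": 500}

for label, remaining in treated.items():
    log_reduction = math.log10(initial / remaining)
    inactivated = 100 * (1 - remaining / initial)
    print(f"{label}: {log_reduction:.2f} log10 reduction "
          f"({inactivated:.1f}% of particles inactivated)")
# 10 min: ~1.20 log10 (~93.6%); 15 min: ~1.64 log10 (~97.7%)
```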

In a statement, Klein called cold plasma “an environmentally friendly, low energy method that decreases the microbial load on surfaces.” He noted that the technology has been demonstrated to be effective “against viruses with a high tenacity, like noroviruses,” and that its “successful application in medical therapy should be transferred to other areas.”

“Cold plasma was able to inactivate the virus on the tested surfaces, suggesting that this method could be used for continuous disinfection of contaminated surfaces,” Klein added. While CAPP treatment could not completely eliminate the virus, he said that “a reduction is still important to lower the infectious dose and exposure for humans.”

In future research, Klein’s team (which includes experts from the Max Planck Institute for Extraterrestrial Physics and the Central Institute of the Bundeswehr Medical Service Kiel) plans to test CAPP’s disinfection properties on other types of surfaces, as well as on other forms of norovirus. The researchers also plan to use electron microscopes to examine the structure of the pathogen both before and after treatment with cold plasma.

“A spread of norovirus can be inhibited at crucial points, which as we know from our previous studies are all surfaces with frequent contact to human skin or hands,” Klein told BBC News. “Handheld devices can be used to disinfect different surfaces or a plasma box for hands or cutlery or plates is possible.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Smoke from wildfires increases tornadic intensity

Chuck Bednar for redOrbit.com – Your Universe Online

Wildfires and forest fires can have an unexpected effect on the weather, as smoke from those events can actually increase the intensity of tornadoes, researchers from the University of Iowa report in the latest edition of the journal Geophysical Research Letters.

In the study, UI chemical and biochemical engineering professor Gregory Carmichael, Center for Global and Regional Environmental Research (CGRER) postdoctoral fellow Pablo Saide, and their colleagues examined the impact of smoke from land-clearing spring agricultural fires in Central America on a severe weather outbreak that occurred in 2011.

That outbreak, which took place on April 27, produced a total of 122 tornadoes and resulted in 313 deaths across the southeastern US. It is believed to be the most severe event of its kind since 1950, and the researchers found that smoke from those Central American fires played a role.

Shear luck

According to the study authors, the outbreak was caused primarily by supercell-producing environmental conditions. But the smoke particles intensified those conditions, Saide and Carmichael noted.

The duo reported the smoke caused the base of the clouds to lower and increased wind shear, which is the variation of wind speed with respect to altitude. Combined, those factors increased the likelihood of more severe tornadoes. It marks the first time that the impact of smoke has been linked to this phenomenon, and the research also identified the reasons for these interactions.
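
Wind shear of the kind described here is often quantified as the magnitude of the vector wind difference between two altitudes; a minimal sketch of that calculation (with invented wind values for illustration – these are not the study’s data):

```python
import math

def bulk_shear(u_low, v_low, u_high, v_high):
    """Magnitude (m/s) of the vector wind difference between two levels,
    e.g. the surface and 6 km -- a standard severe-weather shear metric."""
    return math.hypot(u_high - u_low, v_high - v_low)

# Illustrative values: 10 m/s southerly wind at the surface,
# 30 m/s westerly wind at 6 km altitude.
print(f"{bulk_shear(0, 10, 30, 0):.1f} m/s of bulk shear")  # ~31.6 m/s
```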

“These results are of great importance, as it is the first study to show smoke influence on tornado severity in a real case scenario,” Carmichael explained in a statement. “Also, severe weather prediction centers do not include atmospheric particles and their effects in their models, and we show that they should at least consider it.”

“We show the smoke influence for one tornado outbreak, so in the future we will analyze smoke effects for other outbreaks on the record to see if similar impacts are found and under which conditions they occur,” added Saide. “We also plan to work along with model developers and institutions in charge of forecasting to move forward in the implementation, testing and incorporation of these effects on operational weather prediction models.”

Their findings are based on computer simulations driven by data collected during that 2011 event. One of the models included smoke and its effects on solar radiation and clouds, while the other omitted smoke. The simulation that included the smoke resulted in a lowered cloud base and greater wind shear.

Carmichael, who also serves as director of the Iowa Informatics Initiative and co-director of CGRER, added that future research in the field will focus on improving scientists’ understanding of how smoke can impact near-storm environments and tornado occurrence, intensity, and longevity.

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Researchers find potential hiding place for latent HIV

Chuck Bednar for redOrbit.com – Your Universe Online

While new treatments have proven successful at suppressing HIV infection, they have thus far been unable to eliminate it due to the fact that they cannot attack the virus as it hides dormant in the cells of a person’s immune system.

New research published last Thursday in the journal Cell is aiming to change that, however, by helping to identify which types of cells probably harbor latent HIV and which ones probably do not. The authors of the study believe their work could help find a cure for the virus.

“It has recently been shown that infected white blood cells can proliferate over time, producing many clones, all containing HIV’s genetic code,” explained study author Lillian Cohn, a graduate student at the Rockefeller University Laboratory of Molecular Immunology.

“However, we found that these clones do not appear to harbor the latent reservoir of virus,” she added. Instead, Cohn said that she and her colleagues found that cells that have never divided are likely the primary source of latent HIV.

The pathogen belongs to a family of viruses which insert themselves directly into the genome of the host cells, the researchers explained. Once there, they can hide largely undetected following the initial infection. HIV targets primarily CD4 T lymphocytes, a type of cell that helps initiate an immune response, by integrating itself into their genetic code.

By hijacking these cells, HIV is able to replicate itself in order to infect other cells, killing the host cell in the process. Antiretroviral drugs that suppress the infection disrupt this process. However, the virus may not necessarily produce an active infection, opting instead to stay hidden within the genome. As a result, there is no process for the drug to disrupt, allowing the infection to remain latent.

Typically, however, what really happens is something in the middle: the virus manages to invade some of the T cells’ genomes, but issues with the process prevent it from completely taking over the cell and replicating itself. The few fully-successful integrations that do take place, however, still damage a person’s immune system and leave him or her susceptible to diseases.

“If a patient stops taking antiretrovirals, the infection rebounds. It is truly amazing that the virus can give rise to AIDS 20 years after the initial infection,” Cohn explained. Her team believes that the latent HIV virus may be hiding away in one type of CD4 T cell – a long-lived memory cell which helps a patient’s immune system remember specific types of pathogens.

When these T cells come in contact with a disease-causing agent they have previously encountered, they trigger the immune response designed to recognize it, multiplying in a process known as clonal expansion. Previous studies have indicated that this process is vital to maintaining HIV’s latent reservoir.

Cohn and her co-authors analyzed cloned and unique CD4 T cells in blood samples from 13 HIV-infected individuals, using a special analytical computational technique which allowed them to identify sites where HIV had become integrated into individual cells.

“Given the size of the human genome, it is highly unlikely the virus would insert itself in exactly the same place more than once,” said Cohn. “So, if multiple cells contained virus with identical integration sites, we classified them as clones. Meanwhile if a cell had a unique integration site, one not shared with any other cell, then we assumed that cell was unique.”
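
The clone-versus-unique classification Cohn describes amounts to grouping cells by where in the genome the virus integrated; a minimal sketch of that logic (with hypothetical integration sites – the study’s actual computational pipeline is more involved):

```python
from collections import Counter

# Hypothetical (chromosome, position) integration sites for several cells.
integration_sites = [
    ("chr2", 136_608_646),
    ("chr2", 136_608_646),  # same site seen again -> clonally expanded cells
    ("chr7", 5_530_601),    # seen only once -> presumed unique cell
    ("chr2", 136_608_646),
]

counts = Counter(integration_sites)
clonal = {site for site, n in counts.items() if n > 1}
unique = {site for site, n in counts.items() if n == 1}
print("clonal sites:", clonal)
print("unique sites:", unique)
```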

They tested 75 viral sequences obtained from the expanded clones of cells to see if they had the potential to produce more of the virus, and found that none of them could. While Cohn explained that they “cannot rule out the possibility that a rare clone of cells may contain an active virus, it appears most likely that latent reservoir – and the potential target for therapies meant to cure HIV – resides in the more rare single cells containing unique integrations.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Sony Online Entertainment sold, will begin developing Xbox games

Chuck Bednar for redOrbit.com – Your Universe Online

Sony Online Entertainment (SOE), the studio behind EverQuest, DC Universe Online and several other MMO titles, has been sold off by its parent company and will now expand its operations to include development for the Xbox One, various media outlets reported on Monday.

In a statement posted on the SOE forums, the company confirmed that it had been acquired by investment management firm Columbus Nova, and that it would shift its focus to becoming an independent game studio working as a multi-platform game developer and publisher.

According to Ars Technica, the new company has been renamed Daybreak Game Company, and officials promised that all existing SOE games would continue to function as normal. In fact, they even suggested that they may actually have additional resources to devote to those titles.

Daybreak said that it would “continue to focus on creating exceptional online games for players around the world,” and that in addition to its PlayStation and PC titles, it would be expanding into the field of Xbox and mobile development. It also promised “new exciting developments for our existing IP and games as we can now fully embrace the multi-platform world we are living in.”

Moving forward

The new owners reportedly plan to move forward with the release of the upcoming EverQuest Next in the “near future,” according to TechCrunch. The studio released the original EverQuest in 1999, one year after it was formed, and had over 500,000 subscribers for the title as of 2004. It had most recently released the zombie-themed survival game H1Z1.

While surprising, the sale comes in the midst of financial struggles for Sony. The Japanese tech giant is currently projecting a loss of more than $2 billion for the fiscal year ending next month, according to reports. If that happens, it would mean that the company has posted a total of $10 billion in losses in the past eight years, despite a thriving PlayStation gaming division.

SOE president John Smedley told GamesIndustry.biz that he and his colleagues were “excited to join Columbus Nova’s impressive roster of companies,” adding that the investment firm has “a proven record in similar and related industries” and that they were “eager to move forward to see how we can push the boundaries of online gaming.”

“We see tremendous opportunities for growth with the expansion of the company’s game portfolio through multi-platform offerings as well as an exciting portfolio of new quality games coming up, including the recently launched H1Z1 and the highly anticipated EverQuest Next to be released in the near future,” added Jason Epstein, a senior partner at Columbus Nova.

Epstein went on to tout the ‘early access’ launch success of H1Z1 as one indicator of the “talent and dedication of the studio’s developers to create great online gaming experiences.” New York-based Columbus Nova was founded in 2000 and is run by CEO Andrew Intrater, a former energy, base metals and mining executive.

Terms of the acquisition were not disclosed, but DFC Intelligence analyst David Cole told GamesIndustry.biz that the price paid by Columbus Nova was probably “fairly cheap” and that Sony most likely “had to eat a lot of development costs.” He added that the move was “definitely a positive for SOE/Daybreak because I think they were hampered by being tied to the Sony corporate behemoth and now they can concentrate on their core market.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

DNA ‘clock’ could predict how long you will live

Chuck Bednar for redOrbit.com – Your Universe Online

Chemical changes that take place in a person’s DNA as they grow older could serve as a “biological clock” and provide clues as to how long that person will live, according to new research published in the journal Genome Biology.

In the study, scientists from the University of Edinburgh and colleagues from Australia and the US report that these chemical changes can help predict a person’s age, and that they can compare that data with those individuals’ actual ages to predict how long those people will live.

Those people whose biological age, based on their DNA, was greater than their actual age were more likely to die sooner than those whose biological and actual ages were the same, according to four studies that tracked nearly 5,000 older people for periods of up to 14 years each.

The biological age of each individual was measured using a blood sample provided at the start of the study, and follow-ups were conducted throughout the course of the research. The authors found a correlation between a biological clock that tended to run faster and early death, and that link held true even after accounting for other factors such as smoking and cardiovascular disease.

The chemical modification, which is known as methylation, does not change a person’s DNA sequence. However, it does play a key role in biological processes, and can influence how genes are activated or switched off, the researchers said. Methylation changes can impact many genes and occur throughout a person’s life, and this information was used to measure biological age.
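
Epigenetic-clock models of this kind are typically linear combinations of methylation levels measured at selected DNA sites; a toy sketch of the idea (the site names and coefficients below are invented for illustration – real clocks are fit to data from hundreds of sites):

```python
# Toy epigenetic clock: predicted age = intercept + weighted sum of
# methylation fractions at a few CpG sites. All values are invented.
coefficients = {"cg0001": 42.0, "cg0002": -17.5, "cg0003": 25.3}
intercept = 34.7

def predicted_age(methylation: dict) -> float:
    return intercept + sum(coefficients[site] * methylation[site]
                           for site in coefficients)

sample = {"cg0001": 0.62, "cg0002": 0.31, "cg0003": 0.48}
bio_age = predicted_age(sample)
chrono_age = 52
# A biological age running ahead of chronological age is the pattern
# the study links to earlier death.
print(f"biological age {bio_age:.1f} vs chronological age {chrono_age}")
```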

“The same results in four studies indicated a link between the biological clock and deaths from all causes,” said Dr. Riccardo Marioni from the University of Edinburgh’s Centre for Cognitive Aging and Cognitive Epidemiology (CCACE). “At present, it is not clear what lifestyle or genetic factors influence a person’s biological age. We have several follow-up projects planned to investigate this in detail.”

The use of methylation to determine a person’s biological age dates back to a 2012 study led by scientists at the University of California, San Diego School of Medicine, which described the markers and developed a model to quantify how aging occurs at the genetic and molecular level. The technique was said to provide researchers with a more precise method of determining how old a person is, as well as to predict and treat age-related diseases and other ailments.

“This new research increases our understanding of longevity and healthy ageing,” said lead investigator Professor Ian Deary, also from the CCACE. “It is exciting as it has identified a novel indicator of ageing, which improves the prediction of lifespan over and above the contribution of factors such as smoking, diabetes, and cardiovascular disease.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Why are people so annoying on social media?

Abbey Hull for redOrbit.com – Your Universe Online

Psychological scientist Irene Scopelliti and researchers at City University London wanted to know why we find other people so annoying. The answer? Our self-presentation to the world backfires, and we have a difficult time responding to situations with empathy.

Humans don’t always keep emotional perspective in mind; we assume that other people share our emotions instead of having their own. We mistake self-promotion for harmless sharing, bragging in the belief that the other person is truly interested in our pride and joy. And when they don’t share our emotions, we become the annoying ones.

Scopelliti and her colleagues decided to test this theory in a few experiments, soon to be published in Psychological Science. Researchers first asked a group of subjects to describe a time in their life when they bragged to someone and what they believed the recipient had felt. Another group of subjects did the opposite, telling a story of when they were bragged to, and their reaction towards the experience.

In the results, self-promoters were more likely to report positive emotions than recipients of self-promotion were. This supported the hypothesis that self-promoters often see situations egocentrically and have a hard time stepping into the other person’s shoes.

In a second test, the researchers recruited two more groups of self-promoters and recipients. They asked each group to rate the recipients’ experience as happy, prideful, jealous, angry, annoyed, inferior, or upset. The results were the same, with self-promoters overestimating the recipients’ positive emotions and underestimating their annoyance.

Finally, the scientists studied self-promoters’ belief that excessive bragging makes a good impression. When the researchers asked subjects to appear as successful as possible, the subjects promoted themselves too aggressively, and their recipients found them off-putting.

Bragging can be much more annoying than we are aware, and it may be even worse than we thought: the researchers speculate that through self-promotion, the mystery behind a person is gone. There is nothing positive left to discover.

Hence, science reaffirms the old adage: Modesty is the best policy.

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

The science behind Idina Menzel’s recent cringe-worthy performances

Brianne “Beta” Angarole for redOrbit.com – Your Universe Online

Oh Idina. We want you to do well…but what is going on with these live performances? The pitchiness? The unforgettably good-turned-unforgettably-bad belty notes? Where has the Broadway star gone? When did you stop defying gravity?

I mean, it’s one thing to occasionally slip up on stage when you’re doing musicals. But these are big mistakes on massive stages: the Oscars, New Year’s Eve, and most recently (cue spotlight) the Super Bowl.

I saw all these posts about how “Idina killed the National Anthem” and it made me scratch my head. Was I watching a different National Anthem than everyone else? Surely, I couldn’t be. All those pitchy, unsure notes leading up to that flat note on “[land of the] free”? I’d definitely heard them. No, they weren’t as bad as her New Year’s Eve performance of Frozen‘s “Let It Go”:

(cringe-worthy moment at 3:04)

Or her performance of the same song at the Oscars last year:

(cringe-worthy moment at 2:18)

But it was enough of a sour note to make me wonder: Why does this keep happening? She’s a professional.

As a musician, I’m going to posit it’s a combination of two things: nerves and range.

When you’re nervous, you hold tension, and tension is a vocalist’s worst enemy.

General tension leads to poor breathing, which in turn leads to notes being forced through the vocal cords with inadequate support. This aggravates the mucosal lining and promotes vocal nodules, hoarseness and (yup, you guessed it) flat/ugly notes. Usually professionals know the tricks for avoiding notes that could potentially cause nervousness and tension.

Which brings me to my second point: range.

Every vocalist has something called tessitura. Tessitura is the most comfortable/natural range for a singer. A vocalist may be able to sing higher or lower than their tessitura, but those notes will take more focus and support. In Idina’s performances of “Let It Go” at the Oscars and on New Year’s Eve, she belts an Eb5 (the fifth Eb up from the bottom of the piano), and it’s obvious the Eb5 is out of her tessitura. During her performance on Jimmy Fallon, she used a professional vocalist’s trick of lowering the song a half step to a D5. And guess what? Her confidence combined with singing a note closer to her tessitura resulted in a beautiful, tension-free belt.

Watching this, it’s unfortunate Idina didn’t keep the lowered rendition for the Oscars and New Years Eve.

Despite all of this critiquing, though, Idina Menzel is incredibly talented (and I love her). I leave you with a redeeming trailer of her doing what she does best: singing on Broadway–this time, in her newest musical If/Then.

Sing girl. Sing.

Brianne “Beta” Angarole is a recording artist, vocal coach, and freelance creative out of Nashville, TN. She graduated with a degree in commercial voice from Belmont University, released two albums with placement on NBC and Nickelodeon, and recently appeared on The Sing Off Holiday Special. Brianne owns a private vocal studio in East Nashville called “Betatroupe Lessons” where she emphasizes healthy vocal practices.

—–

Follow redOrbit on TwitterFacebookGoogle+, Instagram and Pinterest.

‘Mini-brain’ in spinal cord aids in balance

Chuck Bednar for redOrbit.com – Your Universe Online

While it takes a tremendous amount of focus to make it across an icy parking lot in the dead of winter, new research from the Salk Institute for Biological Studies reveals that there are hidden, unconscious things going on in our bodies that help keep us from falling.

In a study published Thursday in the journal Cell, Salk Institute professor Martyn Goulding and his colleagues explain that when we need to keep our balance on slick surfaces such as ice, a cluster of neurons in our spinal cords serves as a sort of “mini-brain.”

These neurons integrate sensory information and are responsible for making adjustments to our muscles that prevent us from slipping and falling. In their paper, the study authors map the neural circuitry of the spinal cord that processes the sense of light touch, which allows the body to automatically make slight adjustments to foot position and balance that help keep us upright.

First-ever blueprint of the spinal circuit

The experiments conducted by Goulding and his fellow researchers in mice represent the first-ever detailed blueprint of this spinal circuit, which acts like a control center by integrating motor commands from the brain with sensory information from the limbs.

A better understanding of these mechanisms could lead to better treatments for spinal cord injuries and for diseases known to affect motor skills and balance. It could also help prevent falls for senior citizens, according to the study authors.

“When we stand and walk, touch sensors on the soles of our feet detect subtle changes in pressure and movement. These sensors send signals to our spinal cord and then to the brain,” said Goulding.

“Our study opens what was essentially a black box, as up until now we didn’t know how these signals are encoded or processed in the spinal cord,” he added. “Moreover, it was unclear how this touch information was merged with other sensory information to control movement and posture.”

Maintaining balance while walking across an icy parking lot requires several of our senses. Vision tells us whether we’re on nearly-invisible “black ice” or damp asphalt, while the inner ear’s balance sensors help us keep our heads level with the ground. Also, our muscles and joints monitor our arms and legs as they change position during locomotion.

All of that information, including signals from the light touch transmission pathway detailed in the new study, is transmitted to the brain every millisecond. The brain preprocesses this data in sensory way stations such as the eye or the spinal cord. While scientists have long suspected that the neurological processes required for movement rely on data-processing circuits in the spinal cord, those regions had never been identified or mapped – until now.

Mapping the mini-brain

Using imaging techniques involving a reengineered rabies virus, Goulding and his colleagues traced nerve fibers that carry signals from touch sensors in the feet to their connections in the spinal cord, and found that those fibers connect to a group of neurons known as RORα neurons. RORα neurons, which are named for a specific type of molecular receptor found in those cells, are in turn connected by neurons to the motor region of the brain.

When the RORα neurons in the spinal cord were disabled in genetically modified mice, the researchers found that they were significantly less sensitive to movement across the surface of the skin or to a sticky piece of tape that was placed on their feet. However, the rodents were still able to walk, and could also stand normally on flat ground.

When faced with a more difficult task – walking across a narrow beam elevated off of the ground – the mice struggled and tended to be clumsier than those that had their RORα neurons intact. The study authors attributed this to the animals’ reduced ability to sense skin deformation when a foot was slipping off the edge of the beam, and to respond with the slight adjustments in foot position required to maintain balance.

The Salk Institute team also found that RORα neurons don’t just receive signals from the brain and the light touch sensors, but also connect directly with neurons located in the ventral spinal cord that control movement. As such, they are essentially at the core of a “mini-brain” which integrates signals from the brain and the senses to ensure that limbs function correctly.

“We think these neurons are responsible for combining all of this information to tell the feet how to move,” explained Steeve Bourane, a postdoctoral researcher in Goulding’s lab and first author of the paper. “If you stand on a slippery surface for a long time, you’ll notice your calf muscles get stiff, but you may not have noticed you were using them. Your body is on autopilot, constantly making subtle corrections while freeing you to attend to other higher-level tasks.”

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Binge-watching TV may be sign of depression

Chuck Bednar for redOrbit.com – Your Universe Online

Sitting back and binge watching House of Cards or Game of Thrones might be a good way to spend a lazy winter day, but new research suggests that such activity could actually be a sign of loneliness and depression.

As part of their research, Yoon Hi Sung, Eun Yeon Kang and Wei-Na Lee from the University of Texas at Austin interviewed 316 people between the ages of 18 and 29, asking them how often they watched television, how often they binge-watched, and how often they experienced feelings of loneliness, depression, and self-regulation deficiency.

They discovered that the more lonely and depressed the study participants were, the more likely they were to watch marathon sessions of their favorite TV shows – using the programming as a way to separate themselves from their negative emotions.

They also found that those who lacked the ability to control their behavior were more likely to binge-watch, as they were unable to keep from clicking “next” even when they knew that there were other things they had to do.

According to the researchers, there has been little empirical evidence on the practice of binge-watching television, as it is a relatively new phenomenon. However, psychological factors such as loneliness, depression, and self-regulation deficiency have long been identified as indicators of binge-type or addictive behaviors in general.

For instance, people who are feeling lonely or depressed tend to engage in addictive behavior in order to temporarily get away from reality, and a person’s lack of self-control has been shown to impact the level of his or her addictive behavior. In analyzing binge-watching behavior, the researchers said that they started by using this collection of known psychological factors.

“Even though some people argue that binge-watching is a harmless addiction, findings from our study suggest that binge-watching should no longer be viewed this way,” Sung explained in a statement Thursday. “Physical fatigue and problems such as obesity and other health problems are related to binge-watching and they are a cause for concern.”

“When binge-watching becomes rampant, viewers may start to neglect their work and their relationships with others,” the study author added. “Even though people know they should not, they have difficulty resisting the desire to watch episodes continuously. Our research is a step toward exploring binge-watching as an important media and social phenomenon.”

Binge watching may not be completely terrible, however. A 2014 survey of 1,000 Netflix users found that more than half of them would be more willing to exercise if they could watch marathons of their favorite movies or TV shows while doing so. In addition, 45 percent said they would be more motivated to exercise if they could watch those programs on demand for free while working out.

Sitcoms were the most popular watch-while-you-exercise choice, according to the survey. Thirty-six percent of respondents said they liked to laugh while working out, while 27 percent preferred dramas, 24 percent liked sci-fi/fantasy and 20 percent would watch reality shows.

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

More flu-related deaths in cities of Super Bowl teams

Chuck Bednar for redOrbit.com – Your Universe Online

Football fans in Seattle and New England fresh off rooting for their beloved Seahawks and Patriots in the Super Bowl on Sunday may want to use a little extra hand-sanitizer this morning, according to a new Tulane University study.

According to lead author Charles Stoecker, assistant professor of global health systems and development at Tulane University School of Public Health and Tropical Medicine, cities and regions that send teams to the Super Bowl typically see a big spike in flu-related deaths.

In fact, statistics from 1974-2009 revealed an 18 percent increase in flu deaths among those over the age of 65 in those areas. The reason for the phenomenon, Stoecker explained, is that playoff games tend to bring people closer together in the heart of the annual flu season. While rooting for their teams, those fans are also likely spreading germs in close quarters.

Step away from the dip

“You’re going to the bar or to peoples’ homes for watch parties and you’re double dipping the chip – or somebody else is – and you’re spreading the flu,” he said in a statement. “Football fans might contract a mild case of influenza, but then pass it on to other, potentially more susceptible people.”

The effect tends to be worse in years when the dominant strain is more virulent, as is the case with this year’s influenza A (H3N2), as well as when the Super Bowl occurs closer to the peak of flu season.

According to the US Centers for Disease Control and Prevention (CDC), flu activity can begin as early as October and continue as late as May, making it difficult to pinpoint exactly when it will peak. However, it commonly peaks in the US between December and February, the agency said.

The Tulane University professor went on to explain that postseason play tends to cause people to change their travel patterns. As a result, fans and tourists tend to mix together while travelling to or from the game, presenting more opportunities for exposure to the pathogen.

Ironically, Stoecker and his colleagues did not find an increase in flu-related deaths in the cities that host the big game. Those places are usually warmer-weather areas such as Miami, Phoenix or New Orleans that are “less amenable to flu transmission,” he explained.

The study, which was co-authored by economists Alan Barreca of Tulane and Nicholas Sanders from the College of William & Mary, suggests that public health officials should redouble their efforts to make sure that fans in these cities take extra precautions.

“Strategies exist to help offset such effects,” the authors wrote. “If a major contributor to increased influenza spread is local gatherings for watching games, a simple policy solution is to increase awareness of influenza transmission vectors during times of sports-related gatherings.”

“Reminding people to wash their hands and avoid sharing drinks or food at parties during the height of influenza season, especially if they have high amounts of contact with vulnerable populations, could have large social returns,” they continued, adding that the same phenomenon could also hold true for other sporting events, including the Olympics or the World Cup.

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Solar-powered sensor reminds you to shut the windows

Chuck Bednar for redOrbit.com – Your Universe Online

A new solar-powered radio sensor chip could help you keep your heating bills down by letting you know if you left your window open, and keep you safe by alerting you if someone attempts to break into your house.

The sensor was developed by researchers from the Fraunhofer Institute for Microelectronic Circuits and Systems IMS in Duisburg, Germany. It is roughly the same size as a fingernail and is mounted on the aluminum profile separating a window’s two panes of glass. It also features a solar coating that allows it to generate its own power.

Currently available systems used to monitor window status require sensors to be attached via cable to a central alarm located within the home or building, the developers claim. The new chip has no such limitations, however, and is pretty much maintenance free, they added.

Once installed, the solar cell receives enough light (even in winter) to function, and a magnet and acceleration sensors embedded in the unit register if the window is opened a crack or all the way. It can transmit a radio signal to the base station if the window remains open for too long.

One chip to rule them all

The researchers, who demonstrated their radio sensor chip at the BAU trade fair in Munich in late January, believe that it has several potential applications, including reminding homeowners to ventilate their houses or warning if they leave a window open when leaving the house.

Furthermore, it can offer protection from intruders by distinguishing between different types of vibrations. For instance, the researchers explain that it can tell the difference between a ball hitting the windowpane and a crowbar being used by a burglar to force open the frame. The developers claim that it can make this distinction within one-tenth of a second.

Its power consumption is so low that it can store enough power to operate for up to 30 hours without light, and it could be the first step towards products that could function through up to 14 days of total darkness.

Since both the processor and the chip are extremely tiny, they are relatively inexpensive, and the switches the chip uses consume little energy, according to the developers. The chip automatically enters sleep mode, and can be set by users to wake up and take a measurement at regular intervals. Production costs stay low because the solar coating is applied as part of the chips’ normal production process.
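
In rough pseudocode terms, the behavior described above boils down to a duty-cycled wake-measure-sleep loop with a transmit-on-timeout rule; here is a hedged sketch of that logic (every name, threshold, and interval below is our assumption – the actual Fraunhofer IMS firmware is not described in the article):

```python
import random
import time

MEASURE_INTERVAL_S = 60        # assumed: wake once per minute
OPEN_TOO_LONG_S = 15 * 60      # assumed: alert after 15 minutes open
BREAK_IN_THRESHOLD = 5.0       # assumed: g-force spike treated as forced entry

def read_window_open() -> bool:          # stub for the magnet sensor
    return random.random() < 0.1

def read_acceleration_spike() -> float:  # stub for the acceleration sensors
    return random.random() * 6.0

def transmit_alarm(message: str) -> None:  # stub for the radio link
    print("ALARM:", message)

open_since = None
for _ in range(10):  # a real device would loop forever in low-power mode
    if read_acceleration_spike() > BREAK_IN_THRESHOLD:
        transmit_alarm("possible break-in")
    if read_window_open():
        open_since = open_since or time.monotonic()
        if time.monotonic() - open_since > OPEN_TOO_LONG_S:
            transmit_alarm("window left open")
    else:
        open_since = None
    time.sleep(0.01)  # stands in for the chip's low-power sleep interval
```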

“As you can see, there are a lot of application areas,” explained engineer Dr. Gerd vom Bögel, who co-led the team along with physicist Dr Andreas Goehlich. “Only a handful of additional production steps are needed so that manufacturing can also be accomplished in high quantities.”

—–

Follow redOrbit on Twitter, Facebook, Instagram and Pinterest.

Black Beauty meteorite came from Mars

Chuck Bednar for redOrbit.com – Your Universe Online

New spectroscopic analysis has revealed that Black Beauty, a meteorite found several years ago in the Moroccan desert, is actually a 4.4 billion year old piece of the Martian crust, and may have the same composition as the rocks that currently cover the surface of the Red Planet.

In the study, the researchers explain that measurements of the meteorite also known as NWA 7034 are an exact match with orbital measurements of the dark plains on Mars, a region of the planet where the coating of red dust is thin enough to expose the rocks beneath the surface.

Writing in the journal Icarus, Brown University graduate student Kevin Cannon and colleagues from the University of New Mexico report that their findings may demonstrate that Black Beauty is representative of the “bulk background” of the rocks found on the surface of Mars.

When analysis of NWA 7034 began back in 2011, scientists knew from its chemical makeup that it was of Martian origin. However, it was unlike any other Martian meteorite ever studied, as all previous Martian rocks found here on Earth were igneous rocks formed from cooled volcanic material and were classified as SNC meteorites (shergottites, nakhlites, or chassignites).

Conversely, Black Beauty is classified as a breccia, a mixture of different rock types welded together in a basaltic matrix, the researchers noted. It contains sedimentary components that match the chemical makeup of rocks analyzed by the Mars rovers, leading scientists to conclude that it is a piece of Martian crust – the first sample of its kind to find its way to Earth.

Mystery solved?

Cannon and his co-authors believed that Black Beauty could help solve a longstanding mystery surrounding the fact that spectral signals from SNC meteorites never matched up entirely with the spacecraft measurements from the Martian surface. They acquired a sample of the meteorite and used several different types of spectroscopic techniques to analyze it.

Among the methods used was a hyperspectral imaging system developed by a Massachusetts-based firm known as Headwall Photonics, which allowed them to obtain detailed images of the entire sample instead of just a small portion, Cannon said. They obtained an average composition that matched the data collected by orbiting spacecraft.
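
Averaging a hyperspectral image down to a single bulk spectrum, as described above, is conceptually straightforward; a minimal sketch with synthetic data (the real instrument’s data format and calibration steps are more involved):

```python
import numpy as np

# Synthetic hyperspectral cube: 64 x 64 spatial pixels, 100 spectral bands.
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 100))

# Bulk spectrum of the whole sample: average over both spatial axes.
bulk_spectrum = cube.mean(axis=(0, 1))   # shape: (100,)
print(bulk_spectrum.shape)
```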

The authors explained that the spectral match will help them better understand the dark plains, and they suggest that the region most likely contains primarily breccia rocks similar in nature to Black Beauty. Since the dark plains are dust-poor regions, it is believed that they could represent the type of rocks that are hidden beneath the red dust throughout much of Mars.

“This is showing that if you went to Mars and picked up a chunk of crust, you’d expect it to be heavily beat up, battered, broken apart and put back together,” Cannon said. Given what experts know about Mars, he and his colleagues explain that this would make a lot of sense.

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

Baby chicks can count, study says

Chuck Bednar for redOrbit.com – Your Universe Online

Usually calling somebody a bird-brain is meant as an insult, but it might be time to rethink that as new research reveals baby chicks can not only count, but also have the same basic concept of a left-to-right number line as humans.

In research published Friday in the journal Science, researchers from the University of Padova and the University of Trento in Italy report that experiments involving 60 three-day-old chicks revealed that the creatures can become familiar with the concept of placing smaller values to the left and larger ones to the right on a mental number line.

As part of those experiments, the baby birds were familiarized with a target number (5) and were able to spontaneously associate a smaller number (2) with the left space and a larger one (8) with the right space. Furthermore, the larger number (8) was itself associated with the left space when the chicks were presented with a second, higher target number (20).

While the existence of this mental number line is common in humans, this marks the first time that the left-right association has ever been observed in animals, according to BBC News. Lead investigator Dr. Rosa Rugani told the British news agency that it was unclear what caused the chicks to make the choices that they did, but the results were apparent.

“All we can judge is behavioral responses. Therefore, we don’t actually know if it is a real ‘number line’ but it strongly resembles what is observed in the human number line,” she said.

Left-to-right association

While humans are capable of making consistent associations between numbers and spatial locations, the researchers note that it is not certain how much of this is learned and how much of it develops automatically. Even in Arabic countries, where adults read from right to left, it is not clear if education produces the relationship, or just fine-tunes it in terms of direction.

Dr. Rugani explained that baby chicks were a good model to test this behavior, since they can be monitored shortly after hatching. She and her colleagues designed experiments to determine if the chicks also linked smaller and larger numbers with different spatial locations, training the birds to get food from behind a sign that displayed a number of shapes.

The chicks were then presented with two copies of a different number, and the researchers recorded how often each one selected the left-hand or right-hand one. If the new number was smaller than the training number, the chicks went left about 70 percent of the time, and if it was larger, they went to the right about 70 percent of the time.

Furthermore, the effect was found to be relative. Chicks went to the right if shown 8 when the original number was 5, but to the left if the target number was 20. Additional experiments saw the authors slightly alter the presentation of the numbers, but the results were unchanged, even when the smaller number was presented using different colors or larger shapes.

“Our results suggest a rethinking of the relationship between numerical abilities and verbal language, providing further evidence that language and culture are not necessary for the development of a mathematical cognition,” Dr Rugani, a psychologist at the University of Padova, told Discovery News.

“A number is not either small or large in an absolute sense, but rather it is smaller or larger with respect to another number,” she added. “[The chicks] associate small numbers with the left space and larger number with the right space, and this resembles the humans’ behavior in responding to numbers… I would not at all be surprised that the number spatial mapping is also found in other animals, and in newborn infants.”

—–

Follow redOrbit on Twitter, Facebook, Google+, Instagram and Pinterest.

ESA calls off search for Philae

Chuck Bednar for redOrbit.com – Your Universe Online

The search for the Philae lander on Comet 67P/Churyumov-Gerasimenko (67P/C-G) has been unsuccessful, and ESA officials will now wait to hear from the probe to confirm its location.

Philae, which landed on the comet last November, is believed to have come to rest on a 350m by 30m “landing strip” on the smaller lobe of 67P/C-G, the ESA said on Friday. A dedicated search using images from the orbiter’s OSIRIS camera system has not found its exact location.

Upon first contact with the comet, Philae bounced twice and crossed a large depression known as “Hatmehit” before settling down in a site that has been dubbed “Abydos.” That area is believed to be located just off the top of the duck-shaped comet’s “head,” according to BBC News.

The general location was imaged by the Rosetta orbiter on December 12, 13 and 14, and each picture was then scanned by eye for telltale signs of Philae – “a set of three spots that correspond to the lander,” according to OSIRIS principal investigator Holger Sierks from the Max Planck Institute for Solar System Research in Germany.

So far, the search has been unsuccessful, Sierks explained: “The problem is that sets of three spots are very common all over the comet nucleus. Hatmehit and the area around its rim where we’re looking is full of boulders and we have identified several sets of three spots.”

While Rosetta will be flying to within six kilometers of the comet’s surface on February 14, its current trajectory will bring it closer to the lower part of the larger comet lobe than to Philae’s potential landing site. The ESA has said that it does not plan to conduct any additional, dedicated searches for the lander, according to the BBC, and will simply wait for it to wake up.

“Rosetta’s busy science schedule is planned several months in advance, so a dedicated Philae search campaign was not built into the plan for the close flyby,” said Rosetta project scientist Matt Taylor. “We’ll be focusing on ‘co-riding’ observations from now on, that is, we won’t be changing the trajectory of Rosetta to specifically fly over the predicted landing zone in a dedicated search, but we can modify the spacecraft pointing and/or command images to be taken of the region if we’re flying close to the region and the science operations timeline allows.”

“After the flyby we’ll be much further away from the comet again, so are unlikely to have the opportunity for another dedicated lander search until later in the mission, maybe even next year,” added ESA’s Rosetta mission manager Fred Jansen. “But the location of Philae is not required to be able to operate it, and neither does it need to be awake for us to find it.”

Philae is expected to reactivate shortly after the comet’s southern hemisphere becomes more illuminated in the next few weeks, and once it boots up, it should be able to communicate with Rosetta. The orbiter will call out to the probe and wait for a reply, and while the energy required for this exchange will send Philae immediately back to sleep, the lander will continue to gather light using its solar panels.

Eventually, it will recharge enough to maintain a stable telecommunications link and begin recharging the battery as well, according to the BBC. It could begin communications as soon as May or June, and resume its science operations sometime around August, when 67P/C-G will be closest to the Sun (perihelion) and in its most active phase.

—–


Scientists, general public disagree on many key issues

Chuck Bednar for redOrbit.com – Your Universe Online

When it comes to some of the biggest hot-button issues facing the world today, there is little common ground to be found between the scientific community and members of the general public, according to a new report from the Pew Research Center.

In fact, there is a 51-percentage point gap between scientists and the population at large over the safety of consuming genetically modified foods, the American Association for the Advancement of Science (AAAS), which collaborated with Pew on the study, said in a statement.

Only 37 percent of the public believes that it is safe to eat GMO products, while 88 percent of the AAAS scientists polled during the study believe that consuming such products is not harmful to a person’s health. This was the largest opinion difference between the two groups.

When it comes to the idea of using animals for research purposes, 89 percent of scientists are in favor of it, while just 47 percent of the general public supports the idea. Eighty-seven percent of scientists said that they believed that climate change was caused primarily by human activity, but only half of the non-scientific community felt likewise, the study revealed.

Similar disagreements were present on several other issues, including whether or not it is safe to eat foods grown with pesticides (68 percent of scientists say yes, compared to just 28 percent of citizens, a 40-percentage point gap) and whether or not humans have evolved over time (all but two percent of scientists say yes, while less than two-thirds of the public agrees).
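
The percentage-point gaps Pew highlights are simple differences between the two groups’ agreement rates. The sketch below recomputes them from the figures quoted in this article; the 65 percent public figure for evolution is an assumption standing in for “less than two-thirds.”

```python
# Recomputing the opinion gaps quoted above. All numbers are the
# article's own, except the public evolution figure (65), which is an
# assumed stand-in for "less than two-thirds."
survey = {
    # issue: (scientists % agreeing, public % agreeing)
    "GMO foods safe to eat":       (88, 37),
    "Favor animal research":       (89, 47),
    "Climate change mostly human": (87, 50),
    "Pesticide-grown foods safe":  (68, 28),
    "Humans have evolved":         (98, 65),
}

for issue, (sci, pub) in sorted(survey.items(),
                                key=lambda kv: kv[1][0] - kv[1][1],
                                reverse=True):
    print(f"{issue:28s} gap = {sci - pub:2d} points")  # GMO tops the list at 51
```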

“Science is a huge, sprawling cluster of subjects,” explained lead author Cary Funk, associate director of science research at Pew. “We knew from the 2009 Pew Research Center study that there could be differences between the public and scientists on at least some issues. But we were surprised by the size of those differences and how often they occur.”

According to PBS Newshour, there is less disagreement between the populace and the scientific community when it comes to whether or not children should be vaccinated (86 percent of experts say yes, as do 68 percent of the general public); whether or not offshore drilling should increase (32 percent of scientists and 52 percent of the public agree); and if there should be more fracking (only 31 percent of scientists and 39 percent of the general population say yes).

Likewise, the public and scientists agree on one other core issue: that US elementary and secondary schools are not doing enough to advance science, technology, engineering and math (STEM) education. Just 16 percent of scientists and 29 percent of the general public said they would rank American K-12 STEM programs as above average or best in the world.

“Whatever their disagreements, most in the public and science community see STEM education as a concern,” explained Lee Rainie, co-author and Pew Research Center director of internet, science and technology research. “When both groups basically speak in the same voice about an issue, it is worth paying attention.”

“While the public is still broadly positive about the contributions of science to society, there has been a slight rise in negative views across a number of measures, suggesting some softening in the perceived value of science to society,” Funk added. “These patterns will be important to watch over time.”

—–


Prostate study results

Lead author M. Minhaj Siddiqui, M.D., assistant professor of surgery at the University of Maryland School of Medicine and director of urologic robotic surgery at the University of Maryland Marlene and Stewart Greenebaum Cancer Center, discusses results of a new prostate cancer study, published in JAMA. Researchers found that targeted biopsy using new fusion technology, which combines magnetic resonance imaging (MRI) with ultrasound, is more effective than standard biopsy in detecting high-risk prostate cancer.

Credit: University of Maryland School of Medicine

Female libido busters and what to do about them

Duana C. Welch, Ph.D., for redOrbit.com – Your Universe Online

Has your sexual get-up-and-go gotten up and gone? Low desire is the top self-reported sexual problem for women—and the toughest to treat. Here are three top libido busters, and what to do if this is you.

Shameful thinking

Shame, the deep sense that something is wrong with us, is not our friend. It undermines our efforts in many areas of our lives, especially sexuality. Are you embarrassed by your body, and worried about a partner’s acceptance of it?  You can’t focus on your bra size and your pleasure!

And spectatoring, where you behave as your own worst critic during sex, is shame re-named. It’s as if you’re the armchair quarterback at the Sexual Olympics, and you’ve just voted yourself the Least Valuable Player.

Try this scientifically proven strategy to feel more self-acceptance, more in the moment, and more sexual mojo. Whenever you’re feeling un-sexy, notice shameful feelings as they arise, and redirect your thoughts to something more reality-based: “I’m feeling ashamed of my body, but my partner wants me—I am desired and desirable.”

Another aspect of shame is your sexual beliefs. Low desire is often related to beliefs taught to us by parents and society. Unfortunately, in our zeal to keep girls innocent, we often convey ideas that won’t serve them well as adults. Do you think sex is something “nice” women don’t (or shouldn’t) enjoy? Do you think your genitals are disgusting, something nobody should want to touch?

The fix for this is the same as the one for body shame: notice and redirect. “I’m feeling shame about wanting sex, but it’s normal, natural, and healthy for a grown woman to want and enjoy sexual connection. I deserve sexual pleasure.”

Ghosts from the past

I’m sure it won’t surprise you that rape and sexual abuse can dampen a woman’s desire for years. Yet most women apparently move through these issues without therapy. That said, if you need help, make sure you get it. Cognitive behavioral therapy is proven to help women overcome past abuses and get their groove back. If you have insurance, choose a provider who offers it.

Partner problems

It’s been said that for women, everything our partner says and does is foreplay—and it’s true. Many women with low desire are having partner issues that include feeling low trust in, or low love and respect from, our mate. If you and your partner need to get back on track with some great relationship skills, the top science-backed therapy is Gottman Method Couples Counseling.  Whether you both attend, or you have to go it alone, you can find a therapist using this link: http://www.gottman.com/private-therapy/

In conclusion

There are many causes of low libido, and unfortunately, there’s no magic cure that deals with them all.  Ultimately, though, a little mental floss, and perhaps some therapy, can help deal with these top causes so you can get your groove back.

Duana C. Welch, Ph.D., is the author of Love Factually: 10 Proven Steps from I Wish to I Do (2015); you can read more and get a free chapter at http://lovefactually.co.

—–


Hurricane Sandy & Climate Change: redOrbit interviews author Adam Sobel

Justin Stokes for redOrbit.com – Your Universe Online

The October 2012 mega-storm known as Hurricane Sandy dealt a massive blow to the East Coast. According to FEMA’s Sandy Recovery Office, the hurricane “forced tens of thousands of survivors into shelters and caused billions of dollars in damage” and took the lives of 73 people in the United States alone.

Though storms of Sandy’s magnitude carry their own deadly qualities, it was the retreat from disaster preparedness and forward thinking that gave the hurricane its teeth. Columbia University professor Adam Sobel witnessed the storm firsthand, and has since made it his personal mission to educate the public about future storms. His book Storm Surge: Hurricane Sandy, Our Changing Climate, and Extreme Weather of the Past and Future bypasses the compression of typical media coverage and shares his findings directly with the public.

Chatting with redOrbit, Sobel shared his observations about Hurricane Sandy and the importance of acting quickly against dangers in the distance.

First things first: your book, Storm Surge. Let’s talk about the inception of the book, how you put it together, and what spawned your interest in Hurricane Sandy and its aftermath?

Right, well for about fifteen years now – let’s say about twelve years before Sandy – one of my research interests has been hurricanes, and the climate in which they occur. But I’m also somewhat of a weather observer. I watch weather coming and going in New York often, and so when Sandy was coming – about a week before landfall – some of the models started to predict that something would happen like what eventually did, and I started paying attention to it. As the week went on, colleagues and I watched it get closer and closer, and I started getting calls for media interviews. I think the reporters wanted to talk to an academic who studies hurricanes but also lives in New York, because of the chance that it was going to hit here. The media interest kind of “snowballed,” and I got a lot of calls.

I hadn’t talked to the media before, and that’s one of the things that led to me writing the book. As scientists, when we do our research, we study relatively narrow questions. We try to advance the “state of knowledge,” and you do that one narrow question at a time. You study every aspect of something – hurricanes, hurricane climates, etc.

The notion of being a specialist, so that you may further a particular discipline by studying every detail.

Right. By necessity we ask fairly specialized questions. So you know, if you write a research paper that gets a press release, the media calls and then you’re talking about exactly the specialized topic of your research. But when they call to ask about a storm that just happened, all they know is that I’m a meteorologist, and they don’t exactly care about what my research is. They’re just calling to talk about the storm. They also ask broad questions like “What was that storm?”, “Why was it so bad for New York?”, “Was it caused by climate change?”, “What should we have done to be better prepared?” So they’re asking questions not limited to my own research expertise. It got me thinking about those broader questions, and I would ask myself, “What should I have said?”

And the thing is, there’s no one who is qualified to answer all the broad questions that are asked. It’s not like the question, “What should we do next to be prepared for hurricanes?” has an answer you can find in a textbook. There’s no expert who has an absolute claim to all the knowledge needed to answer that with authority. So I realized I should have something to say about these things. I’m a scientist who is financed by government money, and the public has an interest in what we do. People care about what happened, and so I should have something to say about these things.

Many of my colleagues from related disciplines in the New York area were also involved in a huge amount of media coverage about all kinds of different issues related to Sandy. So what I did in the following semester was organize a graduate course. Not just on [Hurricane] Sandy, but on all its different aspects. We met twice a week. I taught a few of the classes, but most of the time I’d have a guest speaker come in from a discipline other than mine. I had people talk about the subway systems, insurance, human psychology and the decisions made in impending disasters. At that time, I was thinking about writing a book.

I should say that I had wanted to write a popular science book for some time, but I hadn’t had an idea that was good enough. Then Sandy happened, and I felt that this was what I should write about. There was a huge amount of press, and I was just consuming everything. After a couple of months of this, I felt that there was a story about the storm and the aftermath. I also felt that I could use it as an excuse to write about many other scientific issues that were related. I was then able to get a publisher and a contract to write it. It wasn’t just the media; it was also the experience of studying one event as a meteorologist while living through it as a citizen.

So with this book, you’re combining the human experience with your interest in your discipline? Going back, let’s discuss your assessment of “basic question asking” through the media. Was there a lot of frustration with the limited amount of curiosity in media outlets?

No, actually, I would put it differently. I think the fact that I was asked the same questions over and over again made me think that if I didn’t have answers to those questions, maybe I should. It made me feel a responsibility that I didn’t before. Maybe this question is not about exactly what I do, but it’s close enough that if this is what the public wants to know – and the questions seem reasonable – then maybe it is my job to answer those questions.

But yeah… what’s frustrating for scientists dealing with the media is that we’re asked questions whose answers really are complex. We have to answer them in a short time, in a very small number of words. Especially for major outlets, like TV network news. We understand the reason for that; everyone has limited time in those shows, and you have to be concise. But often, you can’t get across all the nuances. I felt that if I wrote a book, I could put all that in there, so that people get the whole story. The format of most media stories is very limiting, and with an event the magnitude of Sandy, I feel it deserved a longer treatment.

You mentioned the human experiences in the book. What can you share about observing Hurricane Sandy?

I should say that the book is not mostly “human interest” – it doesn’t have a lot of stories about victims of the storm, for example – but it does have my perspective. I didn’t experience the worst of it by any means. I didn’t live in a neighborhood that was badly hit. I live up a hill, so we didn’t lose power. I didn’t even see much of the damage until quite a bit later.

In the week leading up to the hurricane… weather forecasts are only good to a point. Meteorology can’t tell you what the weather’s going to be three weeks from now. But at some point, we start to have forecasts that have some skill in them. In Sandy, that was about eight days before landfall; at that point we started getting those forecasts showing what might happen. It started out as “It might be a big deal, or it could blow out to sea and be nothing.” By about five days before, it was pretty certain that the storm would make landfall. By the weekend, it was clear that it was going to be a disaster for New York City.

That experience for me was fascinating, but also scary. Sandy, as a hurricane causing a disaster in NYC, was without precedent, at least in modern times where we have good data. There’s nothing quite like it in the historical record. So it was a really interesting event to watch. But I knew New York City was in trouble once it got within a certain range. That in itself was fascinating, because I was aware of the risks of a storm hitting here, and understanding those risks is part of my interest as a scientist. With a bad enough hurricane strike, I knew that flooding was a big risk. It’s documented; we have a lot of low lying areas and vulnerable infrastructure.

What made it a complicated experience, however, was that as it got closer and closer, it became more and more of a visible danger. At some point, it clicked in that “Wow! I actually live here.” (Laughs) And then it’s not just of academic interest. This is actually going to be real disaster where I live. The appropriate reaction was fear, but I couldn’t turn off the fascination. I think about the epidemiologists who study Ebola, and when there’s an epidemic, it’s gotta be fascinating to them. To be there when it’s happening. But it’s very scary and dangerous.

Now, I knew I wasn’t gonna die or anything. I knew that there wasn’t any personal harm. But it seemed likely that it was gonna be a big catastrophe for the city. I felt we might lose power, that the transit system might shut down, that there might be a disaster. And that people might be killed. There was an increasing level of actual personal concern. That, combined with still being fascinated as a scientist, was a complicated psychological experience that I still haven’t fully understood.

During the night of the storm, I was home talking to media and watching all of the data. I looked at the radar, the models, and all of the weather maps on the computer. I paid attention to the tide gauge, and watched the water rising. At some point on Monday night, as the landfall occurred and the storm surge was peaking, I realized that I had been inside the whole time. I just felt like “This is wrong, I’ve got to go out and see what the weather looks like,” even though the mayor was saying “Don’t go outside.” So I told my wife I would just go stand in the doorway of our apartment building to see how it is. And then I told myself “Maybe if it’s not too bad, I’ll go out a little bit further.”

It wasn’t that bad, so I walked half a mile to where I could see the Hudson River. I saw the water up over the street, near 133rd St. and 12th Avenue. The water was very still, and though I knew this was a huge catastrophe, it was actually kind of a peaceful scene. But I knew what it meant.

Regarding the danger, what red flags should people be looking for regarding trends in weather patterns, and how much of that is associated with climate change? And what can be said for people who have referred to Hurricane Sandy as a “freak weather event?”

Well, there are climate change deniers… but to argue about whether the hurricane was caused by climate change doesn’t make you a “denier” by itself. The science is unequivocal now that human emissions of greenhouse gases are warming the climate significantly. But the science is by no means clear that Hurricane Sandy was directly related to climate change, at least if we talk about the storm itself.

We do think that a warming climate is going to make hurricanes more powerful. It may not make more or fewer of them, but the risk of a powerful hurricane may increase or decrease in some places. The relationship is a complicated story. Beyond that, Sandy was an unusual storm in a lot of ways. It was a very large, hybrid storm that merged with a Nor’easter, and it had an unusual path. All those things, and how they relate to climate change, haven’t been studied in great detail. I don’t think the science allows us to point to a direct causality.

However, the connection between Sandy and climate change that is quite strong is sea level rise. It is happening, and there’s been an increase of about a foot in New York in the last hundred years. Maybe eight inches or so is due to the warming climate. That’s a small, but significant, part of the story. The storm surge was nine feet, and a portion of that was attributed to climate change. As we go into the future, the sea level rise is going to be faster. If we start from a higher baseline, then it takes less of a storm surge to cause the same flood. So a weaker storm will cause the same flood as Sandy did. Weaker storms occur more frequently than stronger storms. So if we don’t do anything to change the infrastructure, then coastal flood disasters will become more common.

I’m not happy about climate change denial, but the people who say that Hurricane Sandy was caused by changes in climate are overstepping the bounds of the existing science. Climate change did contribute to the flood for sure, by causing sea level rise, and it will increase the risk of similar floods more in the future. Sea level rise is definitely happening. We also are pretty confident that the rainfall in hurricanes is increasing due to climate change.
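
Sobel’s baseline argument reduces to simple arithmetic: the flood threshold stays fixed while the starting water level rises. The sketch below illustrates it; apart from the roughly nine-foot surge he cites, the rise scenarios are made-up numbers, not projections.

```python
# A minimal illustration of the baseline argument: with higher seas,
# progressively weaker surges top the same flood level. The ~9 ft
# figure is the surge cited above; the rise values are hypothetical.
sandy_surge_ft = 9.0
flood_level_ft = sandy_surge_ft   # water height that flooded the city in 2012

for rise_ft in (0.0, 1.0, 2.0, 3.0):   # hypothetical future sea level rise
    surge_needed = flood_level_ft - rise_ft
    print(f"after {rise_ft:.0f} ft of rise, a {surge_needed:.0f} ft "
          f"surge matches Sandy's flood")
```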

What problems lurk within the vulnerabilities of New York City’s infrastructure?

In New York, it’s important to distinguish between before and after Sandy. Before, it was understood that there were issues. There were reports written about the risk of flooding due to a hurricane in New York City, and it was known what the damage would be from a big coastal flood. There were a number of studies. For example, the fact that the subways would flood, the transit tunnels, was known. It was known that there was a risk of power outages. These things were expected, and even though they were known, almost no investment was made in preventing them. Nobody knew when it would happen, and it seemed far away in the future. But now that Hurricane Sandy has happened, New York is seeing a much greater effort in addressing some of these vulnerabilities. Things still aren’t “storm proof,” but there’s a consciousness after the event happens.

There are many other areas on the US coast that are low-lying and at risk. Some of the obvious ones are Florida, which hasn’t had a big hurricane in a while. It’s just a matter of time, and they’re also very vulnerable to sea level rise because the state is quite flat. Also, a lot of their bedrock is limestone, which means that it’s very porous. As the sea level rises, the water will come underneath. We’ve already seen some of this in Miami, where neighborhoods have street flooding even without a storm sometimes.

Another area at risk is the Barrier Islands of North Carolina. I don’t know if this is still true, but a while ago the state made sea level rise illegal. They declared that you can’t use projections of sea level rise in capital planning, which to me is crazy.

A lot of the Eastern seaboard, as well as the Gulf Coast, has cities that could be at considerable risk from  the combination of sea level rise and storms. The West Coast, in some ways, is in better shape because they’re mountainous and the land rises up a little bit quicker. They also don’t get hurricanes, though they do see other types of storms.

Could other areas outside of the coast see problems associated with climate change?

It’s certainly possible. The Midwest, for example, doesn’t get hurricanes, but the Midwest is at risk of other kinds of things. Heatwaves in particular are the most certain problems associated with climate change. In a warmer climate, they’re going to be more frequent.

Studies have now been done about an Australian heatwave that occurred over a year ago. It conclusively showed that climate change played a role in that event. I think that as time goes on, the Midwest will be vulnerable to that. Not just the urban populations, but eventually agriculture will have problems adapting to the heat stress. Floods and droughts, ironically, may both become more common, and I think the Midwest will see more of those too.

In both the prediction and prevention of hurricane damage, how far back do those initial studies go?

It depends. There was an event that caused a flood about as bad as Sandy in 1821. The city only had 150,000 people instead of eight million, so there was a lot less to destroy. That event was known, but the data wasn’t as good. And there have been storms nearby that came close to causing disaster for the city. The big one that everyone remembers was the 1938 hurricane, which the book has a chapter about. That was really disastrous for Long Island and New England, though it mostly missed the city. If it comes close, obviously everyone understands that it could have taken a different turn. So it was clear that there was some risk of a bad hurricane in New York City.

There were also a series of reports written about how New York City should prepare for a possible hurricane. One that stands out was written in 1995, and it was new in that it contained simulations; previous reports had been based only on historical data.

But for this particular study, the government planners in charge asked the Hurricane Center to estimate the worst that could happen, even if it hadn’t happened in the past. So the Hurricane Center ran simulations of the worst hurricane that could hit New York – what the storm surge would be, how bad the flooding would be. They came up with data that was much worse than anything the local authorities had considered before. In particular, they took photos of some transit facilities and drew lines showing how high the water could get. There’s one great photo of the old South Ferry subway station (at the south end of Manhattan, where you get on the ferry to Staten Island) that shows the water getting well above the doors where you enter from the street. This meant for sure that the subway tunnels could fill up with water. And that was less than twenty years before Sandy. Then the Metropolitan Transportation Authority built a new subway station at South Ferry to replace the old one, right in the same place, with no flood protection. They opened it in 2012, before Sandy. That station cost us $550 million, and it flooded and was totaled. Fixing it cost another $600 million.

So, the documentation was there; this danger was known for at least twenty years, and it was known that you should not build a subway station in that place without some sort of flood protection. But that’s what they did. And I don’t think that New York is worse than anywhere else. I think this is typical of human nature. The report didn’t say when there would be a storm like this, only that someday there could be. It’s really hard for governments to make investments against something they’ve never seen happen. Now that we’ve seen it, a lot of investments like this are starting to be made.

What can people learn from the Hurricane Sandy event, and how can we prepare against that level of damage?

I think the most important lesson – and this is how Sandy is most importantly related to climate change – is not whether climate change played a role in Sandy per se, but what Sandy teaches us about how humans respond to risk. What I just told you was the story of how scientists said “there’s a risk of something happening,” and it was clear what kinds of investments could be made to be better prepared for that risk, but nobody made those investments because the risk was far off in the future and outside people’s experience. I think climate change is the same way. Scientists are telling the human race that something is happening; we can’t tell all the details with precision, but we know the way things are going and that it carries a bunch of risks with it. And we know what we should do, which is to find ways to reduce greenhouse gases and be better prepared for the things that could happen. By and large, it’s difficult for people to respond to this – not just because there are “deniers”; even for people who aren’t deniers, it’s not at the top of the agenda. There are things happening today, and this seems so far off into the future. But I think it’s important precisely because it is outside of people’s experience. The climate of the future is going to be quite different from the climate of the past.

The coolest summer in New York, by the end of the 21st century, is going to be hotter than the hottest summer we’ve experienced up to now in modern human history. It’s going to bring a whole lot of changes and risks. It seems far away and vague. But what Hurricane Sandy teaches us is that we need to find a way to act on scientific predictions before the worst is upon us, because we are best at being “reactive.” We tend to get a lot better at prevention once we have experienced the bad event. Sandy is one example where we’re now making a lot of investments. Another example is the Dutch. The Dutch have built all these impressive modern flood-control measures, but only after they experienced a particular disaster, in 1953.

So, we’re good at being reactive. But it would be great if we could do something before we’ve experienced it. Climate change is acting very slowly, but once it’s upon us it’s going to be very hard to reverse – you can’t really reverse it, you can only slow it down. And we’re really not good at that (being proactive about long-term risk) as a species. If you’ve read the book by Daniel Kahneman called Thinking, Fast and Slow, he writes about this. He’s a Nobel Prize-winning psychologist whom I cite in the book. He’s brilliant, and he writes about “the availability bias.” People understand the risk of something once it has happened, and then we tend to over-estimate the chance that it will happen again soon. But if it has never happened, people tend to under-estimate it. So in the assessment of risk, even if people know what the odds are, our level of preparation is not based on rationality. It’s based on what we’ve experienced recently. Since climate change is in the future, it’s hard for us to react to it now. It’s a human cognition problem that hinders our ability to manage risk. Sandy is an example of this, even if it has nothing to do with climate change. We didn’t take it seriously enough.
Even people who believe the risk is there have a hard time seeing it as a high-priority issue. We have to get better at using rationality as a guide. The future is going to be different than the past.

—–


Big Bang’s gravitational waves were a bust, report finds

Chuck Bednar for redOrbit.com – Your Universe Online

Reports published last year claiming to have detected evidence of gravitational waves emitted immediately following the Big Bang were inaccurate, new analysis of the data has revealed.

Last March, researchers at the Harvard-Smithsonian Center for Astrophysics announced in a news conference that an experiment called BICEP2 had revealed that a telescope at the South Pole had detected the waves while scanning the background radiation of the universe.

They said that the light had apparently become polarized by gravitational waves emitted in the initial moment after the explosion that caused the universe to expand. Their findings appeared to be evidence of the theory of cosmic inflation, which asserts that the waves made ripples in the cosmic microwave background radiation and helps explain the size and structure of the universe.

Shortly after their announcement, however, other researchers cast doubts on their findings, noting that the signal they detected could have been generated by dust within our galaxy. So the Harvard team joined forces with the European Space Agency to further investigate the matter, and sure enough, that’s exactly what happened, according to BBC News reports.

“Despite earlier reports of a possible detection, a joint analysis of data from ESA’s Planck satellite and the ground-based BICEP2 and Keck Array experiments has found no conclusive evidence of primordial gravitational waves,” the ESA confirmed in a statement.

A paper detailing the results of their analysis has been submitted to the peer-reviewed journal Physical Review Letters, but the BBC said that the conclusion was “not a major surprise,” since the team itself had already publicly stated that their confidence in their discovery had waned.

As the National Science Foundation (NSF) explained in a press release, the initial findings were based on observations of the polarized Cosmic Microwave Background (CMB) in a patch of sky between 2010 and 2012. The data collected revealed a signal that was previously undetected: “curly B-modes” observed in stretches of the sky several times larger than a full moon.

While the BICEP2 team released evidence suggesting that the signal originated in primordial gravitational waves, the NSF noted that interstellar dust found in the Milky Way can produce a similar effect. The new paper concluded that the interpretation of the earlier data as evidence of gravitational waves was “no longer secure” once possible dust contamination is accounted for.

“Searching for this unique record of the very early universe is as difficult as it is exciting, since this subtle signal is hidden in the polarization of the CMB, which itself represents only a feeble few percent of the total light,” said Jan Tauber, ESA’s project scientist for Planck.

“When we first detected this signal in our data, we relied on models for Galactic dust emission that were available at the time,” added John Kovac, a BICEP2 principal investigator at Harvard. “These seemed to indicate that the region of the sky chosen for our observations had dust polarization much lower than the detected signal.”

Planck observed the sky in nine microwave and sub-millimeter frequency channels, seven of which were also equipped with polarization-sensitive detectors. While the BICEP2 team selected a region of the sky where they believed foreground dust emissions would be low, the new analysis reveals that it could have been far higher than previously believed.

“This joint work has shown that the detection of primordial B-modes is no longer robust once the emission from Galactic dust is removed,” said Jean-Loup Puget, principal investigator of the HFI instrument on Planck at the Institut d’Astrophysique Spatiale in France. “So, unfortunately, we have not been able to confirm that the signal is an imprint of cosmic inflation.”

—–


New telescope could be 1,000 times more powerful than Hubble

Chuck Bednar for redOrbit.com – Your Universe Online

A new instrument being developed at the University of Colorado Boulder could be capable of capturing images up to 1,000 times sharper than those provided by the Hubble Space Telescope, officials at the university have revealed.

According to CBS News, the device is known as the Aragoscope in honor of French scientist Francois Arago, who was the first man to ever detect diffracted light waves around a disk. The Aragoscope would consist of an orbiting space telescope and an opaque disk in front of it that could be up to one-half mile across.

The researchers behind the instrument said that diffracted light waves from a target star or other type of object would bend around the edge of the disk, converging in a central location. The light would then be sent to the telescope in order to provide extremely high-resolution photographs.

The Aragoscope could allow scientists to image space objects like black hole “event horizons” and plasma swaps between stars, the CU-Boulder team claims. It could also be pointed at Earth, where its ability to image rabbit-sized objects could help find campers lost in the mountains.

Professor Webster Cash and his colleagues updated NASA on the progress of the novel telescope system last week. The Aragoscope was one of 12 proposed projects granted Phase One funding by the US space agency’s Innovative Advanced Concepts (NIAC) program last June. In April, six of those 12 projects will be awarded a two-year, $500,000 award as part of Phase Two funding.

“Traditionally, space telescopes have essentially been monolithic pieces of glass like the Hubble Space Telescope,” said Anthony Harness, a doctoral student in the university’s Department of Astrophysical and Planetary Sciences. “But the heavier the space telescope, the more expensive the cost of the launch. We have found a way to solve that problem by putting large, lightweight optics into space that offer a much higher resolution and lower cost.”

Part of the Aragoscope’s architecture comes from a previous project that earned Cash Phase One and Phase Two NIAC funding – a daisy-shaped “starshade” that would block light from a star while allowing light from its planets to leak around the shade so that they could be imaged. For that reason, he believes his team is “in pretty good shape” entering Phase Two.

The Aragoscope would be placed in a geostationary orbit 25,000 miles above the Earth’s surface, where it would follow the planet’s rotation and appear motionless. The opaque disk would be made from a strong, dark plastic-like material that could be folded up like a parachute for launch and then unfurled once it reaches orbit. The shield would be positioned in front of the telescope at distances of up to hundreds of miles, depending on the size of the disk.

“The opaque disk of the Aragoscope works in a similar way to a basic lens. The light diffracted around the edge of the circular disk travels the same path length to the center and comes into focus as an image,” said Harness. Since the resolution of the image would increase along with telescope diameter, being able to launch a large but lightweight disk would allow for higher resolution images than smaller, traditional space telescopes, he added.
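
Harness’s point about resolution scaling can be illustrated with the textbook diffraction limit, θ ≈ 1.22λ/D: the smallest resolvable angle shrinks in proportion to 1/D. The numbers below are assumptions for illustration (visible light at 550 nm, Hubble’s 2.4 m mirror, a half-mile disk); the real system’s sharpness would depend on many other factors, so treat this as a sketch, not a performance claim.

```python
# Back-of-envelope diffraction limits: the resolvable angle is roughly
# theta ~ 1.22 * lambda / D, so sharpness improves linearly with D.
import math

def diffraction_limit_arcsec(diameter_m, wavelength_m=550e-9):
    """Approximate diffraction-limited resolution in arcseconds."""
    return math.degrees(1.22 * wavelength_m / diameter_m) * 3600

hubble = diffraction_limit_arcsec(2.4)       # Hubble's 2.4 m mirror
aragoscope = diffraction_limit_arcsec(805)   # half-mile disk ~ 805 m
print(f"Hubble:     {hubble:.3f} arcsec")
print(f"Aragoscope: {aragoscope:.6f} arcsec "
      f"(~{hubble / aragoscope:.0f}x sharper at this disk size)")
```

At these assumed numbers, a half-mile disk works out to a few hundred times Hubble’s sharpness; the headline figure of up to 1,000 times presumably reflects other configurations or wavelengths.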

The CU-Boulder team plans to conduct a lab-based demonstration of the Aragoscope concept using a one-meter disk placed several meters from a telescope, and a light source fixed between five and 10 meters behind the disk. In addition, they hope to test the starshade concept by fixing a disk on a mountaintop, attaching a telescope to a hovering aircraft, and using it to capture an image of the binary star system Alpha Centauri.

—–


Gender identity is innate in transgender children, study finds

Chuck Bednar for redOrbit.com – Your Universe Online

In contrast to the commonly-held belief that transgender children are confused or subject to peer pressure, a new University of Washington study indicates that those youngsters have deeply-held gender identities that remain consistent over a variety of different measures.

The new research, which was led by UW psychological scientist Kristina Olson and appears in the Association for Psychological Science (APS) journal Psychological Science, is believed to be one of the first to explore gender identity in transgender children using implicit measures that are less susceptible to modification than self-report measures.

In a statement, Olson explained that she decided to start the project partially because she wanted to see what children felt about social groups, but mainly because she had seen a close friend with a transgender child go through a series of challenges related to that youngster’s identity.

“Seeing how little scientific information there was… was hard to watch,” said Olson. “Doctors were saying, ‘We just don’t know,’ so the parents have to make these really big decisions: Should I let my kid go to school as a girl, or should I make my kid go to school as a boy? Should my child be in therapy to try to change what she says she is, or should she be supported?”

The notion that prepubescent children can truly be transgender has met with some skepticism, and many experts believe that the best approach to dealing with “gender-variant” children is to encourage them to become comfortable with their biological genders.

Recently, though, doctors, parents and mental health professionals have increasingly started encouraging these youngsters to live as the gender they most identify with. To get a better understanding of gender identity in transgender children, Olson set out to use scientific methods to find out if gender identity is deeply held, or if it is the result of confusion or pretense.

She and co-authors Nicholas Eaton from Stony Brook University and Aidan Key from Seattle-based transgender support group Gender Diversity analyzed transgender children who had been living as their identified gender in all aspects of their lives, who had not yet reached puberty, and who lived in home environments that were supportive of their decisions.

Those youngsters, as well as their cisgender (non-transgender) siblings, were recruited for the study through support groups, conferences, and word of mouth. In addition, cisgender children who were age-matched to the other participants were recruited for the study from a database of families interested in participating in developmental psychology research studies.

The study authors first used self-reporting measures to ask children to reflect on aspects of their genders, as well as implicit measures designed to measure the strength of their automatic gender associations. Overall, data from the measurements indicated that transgender children responded in a manner indistinguishable from the responses of two groups of cisgender children.

“While future studies are always needed, our results support the notion that transgender children are not confused, delayed, showing gender-atypical responding, pretending, or oppositional,” the authors wrote. Instead, their responses were “entirely typical and expected for children with their gender identity,” and the researchers believe that their findings “should serve as further evidence that transgender children do indeed exist and that this identity is a deeply held one.”

Olson said that she hopes to recruit up to 100 additional transgender children and follow them into adulthood in order to gauge how the degree of support they receive influences their overall development. In addition, she hopes to figure out if that support translates into more positive outcomes than those found in current transgender adults living in the US.

“We have absolutely no idea what their lives will look like, because there are very few transgender adults today who lived as young kids expressing their gender identity,” she added. “That’s all the more reason why this particular generation is important to study. They’re the pioneers.”

—–


NASA successfully launches SMAP spacecraft

Chuck Bednar for redOrbit.com – Your Universe Online

A NASA spacecraft designed to provide high-resolution soil moisture measurements and help scientists better predict extreme weather events successfully launched from the Vandenberg Air Force Base in California this morning.

The Soil Moisture Active Passive (SMAP) satellite will also report on the state of that moisture (i.e., whether it is frozen or thawed) once it reaches Earth orbit, according to the US space agency. SMAP will map the entire globe every two to three days for a period of at least three years, they added.

The vehicle will provide what NASA is calling the most accurate and highest-resolution maps of soil moisture ever obtained, and that information is also expected to help monitor climate change and improve our understanding of our planet’s water, energy and carbon cycles.

SMAP will enter a final circular polar orbit of 426 miles (685 kilometers) at an inclination of 98.1 degrees. The probe will orbit the Earth once every 98.5 minutes, repeating the same ground track every eight days, and the information it provides will help scientists monitor droughts, keep tabs on crop productivity and even improve the quality of weather forecasts.
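
Those orbital figures are internally consistent: Kepler’s third law recovers the quoted period from the quoted altitude. The sketch below is a back-of-envelope check, not mission data.

```python
# Sanity-checking SMAP's quoted orbit: period of a circular orbit at
# 685 km altitude via Kepler's third law, T = 2*pi*sqrt(a^3 / mu).
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m

altitude_m = 685_000.0      # 426 miles, as quoted above
a = R_EARTH + altitude_m    # semi-major axis of a circular orbit
period_min = 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60
print(f"orbital period ~ {period_min:.1f} minutes")   # ~98, matching the article
```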

The mission blasted off from Vandenberg’s Space Launch Complex 2 at 9:22am EST, and was carried into space by a United Launch Alliance Delta II rocket. The successful launch comes on the heels of delays that forced Thursday’s originally-scheduled launch to be scrubbed due to high winds and Friday’s rescheduled event to be pushed back for repairs to the launch vehicle.

SMAP will combine measurements from both a radar and a radiometer in order to compile data that has high resolution and high accuracy. Those two instruments will work together to look into the top two inches (five centimeters) of the soil through clouds and moderate vegetation, and will be able to operate at any time during the day or night, according to NASA.

The term “active passive,” which is part of the mission name, is a reference to the two types of instruments that will be used by SMAP. One of those instruments is a synthetic aperture radar that will actively emit a signal and measure the backscatter that returns from Earth. The other is a radiometer instrument that passively records Earth’s naturally emitted microwave signal.

Information about changes in soil moisture will be communicated through variations in the signals of the two instruments. SMAP’s radar observations will have higher resolution but lower accuracy than the radiometer’s; conversely, the radiometer will provide observations that are more accurate but at lower resolution than is possible using the radar equipment.

“Soil moisture is an important part of the Earth’s climate,” Chuong Nguyen, SMAP mission manager at the Kennedy Space Center Launch Services Program (LSP), said on Wednesday. “As it evaporates, it condenses into the clouds and atmosphere, and that in turn becomes rain later in the weather cycle. SMAP will help with climate forecasting and help predict a good growing season. That’s an important part of agriculture, in the U.S. and around the world.”

Saturday’s payload also includes the ELaNa X CubeSats, a total of four probes representing three separate missions: ExoCube, which will measure the densities of ions and neutrals in the upper ionosphere and lower exosphere; GRIFEX, which will help the proposed GEO-CAPE mission monitor atmospheric chemistry and pollution levels; and FIREBIRD-II (A and B), space weather satellites that will study electron microbursts in the Van Allen radiation belts.

—–


Low-mass particle could lead to dark matter detection

Chuck Bednar for redOrbit.com – Your Universe Online

Even though dark matter is believed to make up 85 percent of the universe’s mass, no one has managed to detect the elusive material, but a new fundamental particle proposed by scientists at the University of Southampton could finally change that.

Dark matter is believed to exist because of the gravitational effect it has on stars and galaxies, the gravitational lensing (or bending of light rays) that occurs around these objects, and its imprint on the afterglow of the Big Bang (also known as the Cosmic Microwave Background).

Despite what the researchers call “compelling” indirect evidence to support its existence, and an immense amount of effort from astronomers, no one has been able to directly detect dark matter yet. Clues as to what it could be can be found through particle physics, however.

The standard view, the study authors explain, is that dark matter particles have a very large mass for fundamental particles, similar to those of heavy atoms. Lighter dark matter is considered unlikely for several astrophysical reasons, though some exceptions have been identified, they added.

The new study, published earlier this week in the journal Scientific Reports, presents the possibility that low-mass dark matter particles exist and could be directly detected. These lighter particles would have been missed by all experiments conducted to date, the researchers claim, and neither constraints from particle physics nor cosmological observations can rule out their existence.

“Our candidate particle sounds crazy, but currently there seem to be no experiments or observations which could rule it out,” said Dr. James Bateman of the University of Southampton. “Dark Matter is one of the most important unsolved problems in modern physics, and we hope that our suggestion will inspire others to develop detailed particle theory and even experimental tests.”

The proposed lighter dark matter particle has a mass of just 100 eV/c², or approximately 0.02 percent that of an electron, according to their research. Unlike heavier forms of dark matter, it would not interact with light, though it would surprisingly interact with normal matter.
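
That 0.02 percent figure checks out against the electron’s well-established rest mass of roughly 511 keV/c²:

```python
# One-line check of the mass comparison above: 100 eV/c^2 versus the
# electron's ~511 keV/c^2 (0.511 MeV/c^2) rest mass.
electron_mass_eV = 511_000.0   # electron rest mass in eV/c^2
candidate_mass_eV = 100.0      # proposed low-mass dark matter particle
print(f"{candidate_mass_eV / electron_mass_eV:.2%} of the electron mass")  # ~0.02%
```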

Also, unlike other candidates, this low-mass dark matter may not even be able to penetrate the Earth’s atmosphere, rendering detection from the ground unlikely. As a result, Dr. Bateman and his colleagues plan to incorporate the search for these particles into a space experiment planned by the Macroscopic quantum resonators (MAQRO) consortium.

By suspending a nanoparticle in space and exposing it directly to the flow of dark matter, the study authors believe that they may be able to observe it being pushed downstream. Monitoring the nanoparticle’s position could then shed new light on the existence of low-mass dark matter.

“At the moment, experiments on Dark Matter do not point into a clear direction and, given that also the Large Hadron Collider at CERN has not found any signs of new physics yet, it may be time that we shift our paradigm towards alternative candidates for Dark Matter,” said co-author Dr. Alexander Merle from the Max Planck Institute in Munich, Germany.

—–


NASA engineer developing daytime star tracker

Chuck Bednar for redOrbit.com – Your Universe Online

Typically, high-altitude scientific balloons only collect data when starlight can be detected at night, but one NASA engineer is working on a new device that may help instruments overcome the sunlight that limits their field of vision and operate before the sun goes down.

Scott Heatwole, a member of the agency’s Wallops Flight Facility (WFF) Balloon Program, is in the process of developing a low-cost precision attitude sensor or “star tracker” that would be able to locate stars and use them as points of reference in the sky during daylight hours.

Doing so would help orient instruments so that they can find their intended research targets more easily. His proposed off-the-shelf solution would also help advance the use of high-altitude research balloons, which are currently capable of carrying scientific instruments into the stratosphere and remaining aloft for several days at a time, according to NASA.

“A precision attitude sensor capable of working in the daylight would extend science operations through the day which would significantly increase the amount of science collected,” Heatwole explained. “Currently, the only precision attitude sensor available in daytime is a sun sensor, and this isn’t ideal because it provides only two axes of attitude and is not precise over a range of targets across the sky.”

He is developing his daytime star tracker to be used specifically with the Wallops Arc Second Pointer (WASP), which the agency claims would be able to use data obtained from the system to direct a balloon-borne scientific payload with incredible accuracy and stability.

WASP to change the future of star tracking

Currently, WASP is outfitted with the commonly used ST5000 star tracker. However, NASA explains that the device is unable to image during the day, even when operating at altitudes of 120,000 feet. While it is relatively dark that high off the ground, the scattering of sunlight off the atmosphere tends to overwhelm the starlight in most star cameras, the agency explained.

While other researchers have developed similar custom star trackers capable of operating during the day, Heatwole is the first to assemble a package that also includes cameras, computers, and the algorithms required to process data and eliminate excess visible light in real time.

His tracker consists of a commercially available FireWire camera attached to a lens and a baffle that help filter out visible light, allowing it to sense points of reference in near-infrared wavelength bands. A prototype of the device has flown on two WASP missions.

In the first mission, the HyperSpectral Imager for Climate Science (HySICS) collected radiance data as WASP pointed the instrument toward the Earth, the sun, and the Moon. The goal of that mission was to see what the star tracker could detect at an altitude of 120,000 feet.

The second mission carried the Observatory for Planetary Investigations from the Stratosphere (OPIS). The goal of this mission, which took place in October, was to gather measurements over time of the gas giant Jupiter’s atmospheric structure. During this mission, Heatwole said, the algorithm did not work as expected and was unable to filter out the excess light.

However, he has no plans to abandon the project, and intends to fine-tune the algorithms over the next few months to eliminate the extra light encountered during the OPIS mission. After that, he plans to conduct additional tests of the star tracker during a sounding rocket flight this summer, as well as on future WASP missions scheduled for 2016 and 2017.

“We’re trying to increase the capabilities of WASP,” said Heatwole. “No company is going to go out and build this. No one is going to develop an off-the-shelf, low-cost daytime star tracker and put all the components in one package. WASP requires an attitude sensor that is capable in the daytime. That’s what we hope to create.”

—–


Grannies gone wild? New study examines sex lives of senior citizens

Chuck Bednar for redOrbit.com – Your Universe Online
Over half of all men and nearly one-third of women over the age of 70 are still enjoying active sex lives, with many of them frequently engaging in intercourse, according to new research from the University of Manchester and NatCen Social Research.
While the findings are likely to make those of us who would rather not envision our grandparents “getting it on” a little uncomfortable, it’s a landmark study, being the first to examine the sexual health of individuals over the age of 80.
The research also found that overall health and conflicting partnership factors were more closely linked to a decline in sexual activity and functioning, not just due to increasing age, they added.
Lead author Dr. David Lee, a research fellow at the University of Manchester School of Social Sciences, and his colleagues used data from the English Longitudinal Study of Ageing (ELSA) in their research. Their study has been published in the journal Archives of Sexual Behavior.
“We hope our findings improve public health by countering stereotypes and misconceptions about late-life sexuality, and offer older people a reference against which they may relate their own experiences and expectations,” explained Dr. Lee.
“Our ongoing research is also highlighting the diversity of late-life sexualities, and trying to impose youthful norms of sexual health on older people would be over-simplistic and even unhelpful,” he continued, adding that it was “important” that healthcare professionals become “more open about discussing sexual health with older people.”
A total of 7,000 people responded to the ELSA questionnaire, and all but three percent of them declined to answer questions about their sexual activities and issues. The study discovered that chronic health conditions and poor self-rated health appeared to have the most obvious negative impacts on the sexual health of men compared to women.
Issues most frequently reported by sexually active females related to becoming sexually aroused (32 percent) and achieving orgasm (27 percent). Males, on the other hand, reported that most of their problems were related to erectile difficulties (39 percent).
Older men were more concerned about their sexual activities and function than women, the study authors found, and those worries tended to become more common with increasing age. Sexually active women reported less dissatisfaction with their overall sex lives than men, and their dissatisfaction tended to decrease with increasing age.
The study wasn’t just about sex, however. It also revealed that many people in their seventies and eighties showed affection to their partners in other ways, with 31 percent of men and 20 percent of women reporting frequent kissing or heavy petting. Among those who reported having any type of sexual activity over the past three months, just one percent of men and 10 percent of women said that they felt obligated to do so.
“The fact this is the first time that people over 80 years old have been included in this kind of research highlights how often the public health needs of older people, including sexual health, are ignored or overlooked,” said Caroline Abrahams, Charity Director at Age UK.
“With an ageing population it is important that providers of sexual health services understand the needs of older people in both clinical settings and when developing information and advice,” she added. “These recent findings now need to be used to improve sexual health advice and information for older people.”
—–
Unusual nebula discovered: “Mouth of the Beast”

Astronomers using the ESO’s Very Large Telescope (VLT) in northern Chile have discovered a faint cometary globule nebula located 1,300 light-years from Earth in the constellation Puppis.

The telescope also captured a new image of the cometary globule, which is known as CG4 or God’s Hand. That image depicts the head of CG4, which has a diameter of 1.5 light-years and is similar in appearance to the head of a massive, menacing beast, according to CNET.

While it appears very bright in the image, CG4 is actually a very faint nebula that is difficult for amateur astronomers to detect, the website added. The tail of the globule, which cannot be seen in the photo, is roughly eight light-years long – comparatively small, by astronomical standards.

Its exact nature “remains a mystery,” CNET noted. Despite its name and its resemblance to a comet, CG4 is in no way related to comets. Rather, cometary globules are a type of cold, dense, and compact nebula with an internal mass between 2 and 100 times that of our sun.

The globules often appear as dark patches in the sky and emanate no light, which is one of the reasons they are so difficult to detect. And even though they are among the coldest objects known to exist in the universe, stars forming inside them warm their cores.

If it’s not a comet, what is it?

In a statement, the ESO explains that the head of CG4 is composed of a thick cloud of gas and dust, visible only because it is illuminated by the light of nearby stars. However, that same radiation is slowly destroying the head, eroding away the tiny particles that scatter the starlight.

In spite of this, the dusty cloud of CG4 still contains enough gas to produce multiple sun-sized stars – and, in fact, the globule is said to be currently in the process of producing new stars, possibly as a result of radiation from the nearby Gum Nebula reaching CG4.

“Why CG4 and other cometary globules have their distinct form is still a matter of debate among astronomers and two theories have developed,” the ESO said. “Cometary globules, and therefore also CG4, could originally have been spherical nebulae, which were disrupted and acquired their new, unusual form because of the effects of a nearby supernova explosion.”

According to other scientists, cometary globules may instead have been shaped by stellar winds and ionizing radiation from hot, massive stars known as OB stars. These effects may be responsible for the globules, as well as for the unusual formations known as elephant trunks.

“To find out more, astronomers need to find out the mass, density, temperature, and velocities of the material in the globules,” the ESO explained. “These can be determined by the measurements of molecular spectral lines which are most easily accessible at millimeter wavelengths.”

The image is the result of the ESO’s Cosmic Gems project, which is a program designed to capture images of unusual, interesting, or aesthetically pleasing astronomical objects using the observatory’s telescopes for educational and public outreach purposes.

—–

Particle entanglement could revolutionize supercomputers and communication

Eric Hopton for redOrbit.com – Your Universe Online

Even Sauron would be impressed.

Engineers have created a new micro-ring that “entangles” individual particles of light. This could be an important first step in a whole host of new technologies.

Entanglement, the instantaneous connection between two particles no matter how far apart they are, is one of those weird concepts thrown up by quantum physics that leaves most of us wondering, “Could that really be true?” Einstein referred to it as “spooky action at a distance.” It’s one of the most intriguing phenomena in all of physics, but now it seems to be producing actual, practical applications to benefit us all.

Loops in silicon wafers

If properly harnessed, entangled photons could revolutionize computing, communications, and cyber security. Though they can be created in the lab using comparatively large-scale optoelectronic components, finding a practical source of entangled photons that can fit onto an ordinary computer chip has been problematic.

New research, published in The Optical Society’s (OSA) journal Optica, describes how a team of scientists has developed, for the first time, a microscopic component that can generate a continuous supply of entangled photons and is small enough to fit onto a standard silicon chip.

The new design is based on an established silicon technology known as a micro-ring resonator. These resonators are loops, etched onto silicon wafers, that can corral and then re-emit particles of light. By tailoring the design of this resonator, the researchers created a novel source of entangled photons that is incredibly small and highly efficient, making it an ideal on-chip component.
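For the technically inclined, the behavior of a micro-ring can be captured by one standard relation (a textbook formula, not one quoted in the paper): the ring resonates only at wavelengths that fit a whole number of times around the loop,

\[ m\,\lambda_m = 2\pi R\, n_{\mathrm{eff}}, \qquad m = 1, 2, 3, \ldots \]

where R is the ring’s radius and n_eff is the effective refractive index of the silicon waveguide. Light at these resonant wavelengths circulates many times before escaping, building up the intensity that lets so small a device generate photon pairs efficiently.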

“The main advantage of our new source is that it is at the same time small, bright, and silicon based,” said Daniele Bajoni, a researcher at the Università degli Studi di Pavia in Italy and co-author on the paper. “The diameter of the ring resonator is a mere 20 microns, which is about one-tenth of the width of a human hair. Previous sources were hundreds of times larger than the one we developed.”

Two important implications

For many years, scientists and engineers have seen an enormous practical potential for entangled photons. This curious manifestation of quantum physics has two important implications in real-world technology.

First, if something acts on one of the entangled photons, the other will respond to that action instantly, even if it is on the opposite side of a computer chip, or even the opposite side of the galaxy. This behavior could be harnessed to increase the power and speed of computations.

The second implication is that the two photons can be considered, in some sense, a single entity, which would allow for new communication protocols that are immune to spying.
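What “a single entity” means can be made concrete with the simplest textbook example of an entangled pair (shown purely for illustration; the notation is not drawn from the Optica paper):

\[ |\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|0\rangle_A |0\rangle_B + |1\rangle_A |1\rangle_B\right) \]

Here photons A and B each carry a bit-like property that can read 0 or 1. Neither photon has a definite value on its own, but the instant A is measured and found to be 0, B is guaranteed to read 0 as well, however far apart the two have traveled.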

This seemingly impossible behavior is essential, therefore, for the development of certain next-generation technologies, such as computers that are vastly more powerful than even today’s most advanced supercomputers, and secure telecommunications.

Not so fast, though

Making these new technologies work, however, requires a new class of entangled photon emitters that can be readily incorporated into existing silicon chip technologies.

Until now, entangled photon emitters could be scaled down to only a few millimeters in size, which is still far too large for on-chip applications. These emitters also need a lot of power. To overcome these problems, the researchers looked at the potential of ring resonators as a new source for entangled photons. Ring resonators can be easily etched onto a silicon wafer in the same way that other components on semiconductor chips are formed. To power the resonator, a laser beam is directed along an optical fiber to the input side of the sample, and then coupled to the resonator where the photons race around the ring. This creates an ideal environment for the photons to mingle and become entangled.
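The article does not name the pair-generation process, but in silicon ring sources of this kind it is typically spontaneous four-wave mixing, in which two pump photons are converted into a correlated “signal” and “idler” pair, subject to energy conservation:

\[ 2\omega_{\mathrm{pump}} = \omega_{\mathrm{signal}} + \omega_{\mathrm{idler}} \]

Because the ring can resonate at all three frequencies at once, the conversion is strongly enhanced, which helps explain how such a small structure can act as a bright source.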

The researchers observed that, when photons exited the resonator, a remarkably high percentage of them exhibited the telltale characteristics of entanglement.

“Our device is capable of emitting light with striking quantum mechanical properties never observed in an integrated source,” said Bajoni. “The rate at which the entangled photons are generated is unprecedented for a silicon integrated source, and comparable with that available from bulk crystals that must be pumped by very strong lasers.”

“In the last few years, silicon integrated devices have been developed to filter and route light, mainly for telecommunication applications,” said Bajoni. “Our micro-ring resonators can be readily used alongside these devices, moving us toward the ability to fully harness entanglement on a chip.” This research could pave the way for practical use of quantum information technologies, particularly quantum cryptography protocols, which would ensure secure communications in ways that classical cryptography protocols cannot.

According to Bajoni and his colleagues, these protocols have already been demonstrated and tested. Their work has shown how a cheap, small, and reliable source of entangled photons capable of propagating through fiber networks can be produced. As a result, practical applications for entanglement – that “spooky action at a distance” – may become a reality.

—–

Gas planets could transform into habitable worlds

Chuck Bednar for redOrbit.com – Your Universe Online

Planets capable of supporting life could start out as gaseous worlds similar to Neptune, only to be transformed by the combination of two phenomena that individually can inhibit their potential habitability, University of Washington astronomers have discovered in a new study.

Those phenomena, tidal forces and vigorous stellar activity, could combine to transform so-called “mini-Neptunes” (large planets in outer orbits with thick hydrogen atmospheres and solid cores) into gas-free and potentially habitable worlds, UW doctoral student Rodrigo Luger and research assistant professor Rory Barnes explain in this month’s issue of the journal Astrobiology.

The majority of the stars in our galaxy are low-mass stars, or M dwarfs, which are smaller and dimmer than our sun and have close-in habitable zones, they explained. These stars are excellent candidates for study in the search for potentially habitable planets, and astronomers expect that they will be able to find several Earth-like planets orbiting these M dwarfs in the years ahead.

Among those candidates are super-Earths, which are planets greater in mass than ours but smaller than gas giants like Neptune and Uranus. The habitable zone is the region of space surrounding a star where liquid water could be found on the surface of a rocky planet, and worlds located in this area (also known as the Goldilocks zone) are capable of supporting life.
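A rough rule of thumb (a standard scaling, not a figure from the study) says the habitable zone’s distance from a star grows as the square root of the star’s luminosity:

\[ d_{\mathrm{HZ}} \approx \sqrt{L_\star / L_\odot}\ \mathrm{AU} \]

For an M dwarf shining at one percent of the sun’s luminosity, that puts the habitable zone near 0.1 AU, roughly a quarter of Mercury’s distance from the sun, which is why planets there feel their star’s tides and radiation so strongly.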

Tidal forces

The researchers explain that a tidal force is the gravitational pull a star exerts on an orbiting planet, and that it is stronger on the side of the planet facing the host star (the near side) than on the side facing away from it (the far side), because gravity weakens with distance. This tug can stretch a planet into an egg-like shape and cause it to move closer to the star.

“This is the reason we have ocean tides on Earth, as tidal forces from both the moon and the sun can tug on the oceans, creating a bulge that we experience as a high tide,” Luger noted. “Luckily, on Earth it’s really only the water in the oceans that gets distorted, and only by a few feet.”
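The “stronger on the near side” idea can be made quantitative with the standard tidal scaling (a textbook result, not a formula from the paper). For a planet of mass m and radius R_p orbiting a star of mass M at distance d, the difference between the star’s pull on the near and far sides goes as

\[ F_{\mathrm{tidal}} \sim \frac{2\,G M m R_p}{d^3} \]

Because this falls off with the cube of the distance, even a modest move inward amplifies the tidal stretching enormously.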

With close-in planets like those in the habitable zones of M dwarfs, however, the tidal forces are much stronger. This stretching can cause friction in a planet’s interior, which in turn gives off a tremendous amount of energy. This can drive surface volcanism and could even cause a planet to become so hot that it boils away its oceans, becoming completely uninhabitable.

Vigorous stellar activity (we just like saying this)

Vigorous stellar activity can also prevent a planet orbiting a low-mass star from supporting life, the researchers explained. When M dwarfs are young, they are exceptionally bright and emit high amounts of high-energy X-rays and UV radiation. This can cause a planet’s atmosphere to become heated, generating strong winds that can erode it away completely.

In fact, Luger and Barnes have demonstrated that all of a planet’s surface water can be lost to such stellar activity within the first few hundred million years after its formation. However, Luger said that things are not “as grim as they may sound,” as the team’s computer models have found that these same forces can also turn uninhabitable mini-Neptunes into habitable planets.
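For a sense of how modelers estimate this kind of atmospheric loss, a commonly used approximation known as energy-limited escape sets the mass-loss rate by the high-energy power the planet intercepts (a representative formula; the paper’s exact prescription may differ):

\[ \dot{M} \approx \frac{\epsilon\, \pi F_{\mathrm{XUV}} R_p^3}{G M_p} \]

where F_XUV is the X-ray and ultraviolet flux at the planet’s orbit, R_p and M_p are the planet’s radius and mass, and ε is an efficiency factor often taken to be of order 0.1.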

Mini-Neptunes tend to form far away from their host star, and are initially inhospitable worlds with freezing cold conditions, Luger said. But tidal forces and other processes can cause them to move inward, pulling them into their host star’s habitable zone and exposing them to far higher levels of X-rays and ultraviolet radiation in the process.

This, in turn, can lead to the rapid loss of atmospheric gases to space, which can leave behind what the authors call “habitable evaporated cores” – hydrogen-free, rocky worlds smack dab in the habitable zone. A planet like this is “likely to have abundant surface water, since its core is rich in water ice,” Luger said.

Welcome to the habitable zone

Once it enters the habitable zone, this ice “can melt and form oceans,” potentially leading to the evolution of life, he added. Before this can happen, however, several other conditions need to be met, including the development of an atmosphere that can create and reuse nutrients globally.

Also, if gas loss is too slow or too fast while a planet is forming, it could prevent this transformation from occurring. If hydrogen and helium are lost too slowly, a rocky terrestrial world may never emerge; if hydrogen is lost too quickly, runaway greenhouse conditions can develop, leaving the surface too hot for liquid water to exist.

“The bottom line is that this process – the transformation of a mini-Neptune into an Earthlike world – could be a pathway to the formation of habitable worlds around M dwarf stars,” Luger said, noting that future research would be needed to determine if they could truly support life.

“Either way, these evaporated cores are probably lurking out there in the habitable zones of these stars, and many may be discovered in the coming years,” he added.

—–

Protein-based treatment could help fight leukemia

Chuck Bednar for redOrbit.com – Your Universe Online

A new protein-based treatment could help treat acute lymphoblastic leukemia (ALL) patients whose cancer cells have developed resistance to contemporary chemotherapy, according to new research published online Monday in the Journal of Clinical Investigation.

ALL is the most common form of childhood cancer, and researchers at Children’s Hospital Los Angeles hope that the new therapy method they designed and developed will be effective against the leukemia cells, while increasing the potency of standard treatment options as well.

According to the study authors, their work demonstrated the efficacy and safety of their fusion protein in mouse models of aggressive human leukemia, which were created using cancer cells taken directly from patients who have ALL.

“Despite advances in available therapies, unmet and urgent needs remain in the fight against leukemia,” principal investigator Dr. Fatih M. Uckun of the Children’s Center for Cancer and Blood Diseases at CHLA and the Norris Comprehensive Cancer Center of the University of Southern California (USC) said in a statement.

“We still have children with disease that our drugs can’t help enough. And for patients who relapse, their chances of long-term survival are less than 20 percent. We’ve got to do better,” he added.

Their research focused on a protein known as TNF-related apoptosis-inducing ligand (TRAIL), which functions as a ligand in order to induce cell death or apoptosis. TRAIL is produced by a person’s immune system cells, and has the potential to cause apoptosis in tumor cells by binding to a pair of so-called “death receptors” – TRAIL-receptor 1 and TRAIL-receptor 2.

“TRAIL is a naturally occurring part of the body’s immune system that kills cancer cells without toxicity to normal cells,” Dr. Uckun explained. “However, earlier clinical trials using TRAIL as a potential anti-cancer medicine candidate have not been successful, largely because of its propensity to bind, not only to cancer cells, but also to ‘decoy’ receptors.”

However, he and his colleagues have discovered a previously unknown protein that serves as a natural ligand of human CD19, which is expressed by nearly all ALL cells. They hypothesized that this protein, which is known as CD19-Ligand, could be fused to the part of TRAIL known to kill cancer cells (sTRAIL) to create a powerful weapon against leukemia.

Unlike chemotherapy drugs, which destroy all types of cells in the body, this bioengineered substance would seek out only leukemia cells carrying CD19. It would then bind to them, using CD19 as a “docking site,” and destroy them, leaving healthy cells unharmed in the process.

During their experiments, Dr. Uckun’s team was able to assemble the two proteins into one fusion protein, which they dubbed CD19L-sTRAIL. They also demonstrated that their approach converted sTRAIL into a far more potent “membrane-anchored” form capable of triggering apoptosis in even the most aggressive, therapy-resistant types of human leukemia cells.

“Due to its ability to anchor to the surface of cancer cells via CD19, CD19L-sTRAIL was 100,000-fold more potent than sTRAIL, and consistently killed more than 99 percent of aggressive leukemia cells taken directly from children with ALL,” Dr. Uckun said.

It proved effective both in test tubes and in mice, he added. As few as two doses of CD19L-sTRAIL significantly improved the survival of mice injected with a normally fatal dose of ALL cells, and did so without side effects. Furthermore, it was found to be more potent than standard chemotherapy combinations and radiation therapy.

“The biggest challenge is to cure patients who experience a recurrence of their cancer, despite intensive chemotherapy,” Dr. Uckun noted. “We are hopeful that the knowledge gained from this study will open a new range of effective treatment opportunities for children with recurrent leukemia.”

—–

New protein ‘detonates’ antibiotic-resistant bacteria

Chuck Bednar for redOrbit.com – Your Universe Online

Infections caused by MRSA and other types of antibiotic-resistant bacteria remain one of the primary health threats in the world today, particularly in developing countries marked by the overuse of antibiotics and poor sanitation.

But researchers from Tel Aviv University have developed a potential new way to combat these pathogens: by sequencing the DNA of bacteria that had grown resistant to viral toxins, they identified novel proteins that can slow the growth of these antibiotic-resistant superbugs.

The study, which was published last month in the journal Proceedings of the National Academy of Sciences, was led by Professor Udi Qimron from the Department of Clinical Microbiology and Immunology at the Sackler School of Medicine and also involved scientists from the Department of Cell Research and Immunology at TAU’s George S. Wise Faculty of Life Sciences.

“Because bacteria and bacterial viruses have co-evolved over billions of years, we suspected the viruses might contain precisely the weapons necessary to fight the bacteria. So we systematically screened for such proteins in the bacterial viruses for over two and a half years,” Qimron said.

He and his colleagues used a process known as high-throughput DNA sequencing to detect mutations in the genes of bacteria that had grown resistant to toxic growth inhibitors produced by bacterial viruses. This allowed them to identify a new small protein that specifically targets and inhibits a protein essential to bacterial cells.

That substance, growth inhibitor gene product (Gp) 0.6, cripples a protein that helps maintain the bacterial cell structure; causing it to malfunction makes the cell essentially self-destruct.
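The logic of the team’s resistance screen can be sketched in a few lines of code. In the toy example below, every gene name and sequence is invented for illustration (the real analysis aligns millions of sequencing reads against whole genomes), but the principle is the same: a gene that turns up mutated in nearly every independently evolved resistant isolate is the prime candidate for the viral protein’s target.

    # Toy illustration of resistance mapping. All gene names and sequences
    # are hypothetical; real pipelines align high-throughput reads to a genome.
    REFERENCE = {
        "geneA": "ATGGCTACTGAA",
        "geneB": "ATGTTTGGCAAA",
    }

    # The same genes, as sequenced from independent toxin-resistant mutants.
    RESISTANT_ISOLATES = [
        {"geneA": "ATGGCTACTGAA", "geneB": "ATGTTAGGCAAA"},
        {"geneA": "ATGGCTACTGAA", "geneB": "ATGTTTGGAAAA"},
        {"geneA": "ATGGATACTGAA", "geneB": "ATGTTCGGCAAA"},
    ]

    def is_mutated(ref, seq):
        """True if an isolate's gene sequence differs from the reference."""
        return any(r != s for r, s in zip(ref, seq))

    # Count how many independent isolates carry a mutation in each gene.
    hits = {gene: 0 for gene in REFERENCE}
    for isolate in RESISTANT_ISOLATES:
        for gene, ref in REFERENCE.items():
            if is_mutated(ref, isolate[gene]):
                hits[gene] += 1

    # The gene mutated in the most isolates likely encodes the target.
    for gene, count in sorted(hits.items(), key=lambda kv: -kv[1]):
        print(f"{gene}: mutated in {count}/{len(RESISTANT_ISOLATES)} isolates")

Run on this toy data, the script flags geneB (mutated in all three isolates) over geneA, mirroring the way repeated, independent mutations in resistant bacteria can reveal which protein a growth inhibitor attacks.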

“The new technology and our new interdisciplinary collaboration, drawing from bioinformatics and molecular biology, promoted our study more than we could have anticipated,” said Qimron. “We hope our approach will be used to further identify new growth inhibitors and their targets across bacterial species and in higher organisms.”

He and his colleagues plan to continue analyzing bacterial viruses in the hope of identifying other, as-yet-uncharacterized viral proteins and processes that could improve the treatment of antibiotic-resistant infections. They believe that such research will ultimately lead to a breakthrough in the fight against these superbugs.

Researchers have long pursued ways to combat the potentially dangerous illnesses caused by antibiotic-resistant bacteria, ranging from public health officials’ calls to prescribe antibiotics less frequently to treatments designed to target persisters, dormant cells produced by pathogens that make them more difficult to kill off.

Last summer, researchers from Arizona State University used an innovative method that combined biomedicine and geochemistry to discover natural clays with antibacterial properties, which could be harnessed to combat strains that have developed resistance.

“Minerals have long had a role in non-traditional medicine,” said Enriqueta Barrera, a program director in the National Science Foundation (NSF) Division of Earth Sciences, which funded the research.

“Yet there is often no understanding of the reaction between the minerals and the human body or agents that cause illness,” Barrera added. “This research explains the mechanism by which clay minerals interfere with the functioning of pathogenic bacteria. The results have the potential to lead to the wide use of clays in the pharmaceutical industry.”

—–
