Toppled tree turns up medieval skeleton of violently killed man

For those who have declared, “It’s not like men grow on trees,” Ireland might’ve just proved you wrong—when a 215-year-old beech tree was ripped out of the ground during a spring storm, a man’s skeleton was found tangled in its roots.

“The upper part of the skeleton was raised into the air trapped within the root system,” archaeologist Marion Dowd of Sligo-Leitrim Archaeological Services, the firm that investigated the discovery, said on Facebook.

“The lower leg bones, however, remained intact in the ground. Effectively as the tree collapsed, it snapped the skeleton in two.”


Analysis indicates that the skeleton is early medieval—carbon dating places him between 1030 and 1200 C.E. The man was between 17 and 20 years old, and most likely came from a local Gaelic family in what is now County Sligo. He seems to have been some sort of physical laborer (judging by mild spinal joint disease found in the skeleton), and stood 5’10” tall—significantly taller than the 5’5” average for medieval men.

The young man also seems to have died violently, as stab wounds probably made by a knife were found on his ribs and left hand. It appears he was fleeing his attacker when he was killed.

“Whether he died in battle or was killed during a personal dispute, we will never know for sure,” said Dowd.

However he died, though, it appears he was given a formal Christian burial. “He was placed in a grave in an east-west position, hands folded over pelvix [sic] region,” explained Dowd. “So his family or community extended a formal Christian burial to him.”

Nineteenth-century records indicate that a church and graveyard existed in the area, but no evidence of either has yet been found. “So we don’t know was he buried in a graveyard or as an isolated burial,” said Dowd.

However, the find is still exciting. “This burial gives us an insight into the life and tragic death of a young man in medieval Sligo.”

—–

Feature Image: Sligo-Leitrim Archaeological Services

Story Image: Sligo-Leitrim Archaeological Services/Thorsten Kahlert

 

 

Why are middle-class people more artistic?

 

Education level, not wealth or social status, is the reason why middle-class people are more likely to play music, paint, or take up acting as either an amateur or professional, researchers from Oxford University reported in the latest edition of the journal Sociology.

As part of the study, Dr. Aaron Reeves, a sociologist at the British university, surveyed more than 78,000 people and found that 18 percent of those individuals had participated in painting or photography, 10 percent were involved in music, 9 percent in dance, and 2 percent in drama or opera. Six percent had written poetry, plays, or fiction, and 22 percent were involved in no artistic activities.

He also found that having a higher income did not make participation any more likely, as those earning more than £30,000 ($46,000) a year were less likely to be involved in artistic endeavors than people who earned less. Similarly, social status was unimportant, as high-level professionals were less likely to be artists, writers, or musicians than those in lower white-collar positions.

The strongest predictor of artistic activity, however, was educational background. A person with a degree was about four times more likely to participate in painting or photography than someone who had not graduated from college, five times more likely to be involved in dance and crafts, and four times more likely to play a musical instrument.

Wealth, social status not reliable indicators of arts participation

The research, which used statistical analysis to account for the influence of a family’s class background, also found that those most likely to be regularly involved in the arts tended to be from the middle class, largely because they were more likely to be highly educated, Dr. Reeves said.

However, as he explained in a statement, the findings indicate that even though a middle-class background makes it more likely that a person will have attended university, such graduates are still no more likely to take part in the arts than graduates from working-class backgrounds.

“Arts participation, unlike arts consumption and cultural engagement generally, is not closely associated with either social class or social status,” Dr. Reeves explained. “This result deviates from the expectation – unexpectedly, those with higher incomes are less likely to be arts participants. These results show that it is educational attainment alone, and not social status, that is shaping the probability of being an arts participant.”

He added that there are two possible reasons for this link: “First… university graduates are more likely to possess the cultural resources necessary for both arts consumption and arts participation. Second, universities make admissions decisions using information on extracurricular and cultural activities, increasing the likelihood that university graduates are culturally active.”

—–

Feature Image: Thinkstock

Dust discs of nearby red dwarfs shed light on planet formation

 

Talk about a happy accident: Astronomers have unexpectedly stumbled upon a group of young red dwarf stars located not far from our solar system, and their discovery could provide them with the rare opportunity to study slow-motion planet formation.

As reported in the latest edition of the Monthly Notices of the Royal Astronomical Society, two of the newly-discovered red dwarf stars have large discs of dust surrounding them, a feature that is indicative of planets that are still in the process of forming. Thus, by studying these stars, the astronomers may be able to get a glimpse of a new solar system as it evolves.

“Orbiting disks of dusty material from which planets form are very rare around stars older than 5-10 million years,” lead researcher Dr. Simon Murphy from the ANU Research School of Astronomy and Astrophysics explained to redOrbit via email. “Our serendipitous discovery of two such disks around what we believe are 16-million-year-old stars is therefore very surprising.”

Planets may have more time to form than we thought

The stars were discovered in a young group known as Scorpius-Centaurus, and the researchers concluded that either these stars are younger than 16 million years old (meaning that Scorpius-Centaurus has an unexpectedly large age spread) or that discs around stars with masses far lower than the Sun’s last longer than previously believed.

“Because planets are born in these disks, this implies there could be much more time available to form planets than previously thought, especially rocky planets like the Earth which form through the slow build-up of smaller bodies,” said Dr. Murphy, who was aided on the paper by co-author and University of New South Wales (UNSW) Canberra Professor Warrick Lawson.

“There would also be more time for gas giant planets formed early on to migrate within the disk, potentially disrupting the formation of smaller bodies,” he added. “Further observations of these and other nearby disks, especially at infrared and millimeter wavelengths, allow us to construct a detailed picture of disk temperature, structure, chemistry and mineralogy, as well as how these are influenced by the mass of the parent star.”

So why do these red dwarfs still have their discs, anyway?

The mystery remains: Why do these red dwarf stars still have their discs, when other, similarly-aged stars typically do not? Dr. Murphy said this was “still very much an open question.” He told redOrbit that if the mass of the parent star has an influence on the longevity of disks, then the processes through which the disk is cleared of gas and dust (thus halting the formation of gas giants) “must be less efficient around lower mass stars like red dwarfs.”

Two possible solutions to this puzzle are offered in the new study, he said. One involves grain growth, in which larger dust grains form more quickly in disks around higher mass stars, rendering them invisible to detection at infrared wavelengths. The other centers on photo-evaporation, where the intense radiation from a star evaporates its disk. Since higher mass stars are more luminous, Dr. Murphy said, they photo-evaporate their disks more rapidly.

—–

Feature Image: Artist depiction of a dusty disc surrounding a red dwarf. Credit: NASA/JPL-Caltech/T. Pyle (SSC)

You’re not being irrational, you’re simply quantum thinking

 

Next time you’re told you’re being irrational in an argument, use this new scientific comeback: “No, I’m just quantum probabilistic.”

In two new review papers, one in Current Directions in Psychological Science and the other in Trends in Cognitive Sciences, a team of researchers has proposed a mathematical model of human decision-making built on the formalism of quantum physics.

So why quantum physics? The idea is that by setting aside classical probability theory (think back to statistics) and reasoning in a way analogous to quantum physics, we can better describe how humans work through complex questions and make decisions in the face of uncertainty.

“We have accumulated so many paradoxical findings in the field of cognition, and especially in decision-making,” said Zheng Joyce Wang, one of the scientists working on the theory and director of the Communication and Psychophysiology Lab at The Ohio State University.

“Whenever something comes up that isn’t consistent with classical theories, we often label it as ‘irrational.’ But from the perspective of quantum cognition, some findings aren’t irrational anymore. They’re consistent with quantum theory—and with how people really behave.”

Classical and ‘rational’ vs. Quantum thinking

As many of those deemed “irrational” could have told them, researchers who study human behavior using the classical probability approach keep finding that human behavior isn’t always rational. Noticing this, Wang and her collaborators focused on how quantum theory, rather than classical reasoning, could open up a new understanding of human cognition and behavior.

“In the social and behavioral sciences as a whole, we use probability models a lot,” Wang stated. “For example, we ask, what is the probability that a person will act a certain way or make a certain decision? Traditionally, those models are all based on classical probability theory—which arose from the classical physics of Newtonian systems. So it’s really not so exotic for social scientists to think about quantum systems and their mathematical principles, too.”

Quantum physics routinely deals with ambiguity in the physical world; quantum cognition applies the same mathematics to the ambiguity in our minds. As humans, we often aren’t certain about how we feel, what we want, or what we should do, and so we must make decisions based on limited information.

“Our brain can’t store everything. We don’t always have clear attitudes about things. But when you ask me a question, like ‘What do you want for dinner?’ I have to think about it and come up with or construct a clear answer right there,” Wang explained. “That’s quantum cognition.”

Why quantum theory works

Think about how a person makes a decision—in most cases, he or she will go back and forth, weighing the options for each possibility as they stack up against each other until a final decision is made.

Under the classical approach to psychology, some of these choices don’t make sense, and researchers must keep adding new mathematical axioms to account for each behavioral quirk. The result? A multitude of psychological models, some conflicting with each other and none applying to every decision-making situation we encounter. We don’t answer the same way every single time, do we?

By using a quantum approach, however, Wang and her collaborators claim that researchers can explain a wide variety of complex human behaviors and decisions with the same limited set of axioms.

“I think the mathematical formalism provided by quantum theory is consistent with what we feel intuitively as psychologists,” Wang continued. “Quantum theory may not be intuitive at all when it is used to describe the behaviors of a particle, but actually is quite intuitive when it is used to describe our typically uncertain and ambiguous minds.”
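To make the contrast concrete, here is a minimal illustrative sketch (our own, not taken from the review papers) of the kind of mathematics quantum cognition borrows. In a quantum model, answering one question changes the state used to answer the next, so the order in which two questions are asked can change the resulting probabilities, something a single classical probability distribution cannot produce. The starting “belief state” and the angles below are arbitrary choices for illustration.

```python
import numpy as np

def projector(theta):
    # Rank-1 projector onto the unit vector at angle theta in a 2-D real Hilbert space.
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # hypothetical initial "belief state" (illustrative only)
P_A = projector(np.deg2rad(25))     # projector for answering "yes" to question A
P_B = projector(np.deg2rad(70))     # projector for answering "yes" to question B

# Probability of saying "yes" to A and then "yes" to B, and the reverse order.
p_A_then_B = np.linalg.norm(P_B @ P_A @ psi) ** 2
p_B_then_A = np.linalg.norm(P_A @ P_B @ psi) ** 2

print(f"yes to A, then yes to B: {p_A_then_B:.3f}")   # ~0.41
print(f"yes to B, then yes to A: {p_B_then_A:.3f}")   # ~0.06
```

Because the two projectors do not commute, the two question orders give different probabilities, the kind of “order effect” survey researchers routinely observe and that classical models must patch with extra assumptions.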

—–

Feature Image: Thinkstock

Could this combined antibiotic strategy finally defeat MRSA?

 

Scientists are constantly looking for new weapons in the battle against antibiotic-resistant bacteria, and a team of researchers from Virginia Tech has announced the discovery of a new group of antibiotics shown to be effective against methicillin-resistant Staphylococcus aureus, or MRSA.

According to the team’s report, published in Medicinal Chemistry Communications, the prospective new antibiotics differ from current drugs in that they incorporate iridium, a silvery-white transition metal. The transition-metal complexes do not degrade quickly, which is critical for transporting antibiotics to infected locations in the body.

In the study, testing revealed that the iridium compounds are not toxic to animals and animal cells, which led researchers to say they are probably safe for human use.

“So far our findings show that these compounds are safer than other compounds made from transition metals,” said Joseph Merola, a professor of chemistry at Virginia Tech. “One of the reasons for this is that the compounds in this paper that target MRSA are very specific, meaning that a specific structure-function relationship must be met in order to kill the bacteria.”

The antibiotics efficiently kill the bacteria without harming mammalian cells, and a form of the antibiotic was tested for toxicity in mice with no observed harmful effects.

“We are still at the beginning of developing and testing these antibiotics but, so far, our preliminary results show a new group of antibiotics that are effective and safe,” said Joseph Falkinham, a professor of microbiology at Virginia Tech. “Within the next few years, we hope to identify various characteristics of these antibiotics, such as their stability, their distribution and concentration in animal tissue, their penetration into white blood cells, and their metabolism in animals.”

Bacteria have not evolved to resist these

“The biggest question scientists have to ask to tackle antibiotic resistance is, how can we stay on top of the bacteria? Fortunately, these new organometallic antibiotics are coming at a time when bacteria have not evolved to resist them,” Merola added.

In another development, a team of American researchers has just announced the successful testing of a combination of three antibiotics against MRSA—drugs that are not effective individually against the resistant bacterium.

According to the team’s report, published in the journal Nature Chemical Biology, a combination of meropenem, piperacillin, and tazobactam was shown to be effective against MRSA.

“Without treatment, these MRSA-infected mice tend to live less than a day, but the three-drug combination cured the mice,” study author Gautam Dantas, an associate professor of pathology and immunology, said in a news release. “After the treatment, the mice were thriving.”

—–

Feature Image: Thinkstock

Yikes: Japan’s largest volcano unexpectedly erupts

 

Mt. Aso in the Kumamoto prefecture of Japan erupted yesterday, shooting out large rocks along with a pillar of smoke and ash that rose over 6,500 feet in the air.

According to the Japan Meteorological Agency, the eruption came with little warning—none of the usual signs were detected, like tremors or rising magma.

But despite this, the Agency raised the volcano alert level only to 3 out of 5—for comparison, Sakurajima, where a feared major eruption never materialized, was recently placed at level 4. A level 3 rating means “Do not approach the volcano,” but does not require residents to evacuate (excluding the 2.5-mile-wide area around the crater that was evacuated at the time of the eruption).

“The latest episode follows the eruption that occurred last November, but overall, it seemed like the mountain was settling down,” Tsuneomi Kagiyama, a professor of geophysics at Kyoto University and head of the school’s Aso Volcanological Laboratory, told The Asahi Shimbun. “As for now, the volcano should not be affecting residents and tourists outside the restricted areas.”

No one has been reported injured or killed, and the eruption is unlikely to have much influence on global weather or air travel, as no major ashfall is predicted. However, it is likely to have a detrimental effect on local tourism. According to The Japan Times, more than 400 people had canceled reservations at hotels and inns in Aso, and at least 20 flights have been canceled as well, according to NHK.

“We’ve received many calls from tourism agencies and tourists,” Junichi Inayoshi, head of Aso’s tourism association, told the Times. “Mount Aso is a volcano with many safety measures in place so we want to assure them that it is safe to visit here.”

—–

Feature Image: Thinkstock

Eating chocolate and drinking red wine could help prevent Alzheimer’s

 

Resveratrol, a naturally-occurring antioxidant found in wine, grapes, and chocolate, halts the decline of a key biomarker found in people with mild to moderate Alzheimer’s disease, according to research published this week by the journal Neurology.

In what they are calling the largest nationwide clinical trial ever to study high-dose resveratrol use in people with the neurodegenerative condition, Dr. R. Scott Turner, director of the Memory Disorders Program at Georgetown University Medical Center, and his colleagues found that the substance left amyloid-beta protein levels in patients’ spinal fluid higher than those of patients given a placebo.

Dr. Turner’s team conducted a randomized, phase II, placebo-controlled, and double-blind clinical trial involving patients with mild to moderate dementia due to Alzheimer’s. The study involved 119 patients and lasted one year. It found that when taken in concentrated doses, resveratrol may be able to slow down the progression of this chronic, incurable disease.

Don’t expect to see these kinds of results by drinking an extra glass of wine or snacking on a candy bar here or there, however – the patients involved in the study were given one gram of the antioxidant by mouth twice daily. In order to get such a high dose of resveratrol from red wine, a person would have to drink nearly 1,000 bottles per day, according to CNN.com.
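That 1,000-bottle figure is easy to sanity-check. Here is a rough back-of-the-envelope calculation, assuming on the order of 2 milligrams of resveratrol per bottle of red wine (a commonly cited ballpark, not a number from the study itself):

```python
# Rough sanity check of the ~1,000-bottles-per-day comparison.
dose_mg_per_day = 1000 * 2     # one gram (1,000 mg) of resveratrol taken twice daily in the trial
mg_per_bottle = 2.0            # assumed resveratrol content per bottle of red wine (rough ballpark)

bottles_per_day = dose_mg_per_day / mg_per_bottle
print(f"~{bottles_per_day:.0f} bottles of red wine per day")   # ~1000
```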

Wait… isn’t more amyloid-beta a bad thing for Alzheimer’s patients?

While it is true that accumulation of amyloid-beta in a person’s brain is one of the hallmarks of the disease, Alzheimer’s patients actually have lower levels of the protein in other parts of their bodies, CNN.com explained. The findings of the Georgetown study suggest that resveratrol may help maintain the natural balance of amyloid-beta and prevent build-up in the brain.

Dr. Turner and his co-authors reported that people who were treated with increasing doses of the substance over a 12-month period showed little to no change in amyloid-beta40 (Abeta40) levels in blood and cerebrospinal fluid. Conversely, those who received placebos experienced a decline in their Abeta40 levels in comparison to readings taken at the beginning of the experiment.

“A decrease in Abeta40 is seen as dementia worsens and Alzheimer’s disease progresses,” said Dr. Turner. “Still, we can’t conclude from this study that the effects of resveratrol treatment are beneficial. It does appear that resveratrol was able to penetrate the blood brain barrier, which is an important observation. Resveratrol was measured in both blood and cerebrospinal fluid.”

He told CNN that the findings were “encouraging enough” to warrant a larger-scale clinical trial, “because we showed that it is safe and does have significant effects on Alzheimer’s biomarkers.” For now, he said, the best way to get the antioxidant is through diet, but the effects will likely be limited. One glass of red wine per day may help those with mild Alzheimer’s, he noted.

—–

Feature Image: Thinkstock

New computer chip self-destructs Mission Impossible-style

 

Paper copies of important top-secret documents can always be run through the shredder, but what is a person supposed to do if they have to destroy classified data contained on a computer chip? Now, thanks to engineers at Xerox PARC, you can cause it to self-destruct.


While that may sound like something out of a James Bond or Mission Impossible movie, the technology is quite real, and was recently put on display at a US Defense Advanced Research Projects Agency (DARPA) event held in St. Louis, according to Engadget and Gizmodo.

So how did the PARC team pull off this incredible feat? They started by making the chips out of Corning’s Gorilla Glass, the same type of material used in many smartphones, rather than plastic and metal. However, they tempered the glass so that it holds extreme internal stress, allowing it to shatter and disintegrate on command when triggered by a laser, radio signal, or switch.

At the DARPA event, the engineers outfitted a demonstration chip with a small resistor at the bottom, which served as its self-destruct mechanism. When this resistor was heated by a laser, the entire chip instantly shattered due to the invisible stress acting upon it. Instead of just shattering once, however, it continued to crumble until all that remained was a pile of dust.

An extreme solution to high-tech security issues

“The applications we are interested in are data security and things like that,” Gregory Whiting, a senior scientist at PARC in Palo Alto, California, said to IDG News Service. “We really wanted to come up with a system that was very rapid and compatible with commercial electronics.”

“We take the glass and we ion-exchange temper it to build in stress,” he added. “What you get is glass that, because it’s heavily stressed, breaks and fragments into tiny little pieces.” By fabricating a chip containing an encryption key or other valuable information on glass, PARC explained that it could ensure the chip’s destruction, perhaps automatically, if it fell into the wrong hands.

Gizmodo called self-destructing chips “an extreme solution” to the issue of high-tech security, but admitted that it was one that “undoubtedly works.” While the technology will likely be used first by the military or government agencies, it could eventually find its way to the consumer market, enabling people to blow up lost or stolen smartphones when needed.

—–

Feature Image: Xerox PARC

 

Link between homophobia, mental illness discovered

 

Individuals with homophobic views are more likely to have anger issues, display psychotic behavior, and possess other undesirable psychological traits, according to new research published in a recent edition of the Journal of Sexual Medicine.

According to Live Science and The Telegraph, a team of researchers led by University of Rome professor Emmanuele Jannini recruited 551 Italian university students ranging in age from 18 to 30, and had them complete questionnaires that asked them, among other things, about their levels of homophobia, depression, anxiety, and psychoticism.

To measure their homophobia, the study authors presented their subjects with 25 statements like “Gay people make me nervous” and “I think homosexual people should not work with children.” The students were asked to rate on a scale of one to five how strongly they agreed or disagreed with each statement, and the anonymous results were then compiled by the researchers.

Students were also asked about their attachment style, which categorized whether they tended to have healthy relationships, experience intimacy issues, become too clingy, or seek closeness while not feeling comfortable trusting others. In addition, they were asked about the coping strategies and defense mechanisms they used when facing unpleasant or frightening situations.

Insecure, angry people more likely to have anti-gay views

Jannini’s team found that people with homophobic attitudes also tend to have poorly developed coping mechanisms, higher levels of psychoticism (a personality trait marked by hostility, anger, and aggression toward others), and other deep-rooted psychological issues – and that many of them are dealing with their own gender issues.

Overall, the better a person’s mental health was judged to be based on their questionnaire responses, the less likely he or she was to have homophobic tendencies. People who are uncomfortable in close relationships with others (a pattern known as fearful-avoidant attachment) were significantly more homophobic than those secure in close interpersonal relationships, as were those with higher levels of immature defense mechanisms.

High levels of hostility and anger (psychoticism) were also associated with homophobia, Live Science noted, but the opposite turned out to be true for depression, repression, and hypochondria, each of which was linked with lower levels of homophobia. Jannini’s team told The Telegraph that anti-gay views may be linked to a limited capacity to empathize.

“To our knowledge, this is the first study assessing both the psychologic and psychopathologic characteristics that could have a predictive role in homophobia development,” the authors wrote. “In fact, we found that psychoticism represented an important risk factor for homophobia, demonstrating that pathologic personality traits are involved in homophobic attitudes.”

—–

Feature Image: Thinkstock

Today marks the halfway point of the Year In Space mission

Halfway into his 12-month stint on the International Space Station (ISS), NASA astronaut Scott Kelly said that he missed fresh air, the outdoors, and his friends and family, but that he was doing well and looked forward to tackling the remainder of the one-year mission with “enthusiasm”.

According to AFP reports, Kelly’s comments were aired on NASA television and came during a National Press Club event held Monday to commemorate the halfway point of the Year In Space mission. Russian cosmonaut Mikhail Kornienko, who like Kelly has agreed to stay on the station for twice as long as normal for research purposes, did not participate in the interview.

“I feel pretty good overall,” Kelly told reporters. “What I am looking most forward to is just getting to the end of it with as much energy and enthusiasm as I had at the beginning,” he added, noting that he missed “being with people you care about” and “going outside.”

“This is a very closed environment, you can never leave. The lighting is pretty much the same, the smell… everything is the same,” the astronaut continued. “My ability to move around is really improved over time… Your clarity of thought is greater, your ability to focus. I found that the adaptation has not stopped, and it will be interesting to see six months from now.”

Kelly believes long-term space travel ‘won’t be an issue’

Back in March, Kelly and Kornienko agreed to spend twice the length of a normal mission on the ISS, as NASA and the Russian Federal Space Agency (Roscosmos) study how extended voyages in space impact the human mind and body. The information will be invaluable as NASA looks to send a manned mission to Mars sometime in the 2030s.

Scott Kelly’s identical twin brother Mark, who attended Monday’s event, has been undergoing analysis on Earth over the past year as well. One of the main objectives of the mission is to track how the physiologies of the two men differ after one spends a year in the weightless environment of the space station and the other remains on the ground.

“The one-year mission enables scientists to study much longer-term effects of weightlessness on the human body, yielding information that will be critical if NASA pursues President Obama’s plan to send a mission to Mars by the 2030s,” Spaceflight Insider explained. “This information will also provide medical research to help patients recovering from long periods of bed rest, and also improve monitoring for people with compromised immune systems.”

The extended ISS mission will also reveal how radiation in space may have affected Scott, and reveal more about the bone loss and vision problems previously associated with living in space, the AFP added. Kelly told reporters that he hoped the research would show that humans are able to adapt to extended periods of space travel, telling them, “I think over the long term it won’t be an issue. We as a species, throughout evolution, we have shown we are very adaptable.”

Six months down, six months to go: This one’s for you Scott Kelly.

—–

Feature Image: NASA astronaut Scott Kelly, center, takes medical measurements as part of the Fluid Shifts investigations along with Russian cosmonauts Mikhail Kornienko, left, and Gennady Padalka. Fluid Shifts measures how much fluid shifts from the lower body to the upper body, and determines the impact these shifts have on fluid pressure in the head, changes in vision and eye structures. (Credit: NASA)

Elephants born to stressed moms age faster and produce fewer offspring

Stress doesn’t just take a toll on humans—it affects elephants, too, according to a new study in Scientific Reports.

The researchers discovered that Asian elephants born during times of high maternal stress produce significantly fewer offspring over their lifetimes than elephants born at other times, and decline much more rapidly in old age.

This is the first study to investigate these effects of early-life stress in a non-human species with a lifespan similar to ours.

“Poor early life conditions have been linked to many disease outcomes in humans, but it is unknown whether stress in early life also speeds up ageing rates in long-lived species,” Dr. Hannah Mumby, lead author, explained in a press release. “We found that the decline in reproduction with age is much steeper in the elephants born at the poorer time of year. Even though they reproduce slightly more when they’re young, this still doesn’t compensate for the steep decline and they end up with fewer offspring.”

The team, based at the University of Sheffield, made the discovery while examining records of elephants from Myanmar. The records spanned three generations and documented the lives and deaths of more than 10,000 elephants that worked for the timber industry.

Then, to determine the most stressful time of year, the researchers measured stress hormone metabolites in the fecal matter of 37 female elephants every month for 12 months. They found that stress levels spiked by 46% during three particular months: June, July, and August.

Why is this time period so stressful?

Those three months span the monsoon season, when elephants work extra hard to drag logs to rivers. The number of calves born during this time is much lower, and those born are least likely to survive. Further, we now know they don’t live as long or reproduce as much as calves born in other seasons.

These results add further weight to the notion that maternal stress (in any species) is likely to affect the offspring, and they may help conservationists increase elephant populations. Improving developmental conditions could encourage longer-lived and more reproductive elephants, helping to ensure the survival of one of the world’s best-loved animals.

—–

Feature Image: University of Sheffield

Can migraines be detected through your blood?

 

Migraines are normally diagnosed based on a person’s symptoms, but the process might become more cut and dried in the future—because according to a study published in Neurology, migraines can be detected through your blood.

In the study, 52 women with episodic migraines and 36 without these headaches went through several exams and tests, including a blood draw. (Episodic migraines involve fewer than 15 headaches per month; these women averaged 5.6.) The blood samples were then tested for various fats that help regulate inflammation in the brain and maintain energy levels.

Of these lipids (fats), two stood out from the pack. First, a class known as ceramides was drastically decreased in the women with migraines—they showed average levels of 6,000 nanograms of total ceramides per milliliter of blood, while the women without headaches had around 10,500 nanograms per milliliter. For every standard-deviation increase in total ceramide levels, the odds of having migraines dropped by 92%.

The other kind of fat, sphingomyelin, showed the opposite relationship: for every standard-deviation increase, the odds of migraine rose by 2.5 times. In other words, the typical migraine sufferer showed increased sphingomyelin and decreased ceramides.
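As a rough illustration of how those per-standard-deviation figures translate into odds (this is not the authors’ statistical model, just the arithmetic implied by treating the reported associations as odds ratios applied to a standardized lipid level; the mean and standard deviation below are made-up placeholders):

```python
# Illustrative arithmetic only: apply a reported per-standard-deviation odds ratio
# to a standardized lipid measurement.
def relative_odds(value, mean, sd, odds_ratio_per_sd):
    z = (value - mean) / sd            # how many SDs the measurement sits above/below the mean
    return odds_ratio_per_sd ** z      # multiplicative change in migraine odds

ceramide_or = 0.08        # odds drop ~92% per SD increase in total ceramides
sphingomyelin_or = 2.5    # odds rise ~2.5x per SD increase in sphingomyelin

# Hypothetical reading one SD below a placeholder mean of 10,500 ng/mL (SD assumed to be 2,500).
print(relative_odds(value=8000, mean=10500, sd=2500, odds_ratio_per_sd=ceramide_or))
# -> ~12.5, i.e. roughly 12x higher migraine odds for that lower ceramide level
```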

The researchers then tested the relationship in reverse: They examined 14 blood samples from the two groups of women without knowing which women had migraines, and predicted who did based on blood fats alone. Using just this information, they were able to correctly identify which group every one of the blood samples belonged to.

More research needed, but “head”ed in the right direction

However, like most studies, there were some limitations—no men were studied, and chronic migraine sufferers were not included. Further, there was a high incidence of migraine with aura—which only occurs in 36% of migraine sufferers.

Nonetheless, the results are still pretty significant.

“While more research is needed to confirm these initial findings, the possibility of discovering a new biomarker for migraine is exciting,” said study author B. Lee Peterlin, DO, of the Johns Hopkins University School of Medicine in Baltimore, in a press release.

Karl Ekbom, MD, PhD, with the Karolinska Institutet in Stockholm, Sweden, who wrote an accompanying comment article, agreed with Peterlin. “This study is a very important contribution to our understanding of the underpinnings of migraine and may have wide-ranging effects in diagnosing and treating migraine if the results are replicated in further studies,” he said.

—–

Feature Image: Thinkstock

Massive aquifer discovered beneath Chinese desert

 

Even though it is widely recognized as one of the driest places on Earth, the Tarim basin in China’s northwestern Xinjiang region is home to a massive subterranean ocean that could have a significant impact on our understanding of climate change.

According to Discovery News, this paradoxical part of the world is surrounded by mountains that block the passage of moist ocean air, causing it to receive less than four inches of rain each year. However, the Tarim basin is also home to a buried aquifer that contains 10 times the water found in all five of the Great Lakes combined. That’s a lot of water.

While this underground ocean is too salty for the residents of the arid region to use, the authors of a new study published in the journal Geophysical Research Letters claim that it is acting as a large carbon sink, with its alkaline soil helping to dissolve CO2 in the water.

Other deserts could have their own subsurface aquifers

Lead investigator Professor Li Yan of the Chinese Academy of Sciences’ Xinjiang Institute of Ecology and Geography and his colleagues collected samples of the underground water from nearly 200 locations, comparing the carbon dioxide content in those samples with that of snowmelt from the surrounding mountains.

“This is a terrifying amount of water. Never before have people dared to imagine so much water under the sand. Our definition of desert may have to change,” Yan told the South China Morning Post, adding that the discovery was made by accident, as they were searching for carbon, not water.

His team first noticed large amounts of carbon dioxide inexplicably disappearing in Tarim about 10 years ago, and their findings suggest that other large deserts could be home to large quantities of subsurface water. If so, this could make them carbon sinks that are as important as forests and oceans when it comes to staving off the ill effects of global climate change.

The residents of Xinjiang have used meltwater for agricultural irrigation for thousands of years, and the alkaline soil helps CO2 dissolve into the water. By dating the carbon, Yan’s team “recorded a jump of ‘carbon sinking’ after the opening of the ancient Silk Road more than two thousand years ago. CCS [carbon capture and storage] is a 21st century idea, but our ancestors may have been doing it unconsciously for thousands of years.”

—–

Feature Image: NASA

Meet Australia’s largest meat-eating dinosaur: ‘Lightning Claw’

 

Fossilized remains originally found by miners in Lightning Ridge, New South Wales, have led to the discovery of the largest carnivorous dinosaur ever found in Australia – a 22-foot-long, 110 million-year-old beast that has affectionately been dubbed “Lightning Claw.”

According to Discovery News, paleontologists were able to recover pieces of the hip, ribs, an arm, and a foot of the Cretaceous Period predator, as well as a 10-inch claw that they claim was used as a “grappling hook” to capture prey. The partial skeleton is said to be the second most complete dinosaur skeleton ever unearthed in Australia, the website added.

“What is fascinating about this discovery,” Dr. Phill Bell from the University of New England said in a statement, “is [that] it changes the popular notion that Australian dinosaurs came from ancestors derived from Africa and South America. Instead, the ‘Lightning Claw’ appears to be the ancestor of all megaraptorids, meaning this group appeared first in Australia.”

The new dinosaur, which does not yet have an official name, is believed to have been slightly larger than the 16-foot-long Australovenator, a megaraptorid that had previously been Australia’s largest known meat-eater. Lightning Claw is also believed to be the oldest member of its group.

Lightning Claw was ‘stunning’ and ‘unique’

The fossilized bones of Lightning Claw were first found by opal miners back in 2005, according to the Huffington Post, and a paper detailing the newly-identified species was published online in the journal Gondwana Research earlier this month. Researchers from the University of Bologna in Italy and the Australian Opal Centre were also credited as authors on the study.

Co-author Dr. Federico Fanti, paleontologist at the University of Bologna, said it was “crystal clear” that they had discovered a new species, as it was “very different” from fossils previously collected in the area. The fossils, he said, were “made of opal… stunning and unique.”

The Daily Mail reports that Lightning Claw is only the second dinosaur discovered in Australia to be identified from more than a single bone. However, recently discovered tracks indicate that even larger dinosaurs may have called the region home. The creature will not be officially described as a new species until additional fossils are found.

—–

Feature Image: Julius T. Csotonyi/University of New England

This antibody attacks HIV in a new way

 

Doctors have been using a range of antibodies to attack HIV for years, and a new report has revealed yet another weapon in the battle against AIDS: an antibody called 8ANC195.

According to the report, published in the journal Cell, 8ANC195 is a broadly neutralizing antibody (bNAb) capable of latching on to a signature HIV protein and effectively neutralizing the virus.

The signature protein on the surface of the virus, called the envelope spike, can take on different conformations during infection, and 8ANC195 is capable of recognizing these different conformations, making the antibody uniquely effective.

HIV infection begins when the virus encounters human immune cells known as T cells, which carry a protein called CD4 on their surface. Envelope spikes on the exterior of the virus recognize and bind to CD4, and in doing so shift from a “closed” conformation to an “open” one. The open conformation then allows the virus to inject its genetic material into the host cell, turning it into a factory for new viruses that can go on to infect other cells.

A different bNAb

Most identified bNAbs only recognize the spike in the closed conformation, and each bNAb recognizes just one particular target, or epitope, of this protein. Some targets permit a more potent neutralization of the virus, and, therefore, some bNAbs are more efficient against HIV than others. Previous research by the study team had identified 8ANC195 as targeting a different epitope than any other known bNAb.

“We previously were able to define the binding site of this antibody on a subunit of the HIV envelope spike, so in this study we solved the three-dimensional structure of this antibody in complex with the entire spike, and showed in detail exactly how the antibody recognizes the virus,” study author Louise Scharf, a postdoctoral researcher at Caltech, said in a news release.

“Our collaborators at Rockefeller have studied this extensively in animal models, showing that if you administer a combination of these antibodies, you greatly reduce how much of the virus can escape and infect the host,” Scharf says. “So 8ANC195 is one more antibody that we can use therapeutically; it targets a different epitope than other potent antibodies, and it has the advantage of being able to recognize these multiple conformations.”

The study team found that 8ANC195 can also recognize the envelope spike in both the closed conformation and a partially open conformation. They noted that the bNAb could potentially be added to the usual “cocktail” of antibodies given to individuals with HIV.

“In addition to supporting the use of 8ANC195 for therapeutic applications, our structural studies of 8ANC195 have revealed an unanticipated new conformation of the HIV envelope spike that is relevant to understanding the mechanism by which HIV enters host cells and bNAbs inhibit this process,” noted study leader Pamela Bjorkman, professor of biology at Caltech.

—–

Feature Image: Louise Scharf/Caltech

Court strikes down EPA-approved insecticide citing rapid bee decline

 

The US Environmental Protection Agency’s (EPA) decision to approve the use of the insecticide sulfoxaflor has been overturned by a San Francisco-based federal appeals court, which ruled that the substance could be toxic to an already struggling bee population.

According to the Los Angeles Times and the San Francisco Examiner, a three-judge panel of the 9th Circuit Court of Appeals ruled that the EPA’s decision to approve sulfoxaflor two years ago “was based on flawed and limited data,” was “not supported by substantial evidence,” and could hasten an already “alarming” decline in the number of honey bees in the US.

Following their ruling, sulfoxaflor, an insect neurotoxin produced by Indianapolis-based Dow AgroSciences and typically used on citrus, cotton, fruit, and nut trees, cannot be used until EPA officials conduct additional testing to determine whether or not it is safe to use. Initial research had reportedly already demonstrated that the insecticide was toxic to honey bees.

“Bees are essential to pollinate important crops and in recent years have been dying at alarming rates,” Judge Mary M. Schroeder wrote for the three-judge panel. She and her colleagues added that “given the precariousness of bee populations, leaving the EPA’s registration of sulfoxaflor in place risks more potential environmental harm than vacating it.”

EPA had ‘no real idea’ how harmful sulfoxaflor was to bees

Environmental law firm Earthjustice, representing a coalition of trade groups and beekeepers which included the American Honey Producers Association and American Beekeeping Federation, challenged the use of sulfoxaflor (sold under the brand names Closer and Transform) in December 2013.

In its decision, the court ruled that the EPA recognized the threat the insecticide posed to bees but decided that those risks could be mitigated by rules limiting applications – a decision the judges determined had been made without “any meaningful study.” Without “sufficient data,” they said, the EPA “has no real idea” how harmful the substance could be to honey bees.

“Our country is facing widespread bee colony collapse, and scientists are pointing to pesticides like sulfoxaflor as the cause,” Greg Loarie, lead counsel for Earthjustice, said in a statement after the ruling. He called the decision “incredible news for bees, beekeepers, and all of us who enjoy the healthy fruits, nuts, and vegetables that rely on bees for pollination.”

In a statement of its own, Dow AgroSciences, which was allowed to join the case on the side of the EPA, said that it “respectfully disagrees” with the 9th Circuit’s ruling regarding sulfoxaflor. It also said that it intended to “work with EPA to implement the order and to promptly complete additional regulatory work to support the registration of the products.”

—–

Feature Image: Thinkstock

How to keep a robot from taking your job

You may be willing to welcome our new robot overlords, but if you want to make sure that they don’t take your job, find employment in a field that requires creativity, dexterity, or social interaction, AI experts from the Massachusetts Institute of Technology advise.

In an article published Sunday by BBC News, Erik Brynjolfsson, director of the MIT Center for Digital Business, and Andrew McAfee, a principal research scientist at the Institute, explain that advances in robotics and other forms of digital technology have caused uneasiness among workers and a pervasive sense of dread that these tools will eliminate existing jobs.

Brynjolfsson and McAfee, the authors of the book The Second Machine Age, explained that while machines “are getting very good at a whole bunch of jobs and tasks, there are still many categories in which humans perform better. And, perhaps more importantly, robots and other forms of automation can aid in the creation of new and better jobs for humans.”

“As a result, while we do expect that some jobs will disappear, other jobs will be created and some existing jobs will become more valuable. For example, machines are currently dominating the jobs in routine information processing,” they added. “On the other hand, jobs such as data scientist didn’t used to exist, but because computers have made enormous data sets analyzable, we now have new jobs for people to interpret these huge pools of information.”

Choose your career field wisely, experts advise

So essentially, even though robots will likely take over some jobs, other tasks will open up for workers, creating new jobs and industries, Brynjolfsson and McAfee said. If you want to be extra certain that your career isn’t on the chopping block, the duo pointed out three areas where humans have an advantage over machines.

The first is creative endeavors such as writing, entrepreneurship, and scientific discovery, all of which can be well-paying and rewarding careers, according to Brynjolfsson and McAfee. In fact, not only is entrepreneurship a safe career choice, but they said that there is “no better time” to be an entrepreneur than now, “because you can use technology to leverage your invention.”

The second optimal career field is one requiring emotional intelligence, a uniquely human trait that machines simply cannot replicate. People who are motivated and sensitive to other people’s needs can excel as teachers, managers, leaders, negotiators, caretakers, nurses, and coaches, they explained. Recent studies, they added, show that social skills “are increasingly in demand.”

The third type of work where people outperform robots is that requiring a high degree of physical dexterity and mobility, Brynjolfsson and McAfee said. Robots tend to be “clumsy and slow,” lacking the agility of flesh-and-blood men and women. Jobs that rely on this skill set, such as gardening and housekeeping, are ill-suited for machines, and while they are not necessarily high-paying positions, they should be safe from robot takeovers.

“Workers… have to be strategic and aim for the jobs least likely to be overtaken by robots or other machines,” they wrote. “They have to commit to a lifetime of practicing and updating their skills by, for example, taking extra courses online and in classrooms. Lifetime learning and continued training and retraining are key.”

—–

Feature Image: Thinkstock

New antenna would enable direct communication between Earth, Mars rovers

Curiosity’s cell phone signal is about to get a whole lot stronger, as a team of UCLA researchers is working on an antenna that would enable Mars rovers to communicate directly with scientists here on Earth, without needing relay satellites to act as an intermediary.

According to Space.com, electrical engineering professor Yahya Rahmat-Samii and his fellow designers have come up with an antenna that would make it possible for robotic rovers to send transmissions directly to NASA personnel on the ground, while also dramatically increasing the window during which data could be relayed to Earth from the Red Planet.

Currently, Curiosity and the other Mars rovers rely on indirect communications, sending their data to the Mars Reconnaissance Orbiter’s satellite antenna. The MRO then relays the information to Earth at high transmission rates in the X-band, near 8 GHz, with a radio wavelength of about 1.5 inches, the UCLA team explained.

As a result, the rovers can only send information during two 15-minute periods per day because of orbit conditions, they added. By using a new design which combines several smaller antennas into a single, larger one, however, they could drastically increase communication time and make it possible to establish a direct link between the rover and Earth scientists.

Antenna could be ready in time for the Mars 2020 rover

Rahmat-Samii’s team compares their concept to the way that different organs work together in a person’s body. While each organ functions on its own, when it is combined with other organs, it can maintain an entire human being. In a similar way, they look to combine small antennas (also known as antenna elements) into a larger one capable of sending stronger signals.

By piecing together enough of these antenna elements (256 in this case), the UCLA researchers hope to create an array that can not only improve a rover’s ability to send and receive data, but which is compact enough to fit into a limited area onboard the actual rover. They hope to have a lightweight, functional array ready in time for the launch of NASA’s Mars 2020 rover.
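The payoff of combining elements scales in a well-understood way: an ideal, perfectly phased array of N identical elements can deliver up to N times the gain of a single element, or about 10·log10(N) decibels more. Here is a minimal sketch of that arithmetic for a 256-element array (an idealized estimate, not a figure from the UCLA team):

```python
import math

# Ideal (lossless, perfectly phased) array gain relative to a single antenna element.
def array_gain_db(num_elements):
    return 10 * math.log10(num_elements)

print(f"256-element array: about +{array_gain_db(256):.1f} dB over a single element")  # ~24.1 dB
```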

Rahmat-Samii told Space.com that, based on the relative orbital positions of Earth and Mars, the communication window could be up to “several hours” long, and that it could solve a “potential need for augmented direct-to-Earth (DTE) X-band radio communications,” not just for the Mars 2020 rover but for similar NASA missions in the future.

—–

Feature Image: NASA/JPL-Caltech

Elon Musk: Nuke Mars to warm the planet

Attempts to create a human colony on Mars are complicated by the fact that the planet is largely uninhabitable, but thankfully SpaceX CEO Elon Musk has come up with a creative way to solve that minor issue – nuke the Red Planet.

According to CNET and USA Today, Musk was appearing on “The Late Show with Stephen Colbert” Wednesday night when he called Mars “a fixer-upper of a planet” and suggested that the fastest way to make it a better place to live would be to “drop thermonuclear weapons over the poles,” thus making the planet warmer in the cosmic blink of an eye.

That prompted Colbert to suggest that Musk was secretly a supervillain, drawing comparisons to Superman’s arch-nemesis Lex Luthor. Of course, Musk also pointed out that there was a slower, safer, and less supervillain-y way to accomplish the mission as well – releasing greenhouse gases into the atmosphere, which would have an effect similar to global warming on Earth.

Could nuke-based terraforming actually work?

The Los Angeles Times asked University of Colorado atmospheric and ocean sciences professor Brian Toon about the possibility of using nukes to warm Mars, and he responded that while it “seems possible to make it Earthlike,” blowing up bombs was not really a good idea.

Likewise, Penn State University Earth System Science Center director Michael Mann said to US News that using nuclear weapons in this way could cause a whole new set of problems, including the fact that the explosions may “generate so much dust and particles that they block out a significant portion of the incoming sunlight, cooling down the planet.”

“To begin with, there are the attendant dangers of bolting nuclear devices to rockets and hoping they don’t crash and burn here on Earth,” Dr. Seth Shostak from the SETI Institute told the Huffington Post via email. “There’s also the damage you might do to still-unknown life on the Red Planet.  And, of course, there would likely be political fallout of the non-radioactive kind.”

Dr. Shostak added that the bombs could fail to generate the desired effect, and that Mars could just “revert to its former, inhospitable self,” and Arizona State University theoretical physicist Dr. Lawrence Krauss dismissed the entire concept of quick-and-easy terraforming on Mars as a “wildly speculative” idea and called it “silly to expect Mars to become easily habitable.”

—–

Image credit: Thinkstock

Scientists revise Io model because ‘volcanoes were in the wrong places’

Jupiter’s moon Io is the most volcanically active body in the solar system, home to hundreds of volcanoes capable of shooting lava up to 250 miles (400 kilometers) into the air – but why does it seem like those volcanoes are in all the “wrong” places?

Thanks to new analysis of tidal flow in a subsurface ocean of magma, NASA scientists may have finally discovered the answer: oceans beneath the crusts of this tidally-stressed moon could be more common and longer-lasting than previously expected. A paper discussing their findings was published earlier this summer in the Astrophysical Journal Supplement Series.

“This is the first time the amount and distribution of heat produced by fluid tides in a subterranean magma ocean on Io has been studied in detail,” lead author Robert Tyler of the University of Maryland, College Park and NASA’s Goddard Space Flight Center in Greenbelt, Maryland, said in a statement. “We found that the pattern of tidal heating predicted by our fluid-tide model is able to produce the surface heat patterns that are actually observed on Io.”

The intense geological activity found on Io, Tyler and his colleagues explained, is caused by a kind of gravitational tug-of-war between Jupiter and the neighboring satellite Europa. The faster-moving Io finishes two orbits every time Europa completes one, meaning that it feels Europa’s gravitational pull at the same point in its orbit every time.

This causes Io’s orbit to be distorted into an oval shape, which makes it flex as it travels around Jupiter and causes material within the moon to change position. This shifting causes friction and generates heat, similar to how rubbing your hands together warms them up, the agency said.

Answer lies in a mixture of fluid and solid tidal heating

Earlier theories used to explain Io’s heat generation treated the moon as a clay-like object: solid, but able to be deformed. However, when scientists plugged this explanation into computer models, they found that the majority of the volcanoes were not located where they should have been – they sat 30 to 60 degrees east of where the models predicted the most intense heat should have been produced.

The results turned out to be too consistent to be dismissed as an anomaly, so Tyler’s team needed to find an alternative to the traditional solid-body tidal heating models. They ultimately came up with an explanation that centered around the interaction between heat produced by fluid flow and heat from solid-body tides, co-author Christopher Hamilton of the University of Arizona said.

“Fluids – particularly ‘sticky’ (or viscous) fluids – can generate heat through frictional dissipation of energy as they move,” Hamilton explained in a statement. He and his colleagues believe that most of the ocean layer is a partially-molten slurry that is mixed with solid rock. As molten rock flows under gravity’s influence, it rubs against the solid rock surrounding it, generating heat.

Hamilton said that this process “can be extremely effective for certain combinations of layer thickness and viscosity which can generate resonances that enhance heat production,” and his team believes that this mixture of fluid and solid tidal heating may offer the best explanation for the Jovian moon’s volcanic activity.

Their findings may also have implications for the search for life on other planets, according to NASA. Some tidally stressed moons, such as Europa and Enceladus, have liquid water oceans beneath their icy surfaces. Those subsurface oceans may contain the ingredients required for life to exist, and the new study suggests that these oceans may be more common and longer lasting than previously believed – no matter if they’re made of magma, water, or something else.

—–

Pictured is a map of Io created using images from the Voyager 1 and Galileo missions. Image credit: NASA

 

Study shows diet soda drinkers eat more junk food

Soda addicts beware: diet beverage consumers often compensate by eating junk food, at least according to a recent study published in the Journal of the Academy of Nutrition and Dietetics.

More than 22,000 U.S. adults completed the National Health and Nutrition Examination Survey, which is conducted by the National Center for Health Statistics. These adults were asked to recall everything they consumed over the course of two nonconsecutive days, including beverages. University of Illinois kinesiology and community health professor Ruopeng An examined the data and found several interesting trends.

An first classified respondents by the beverages they consumed daily. The most common were coffee (52.8% of respondents) and sugar-sweetened beverages, like sodas and fruit drinks (42.9%). Next came tea (26.3%), alcohol (22.2%), and diet or sugar-free drinks (21.7%). About 97% of those studied consumed at least one of these beverages daily; more than 25% consumed three or more daily.

Of the 22,000 people surveyed, 90% also consumed energy-dense, nutrient-poor foods (which An termed “discretionary foods”) like fries or ice cream daily—adding an average of 482 calories from these products to their daily intake.

Diet beverage drinkers and coffee drinkers were found to eat proportionally more of these discretionary foods than anyone else, obtaining a greater portion of their daily calories from junk. Interestingly, they didn’t have the lowest number of calories overall—those who drank alcohol daily won that prize. (But eating more junk food and fewer calories certainly isn’t a recipe for good health, anyway. Just ask the “skinny obese”.)

Are we compensating?

An believes this may be a result of a sort of compensation effect. “It may be that people who consume diet beverages feel justified in eating more, so they reach for a muffin or a bag of chips,” he said. “Or perhaps, in order to feel satisfied, they feel compelled to eat more of these high-calorie foods.”

An also suggested that people select diet beverages out of guilt for previous indulgences. “It may be one – or a mix of – these mechanisms. We don’t know which way the compensation effect goes.”

He also found that obese adults who drank diet beverages consumed more calories via discretionary foods. Switching to diet drinks might not help people control their weight if they don’t monitor their diet, An warned.

“If people simply substitute diet beverages for sugar-sweetened beverages, it may not have the intended effect because they may just eat those calories rather than drink them,” he said. “We’d recommend that people carefully document their caloric intake from both beverages and discretionary foods because both of these add calories – and possibly weight – to the body.”

—–

Image credit: Thinkstock

Pokémon GO! announced: here comes real life Pokémon

Remember when Pokémon first came out in 1998? Twenty-somethings of the world, rejoice! Pokémon has finally made the move to mobile app form—which means soon you’ll be able to search for and catch Pokémon in the real world.

“The Pokémon video game series has always valued open and social experiences, such as connecting with other players to enjoy trading and battling Pokémon,” wrote The Pokémon Company in a press release. “Pokémon GO’s gameplay experience goes beyond what appears on screen, as players explore their neighborhoods, communities, and the world they live in to discover Pokémon alongside friends and other players.”

That’s right: You’ll be able to run through tall grass or surf in real life to catch Pokémon! (Or you could stick to the streets and whatnot. That works too.)

The Pokémon Company president Tsunekazu Ishihara announced that Pokémon GO will be made for Android and iOS smartphones. Developed as a collaboration between Nintendo, The Pokémon Company, and game developer Niantic, it is set for release in 2016.

Carry a real-life “pokeball”

According to The Pokémon Company, the game will be free to download, but a Bluetooth accessory called Pokémon GO Plus can be purchased—so you don’t have to constantly watch your phone—and there are optional in-app purchases, too. (Does this mean we can finally buy Master Balls?)

The Poké Ball-shaped Plus device can be worn like a bracelet or a brooch, and will alert players that something important is happening in their game via an LED light and vibrations—like if a wild Pokémon is nearby. Players will also be able to catch Pokémon and perform other simple actions by pressing a button on the device.

Like other Pokémon games, GO is intended to be for people of all ages—for children, families, and young adults who might need a little reminder of a simpler time as they face impending adulthood.

“Our challenge was to develop a great game for smart phone devices that expressed the core values of Pokémon,” said Ishihara in a press release. “Pokémon GO is the answer to that challenge.”

“Pokémon GO is a wonderful combination of Niantic’s real world gaming platform and one of the most beloved franchises in popular culture,” added John Hanke, founder and CEO of Niantic, Inc. “Our partnership with The Pokémon Company and Nintendo is an exciting step forward in real-world gaming and using technology to help players discover the world and people around them.”

“Help players discover the world” seems like a noble idea. Of course… we could just conquer it by forming a real-life Team Rocket.

—–

Image credit: YouTube/The Official Pokémon Channel

Neuroscientist schedules first ever head transplant

Back in February of this year, Italian surgeon and neuroscientist Sergio Canavero shocked the world when he announced that he could (and fully intended to) perform a head transplant on a human being. Following this announcement, Valery Spiridonov, a 30-year-old Russian man afflicted with Werdnig-Hoffmann disease, a muscle-wasting condition, volunteered to be the first human to be decapitated and have his head attached to another body.

This controversial surgery is scheduled to be performed in China in 2017.

A shaky past

“When I realized that I could participate in something really big and important, I had no doubt left in my mind and started to work in this direction,” Spiridonov, a Russian computer scientist, told CEN. “The only thing I feel is the sense of pleasant impatience, like I have been preparing for something important all my life and it is starting to happen.”

He added that while the procedure is on the books for two years from now, he’s confident that Canavero will take as much time as he needs to make sure the surgery can be performed successfully.

“According to Canavero’s calculations, if everything goes to plan, two years is the time frame needed to verify all scientific calculations and plan the procedure’s details. It isn’t a race. No doubt, the surgery will be done once the doctor and the experts are 99 percent sure of its success.”

Canavero also announced this week that he has partnered with Chinese surgeon Ren Xiaoping of Harbin Medical University to perform the procedure.

Ren has plenty of experience with head transplants—since 2013 he has performed 1,000 of them on mice, each requiring a 10-hour operation. Of course, none of the mice has lived for longer than a few minutes. He already has plans to start experimenting on primates later this year.

The first successful head transplant was performed in 1970—a monkey had its head removed and attached to the body of another. It lived for nine days after the operation, but was paralyzed from the attachment point down and required a ventilator to breathe.

Given that none of these experiments has resulted in a “hybrid” that lived for longer than a little over a week—and more often than not, only a few minutes—and that the surgeon and patient both seem determined to go through with the operation, we can only hope that Canavero and Ren can refine the procedure enough over the next couple of years to successfully transplant Spiridonov’s head onto a new body without complications.

——

Image credit: Thinkstock

American caffeine addiction dates back at least 1000 years

Starbucks and Coca-Cola might be relatively recent inventions, but America’s craving for caffeinated beverages dates back at least a millennium, according to a new study published this week in the Proceedings of the National Academy of Sciences (PNAS).

In the study, researchers from the University of New Mexico and their colleagues found that people living in the southwestern US and northwestern Mexico drank highly caffeinated beverages made out of holly and cacao between 750 and 1400 AD, CBS News reported.

Lead author and distinguished professor of anthropology Patricia Crown explained that one of the drinks was a cacao-based chocolate beverage, while the other—what Native Americans called “black drink”—was made out of holly. Both were highly caffeinated, but neither of the main ingredients grows in the Southwest, so they were likely obtained through trade routes.

Caffeinated beverages probably were only for the wealthy

There was some trade with Mexico at the time, since that was the closest area from which the population could have obtained cacao, and the presence of scarlet macaws at many sites in the Southwest also indicates that there was an established trade relationship with people living in Mesoamerica, Phys.org added.

The holly, on the other hand, might have originated either in what is now the southern US or in Mexico, Crown explained. The residue found in the ceramic bowls and pitcher sherds makes it hard to pin down details about the black drink these people consumed. She added that the caffeine was probably consumed as part of ritual or political ceremonies.

Crown and her colleagues used sherds (fragments of pottery) from jars, bowls, and pitchers unearthed at archaeological sites throughout the Southwest, CBS News reported; in all, 177 sherds were tested. Of those, 40 were found to contain caffeine. The research team noted that they were careful to test sherds that were in use during different time periods to get a complete picture of the trade practices.

Crown also said it was unlikely that everyone was able to drink these beverages. It “seems likely that elites were probably the ones who were able to acquire it because it was coming from such a distance,” she said. “They might serve it to followers to gain loyalty and alliances. But it seems unlikely that everybody could acquire it or everybody could drink it.”

“If you are going to get the ingredients for these drinks on a regular basis,” she added, “you have to have networks and connections in order to acquire them. It really tells us the people in the southwest had those connections to get them regularly enough to use them in ritual activities.”

—–

Pictured is cocoa, the base ingredient for chocolate. This was used to make these ancient caffeinated drinks. Image credit: Thinkstock

Scientists find gigantic ice slab under Martian surface

Researchers from the University of Arizona’s Lunar and Planetary Laboratory (LPL) have discovered an enormous slab of ice, roughly equal to the size of Texas and California combined, located just beneath the surface of Mars in the planet’s northern hemisphere.

According to Space.com, the ice may have been the result of snowfall that occurred on the Red Planet tens of millions of years ago, as evidence suggests that the now cold and dry world once was covered by rivers and lakes. The discovery “could help us understand if locations on Mars were once habitable,” lead author Ali Bramson told the website on Thursday.

Experts already knew about the vast amounts of ice beneath the planet’s surface at high latitudes around the poles, but they recently started discovering water ice hidden in the middle and lower latitudes. Life and liquid water go hand-in-hand on Earth, so scientists believe Mars may have harbored life when it was wet, and living organisms may be hidden in underground aquifers.

Bramson, a planetary scientist at the LPL, and her colleagues looked at an unusual crater in the Arcadia Planitia region of Mars, located in the mid-latitudes at about the same latitude as the US-Canadian border on Earth. The crater is terraced, they explained, and between 1,075 and 1,410 feet (328 to 430 meters) wide.

How did the ice manage to survive all this time?

The terraced crater is somewhat unusual for Mars, the UA team said in a statement, as most craters are bowl-shaped. Using data from the NASA Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Science Experiment (HiRISE) camera, they set out to learn why this crater had such an odd shape, creating 3D models of the surface to measure the depth of the terraces.

Bramson’s team then used the MRO’s Shallow Radar (SHARAD) instrument to fire radar pulses at Mars, measuring how long it took for those signals to penetrate the subsurface layers and rebound to the orbiter. Combining the two data sets gave them the speed of the radar waves through the layers, which in turn indicated that the layers were made of water ice.
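
To get a feel for the arithmetic behind that approach, the sketch below works through the same basic logic with invented numbers (the depth, travel time, and dielectric values here are purely illustrative, not figures from the study): a layer’s known depth and the radar’s two-way travel time give the wave speed, and the wave speed implies a dielectric constant that distinguishes ice from rock.

C = 3.0e8  # speed of light in a vacuum, m/s

def infer_layer(depth_m, two_way_time_s):
    """Wave speed and relative dielectric constant of a subsurface layer."""
    v = 2.0 * depth_m / two_way_time_s  # the pulse travels down and back up
    epsilon = (C / v) ** 2              # slower waves imply a denser dielectric
    return v, epsilon

# Hypothetical example: a 40-meter-thick layer returning an echo
# 0.47 microseconds after the surface reflection
v, eps = infer_layer(40.0, 0.47e-6)
print(f"wave speed ~{v:.3g} m/s, dielectric constant ~{eps:.1f}")
# A dielectric constant near 3 is characteristic of nearly pure water ice,
# while dense basaltic rock comes in substantially higher.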

The researchers also said they found an enormous hunk of ice located just below the regolith of the Red Planet. This icy slab is approximately 130 feet thick, Bramson and her colleagues said in research published Wednesday in the journal Geophysical Research Letters.

“Knowing where the ice is and how thick it is can tell you about Mars’ past climates,” said associate professor and study co-author Shane Byrne in a statement. “The fact that the ice is so thick and widespread leads us to think it came into place during one of Mars’ past climates when it snowed a bunch, ice accumulated, was buried, and then preserved.”

“There have been a lot of climate changes between now and the tens of millions of years ago when we suspect the ice was put there. But it shouldn’t be stable today, and other past climates of ice instability in this region mean the ice should’ve sublimated away into the dry Martian atmosphere by now. So, that’s what we need to investigate,” he added. “What kept the ice around all this time? There’s no climate model that we have now that explains this.”

—–

Image credit: American Geophysical Union/University of Arizona

Study: Alzheimer’s might be spreadable

 

A new study published in Nature may have just uncovered evidence that Alzheimer’s disease can be transmitted between humans in a way other than genetics—which, if true, could radically change how the disease is understood.

A contaminated sample

Between 1958 and 1985, around 1,800 UK children received therapy in the form of growth hormone taken from cadaver brains. What no one knew at the time, however, was that some of the material was contaminated with disease-causing prions.

Prions are malformed versions of proteins that can be transmitted through infected meat or contaminated surgical instruments. Although they contain no genetic material, they can propagate on their own and become infectious. And because they aren’t alive, they cannot be killed—which means there is no treatment or cure. Prions can have an incubation period of decades, but once they are activated they cause diseases that fill the brain with holes and eventually kill the host.

According to the authors, scientists have long wondered whether diseases involving misfolded proteins such as amyloids could be transmitted like prions—and now they may have proof.

The team examined the brains of eight of those given the growth hormone—all of whom had died from a prion disease known as Creutzfeldt-Jakob Disease (CJD). Of the eight, six had significant amounts of amyloid plaques, which are a hallmark of Alzheimer’s—even though they all died between the ages of 36 and 51.

“This is a highly unusual finding. We wouldn’t have expected to see this Alzheimer’s amyloid deposition in this age group,” author Dr. John Collinge told reporters at a media briefing assembled by Nature. “It’s normally only seen in elderly individuals, unless you have a genetic predisposition to it, and none of these patients did.”

Further examination of 116 other cases of prion-caused deaths (which were not related to the contaminated hormone) showed that there was no evidence of Alzheimer’s. This seems to suggest that the source of the amyloid was not genetic vulnerability or a consequence of Creutzfeldt-Jakob Disease, but instead came from the hormone itself.

Amyloid was also found inside the pituitary glands of Alzheimer’s patients. The pituitary gland is the part of the brain that naturally produces growth hormone, and amyloid congregates there during the course of the disease—which suggests that amyloid seeds could very well have come from the cadaver-derived growth hormone itself.

Moreover, past experiments have shown that amyloid seeds can induce Alzheimer’s in mice and primates—so it could very well be possible for humans as well.

The big “but”

The study had several limitations, especially the size of the group studied—eight people is hardly conclusive. Further, the six patients who showed amyloid plaques might never have developed Alzheimer’s—they died before the disease could have developed, and they lacked its other hallmark, tau tangles.

Most importantly, correlation is not causation. There is no definitive proof that the hormone conveyed the Alzheimer’s, because the team couldn’t analyze samples of the growth hormone to confirm it.

According to Collinge, samples of the hormone do still exist, and his team is aiming to study them. “But the experiments take one or two years,” he told New Scientist.

—–

Feature Image: Thinkstock

Icebreaker becomes first US ship to complete solo journey to North Pole

 

A US Coast Guard icebreaker containing more than 50 scientists on an expedition to analyze the changing chemistry of the Arctic Ocean has become the first American surface ship to make it to the North Pole on its own, officials from the Armed Forces branch have confirmed.

According to Discovery News, the Seattle-based US Coast Guard Cutter Healy arrived at the top of the world on September 5, marking the fourth time that a US surface vessel successfully made it to the North Pole, but the first time that no other ships accompanied it on the journey.

The 420-foot (128-meter), 16,000-ton Healy is powered by a 30,000-horsepower engine and is capable of breaking through more than 10 feet (three meters) of ice. It is one of only two working icebreakers currently owned by the US, out of three total, though last week President Barack Obama said that he would fast-track the construction of a fourth such vessel.

“As the Arctic region continues to open up to development, the data gathered on board Healy during this cruise will become ever more essential to understanding how the scientific processes of the Arctic work, and how to most responsibly exercise stewardship over the region,” the Coast Guard said in a statement.

Hunting for trace metals

Katlin Bowman, a postdoctoral researcher at the University of California, Santa Cruz and one of the 50 scientists and more than 130 total people on board the Healy, explained in a piece for The Huffington Post that she and her colleagues were collecting seawater, particles, sediment, ice, and snow, and analyzing the samples for trace metals and other elements.

“Our mission,” Bowman wrote, “is a joint effort between the US GEOTRACES and Climate Variability and Predictability (CLIVAR) programs, funded by the National Science Foundation. The dataset will take years to complete and the final product will be the most comprehensive chemical survey of the Arctic Ocean.”

GEOTRACES is an organization whose mission is to monitor and analyze the distributions of key trace elements and isotopes in the ocean, establishing the sensitivity of these distributions to changing environmental conditions, the group’s website said. CLIVAR, on the other hand, is a group that is attempting to better understand the dynamics and interaction of the coupled ocean-atmosphere system by observing and predicting changes to the global climate system.

—–

Feature Image: US Coast Guard/Wikimedia Commons

Depressed people’s brains have a ton of this protein

 

When studying depression, researchers tend to focus on what is missing from the patients, like maybe (but probably not) serotonin, or brain-derived neurotrophic factor (“brain fertilizer”)—but far fewer investigate what the brain has more of.

This means that a new study published in the Proceedings of the National Academy of Sciences is one of the first to suggest that having too much of an essential molecule might lead to depression, and it offers a strong solution for treatment.

A team from the University of Michigan Medical School and the Pritzker Neuropsychiatric Disorders Research Consortium discovered that the brains of people suffering from depression had high levels of a protein known as FGF9 (fibroblast growth factor 9) while thoroughly examining cadaver brain tissue. The main focus was the hippocampi of the brains—the part that shrinks in depression—which were donated by 36 people with and 56 without major depression.

“We call this approach ‘reverse translation’,” said author Huda Akil, a professor of psychiatry and neuroscience. “We start by careful, broad scale analyses in the human brain to discover new molecular players that might play a role in triggering or maintaining the depression.”

Using this technique, they found that FGF9 levels were 32% higher in the hippocampi of depressed brains than in non-depressed ones. Further, several other fibroblast growth factors showed decreased expression, which suggests that the whole system that regulates cell growth and development is altered in depression.

The rats confirm it… and solve it?

But the researchers didn’t just stop here. Intrigued by the results, they decided to do multiple experiments in rats to see if these findings held.

They exposed rats to repeated stress for a week and a half (stress is one of the top causes of depression) and found that FGF9 levels rose, coinciding with socially withdrawn behavior and changes in weight.

Then, they injected FGF9 or a placebo directly into the ventricles (the fluid-filled cavities between the two halves of the brain) of the rats. The FGF9 rats became more anxious, and began to move much less—changes that continued to show up with repeated injections.

The most exciting experiment, however, came when they injected a virus into the rat hippocampi. The virus was one of two types: one that blocked FGF9 expression, and one that was an inactive control. In rats given the FGF9-blocking virus, levels of the protein dropped nearly 30% compared to the inactive-virus controls, and those rats showed less anxiety to boot.

With all these results, the researchers believe that this provides even more evidence that depression goes far beyond something just in your head, or a chemical imbalance; it is a physical illness.

So is this the cure for depression? Probably not—there’s a lot about it we don’t understand. But the research team is optimistic.

“Fixing depression is not easy, because it’s a disorder at the level of the circuits that connect brain cells, and many regions of the brain are involved,” said co-author Elyse Aurbach, a neuroscience doctoral student. “Still, this is the first time FGF9 has been identified as related to depression, and found to be active in a critical area of the brain for the disorder. We and others need to study it further to determine what is going on. It’s very exciting.”

—–

Feature Image:

 

Barrage of small asteroids shattered moon’s upper crust, study finds

 

While most scientists believe that the moon was severely pummeled by an array of asteroids roughly four billion years ago, a new study from research scientists at MIT and elsewhere has found that this event may have had a greater-than-expected impact on the lunar surface.

Known as the Late Heavy Bombardment, the sustained asteroid impacts were so heavy in some regions of the dark side of the moon that they completely shattered the upper crust. These areas, known as the lunar highlands, were left as fractured and porous as they could be – until additional impacts sealed back up the previously created cracks.

Jason Soderblom of the MIT Department of Earth, Atmospheric, and Planetary Sciences and his fellow investigators explained in a statement that they observed this effect in the upper layer of the lunar crust (also known as the megaregolith). Small craters no more than 30 km in diameter dominate this layer, while deeper layers have larger craters and less porous terrain.

The authors, who published their findings in the journal Geophysical Research Letters, used data from the NASA Gravity Recovery and Interior Laboratory (GRAIL) satellites to map the gravity field in and around more than 1,200 craters in the lunar highlands, then used that information in a series of calculations to determine if an impact increased or decreased porosity.

For craters smaller than 30 km in diameter, Soderblom’s team found impacts that both increased and decreased porosity in the upper layer of the moon’s crust. Larger craters, on the other hand, penetrated much deeper into the moon’s crust and only increased porosity further down, which indicates that the deeper layers are less fractured than the megaregolith.

Research could provide insight on origins of life, Late Heavy Bombardment

The evolution of the moon’s porosity could give scientists new insight into some of the earliest life-supporting processes occurring in the solar system, as the interaction of water and rock can provide a significant source of energy and may have played a key role in the evolution of life on Earth, Soderblom explained to redOrbit via email.

“A rocky layer that is porous and fractured has an increased surface area, which increases the rates at which water-rock reactions occur. Understanding how porosity formed and evolved in Earth’s crust, therefore provides insight into these reaction rates,” Soderblom said. “The moon has undergone little modification over the lifespan of the solar system, and so it provides us a great way to look back in time at what the Earth, and the other terrestrial planets, might have looked like in the early solar system.”

In addition, he and his co-author hope to discover where these different types of impactors came from, and as a result, to understand more about the origins of the Late Heavy Bombardment. As Soderblom explained, the total number of craters that formed on the moon (its cumulative record of craters) is one of “the great outstanding questions in the history of the solar system.”

However, the far side of the moon simply has too many craters to retrieve this information, he said. Thus, he and his colleagues hope to use the structure of the subsurface to retrieve this data. To this end, they are developing a model that will simulate the evolution of impact-generated porosity in the lunar subsurface.

“Knowing this will allow us to understand the magnitude of the Late Heavy Bombardment – and, in fact, test whether it occurred at all – and to investigate the significance of this event for other planetary bodies throughout the Solar System,” he concluded.

—–

Feature Image: MIT

Experts call genetic modification of human embryos ‘essential’

A new report released by a consortium of scientists and ethics experts argues that it is “essential” that the genetic modification of human embryos be permitted, and that editing the DNA of early stage embryos is of “tremendous value” to medical research, published reports indicate.

According to the Daily Mail, the organization called the Hinxton Group also said that preventing research in this area would be “dangerous,” though they acknowledged that the technology used for such procedures is not yet advanced enough for use in clinical applications.

“We believe that while this technology has… enormous potential,” the group said in a statement, according to BBC News, “it is not sufficiently developed to consider human genome editing for clinical reproductive purposes at this time.” They added that there could be “morally acceptable uses” of this technology in humans, but that “substantial” debate would be required.

The Hinxton Group, an international team of experts based at Johns Hopkins University in Baltimore, Maryland, met in Manchester, England last week to discuss issues related to genetic modification technology in response to recent breakthroughs in the field, reports indicate.

Scientists remain divided over the issue

While genetic modification has been used to alter the DNA of animals for more than 30 years, the authors of the new report said that those methods were “inefficient” and “lacked specificity.” They also relied on a series of steps that made them unsafe and inappropriate for use on people—but recent breakthroughs like the Crispr DNA editing technique change the game.

The Hinxton Group’s report, which the Daily Mail said was backed by all 22 members of the consortium, outlined multiple possible clinical uses for DNA editing techniques, including correcting disease-causing mutations or giving a person immunity to specific pathogens, but acknowledged that some alterations could be “more contentious than others.”

Professor Emmanuelle Charpentier, a researcher involved in the development of Crispr, told BBC News: “Personally, I don’t think it is acceptable to manipulate the human germline for the purpose of changing some genetic traits that will be transmitted over generations… I just have a problem right now with regard to the manipulation of the human germlines.”

“Genome editing techniques could be used to ask how cell types are specified in the early embryo and the nature and importance of the genes involved,” countered Robin Lovell-Badge, senior member of the Hinxton Group and head of the stem cell biology lab at the Francis Crick Institute in London. “Understanding gained could lead to improvements in IVF and reduced implantation failure, using treatments that do not involve genome editing.”

—–

Feature Image: Thinkstock

New species of human ancestor found in South Africa

In what National Geographic is calling “one of the greatest fossil discoveries of the past half century,” an international team of scientists working in South Africa have discovered remains belonging to a previously unidentified species of human ancestor.

The discovery, which was led by paleoanthropologist Lee R. Berger, a human evolution studies professor at the University of the Witwatersrand in Johannesburg, was reported in a new research article published this week in the journal eLife. The species was found in a large chamber in the Rising Star Cave and has been named Homo naledi, the New York Times reported.

According to Nat Geo, H. naledi had a tiny, primitive brain and apelike shoulders suitable for climbing. In other ways, however, it bears a remarkable resemblance to modern humans—its feet, for instance, were virtually identical to those of present-day H. sapiens, they added. It also had a small body mass and stature similar to those of small-bodied human populations, the researchers wrote.

Berger and his more than 60 fellow researchers found well over 1,500 individual fossil elements belonging to H. naledi in the cave, making the find the largest sample for any hominin species at any single African site, the Times added. Thus far, they have recovered parts belonging to at least 15 unique individuals, and there could be many more fossils yet to be discovered.

An odd mixture of ancient and modern features

The various bones were divided up amongst different members of the research team, and based on their analysis, the creature was found to be a mixture of modern and ancient hominin species. While some of the teeth resembled those of modern humans, the authors also found unusual and primitive premolar roots, Nat Geo explained.

Its skull was, on average, less than half the volume of a modern human skull, and the species was said to have a fully modern hand with curved fingers best suited for tree climbing. Its pelvis had flaring blades similar to those of the Australopithecus afarensis specimen Lucy, and its leg bones look ancient at the top but become more modern toward the bottom.

Ian Tattersall, an expert on human evolution at the American Museum of Natural History in New York who was not involved in the study, called the discovery “very fascinating.” He told the Times that there was “no question there’s at least one new species here, but there may be debate over the Homo designation… the species is quite different from anything else we have seen.”

Paleoanthropologist Fred Grine of the State University of New York at Stony Brook told Nat Geo that H. naledi was “weird as hell,” and Berger added that it was “an animal that appears to have had the cognitive ability to recognize its separation from nature.”

The researchers have not yet determined the exact age of this new pre-human ancestor, but based on its anatomy, they believe that it must be at least 2.5 to 2.8 million years old. The cave where it was found is likely no more than three million years old, according to the Times.

—–

Feature Image: Skeletal fossils of the hand of Homo naledi pictured in the Wits bone vault at the Evolutionary Studies Institute at the University of the Witwatersrand, Johannesburg, South Africa, on Sept. 13, 2014. The fossil hand is one of many fossils representing a new species of hominin. The broad thumb of Homo naledi suggests it was an expert climber. The Rising Star Expedition, a project that retrieved and analyzed the fossils was led in part by paleoanthropologist John Hawks, professor of anthropology at the University of Wisconsin-Madison. (Credit: John Hawks/University of Wisconsin-Madison)

Humans are actually ‘wired for laziness’

 
No matter how hard you think you’re working, your nervous system is subconsciously trying to make things easier for you, adapting to limit the number of calories your body burns and thus expending the least amount of energy possible, according to a new study.

In research published Thursday in the journal Current Biology, Dr. Max Donelan of the Simon Fraser University Locomotion Laboratory and his colleagues were studying the energetic costs of walking and other movements when they found that humans constantly make slight tweaks to their gait to save minute amounts of energy – whether they know it or not.

Dr. Donelan, lead author Jessica Selinger (a student in the SFU Locomotion Lab), and colleagues set out to understand why people move the way they do, considering there are a multitude of ways to travel from one point to another. Specifically, they wanted to find out to what degree the human body can adapt its movement in response to real-time physiological inputs.

Experiments show how the body tries to burn fewer calories

The researchers put participants into robotic exoskeletons that discouraged them from walking in their typical way, adding resistance at the knee to make it harder to swing their legs. They found that the nervous system is constantly working behind the scenes to re-optimize locomotion patterns and reduce energy expenditure.

Specifically, their experiments revealed that people adapt their step frequency to reach a new energetic optimum in a matter of minutes, and that this happens even though the overall energy savings are almost negligible – less than five percent, according to the authors. Even so, their findings suggest that the human body is hard-wired for laziness.

But is this a good thing or a bad thing?

“I suppose that depends on your perspective,” Selinger told redOrbit via email. “It’s a good thing if you want to conserve energy, but a bad thing if you are just trying to burn calories. A marathon runner wants to move efficiently throughout the race so they preserve energy for the final push. But, if someone is running simply to lose weight, they may curse their clever nervous system. So, I guess it depends on your goal.”

“The ability to optimize movements to reduce energy expenditure may also be an important survival strategy. While we live in a culture where calorie-heavy foods are widely available, that isn’t the case for all, and that certainly wasn’t the case for our prehistoric ancestors. The nervous system’s goal of reducing energy expenditure may have helped stave off starvation,” she added.

Researchers trying to learn more about this phenomenon

Moving forward, Selinger, Dr. Donelan, and their colleagues said that they plan to investigate further, in hopes of answering questions about how the human body measures the energy costs associated with specific ways of moving. They also hope to solve the mystery of how our systems are actually able to solve complex motion-centered optimization problems.

“We now know that humans can continuously optimize their movement patterns based on energetic cost. But we don’t know how energy use is sensed or measured by the body,” Selinger said. “We might sense it directly using known blood gas receptors that can measure oxygen consumption and carbon dioxide production, or we could sense it indirectly from some proxy signal like muscle activity. We are currently conducting some interesting experiments where we try to directly perturb blood gas sensors and see if this can disrupt the optimization process.”

She added that there are “countless ways we can get from point A to point B. So how do we so quickly discover the optimal coordination patterns? It could be that our nervous system is primed to quickly search only a reduced subspace, such as particular combinations of speeds and step frequencies or particular combinations of muscle activities, rather than attempting to continually search all possible combinations. The nervous system may also employ what we call gradient-based descent or other optimization tricks to speed the search for new energetic optima.”

“For example,” Selinger said, “if a gait parameter changes and the nervous system detects a lower cost movement, it will send a signal to continue to change that gait parameter as long as it continues to reduce cost. Alternatively, people may not initiate optimization based on energetic gradients, but may instead require explicit experience with a novel optimum in order to adapt to it. We are currently conducting experiments to try to understand which, if any, of these mechanisms underlie the optimization process.”
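
As a rough illustration of the “follow the gradient” idea Selinger describes – a toy sketch with made-up numbers, not the team’s actual model – the snippet below nudges a single gait parameter (step frequency), keeps moving in whichever direction lowers a hypothetical energetic cost, and reverses with smaller adjustments once the cost starts rising again.

def energy_cost(step_freq_hz):
    # Invented cost curve: cheapest at a hypothetical optimum of 1.8 Hz
    optimum = 1.8
    return 100 + 40 * (step_freq_hz - optimum) ** 2

freq = 1.4    # start away from the optimum (say, with the exoskeleton resisting)
step = 0.05   # size of each adjustment to step frequency
cost = energy_cost(freq)

for _ in range(50):
    trial_freq = freq + step
    trial_cost = energy_cost(trial_freq)
    if trial_cost < cost:        # cheaper: keep adjusting in the same direction
        freq, cost = trial_freq, trial_cost
    else:                        # costlier: reverse direction with smaller nudges
        step = -step / 2

print(f"settled near {freq:.2f} Hz at a cost of ~{cost:.1f}")
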
—–

Feature Image: Thinkstock

Psychopaths less likely to yawn along with you, study says

 

People with psychopathic traits are less likely to “catch” a yawn than those who are more empathetic, at least according to a new study published in Personality and Individual Differences.

Psychopaths have a variety of characteristics—selfishness, manipulative drives, impulsiveness, fearlessness—but above all, a lack of empathy. Meanwhile, contagious yawning has long been associated with empathy and bonding, especially in social mammals like humans, chimpanzees, and dogs. Lead researcher Brian Rundle of Baylor University was the one who connected the two.

“You may yawn, even if you don’t have to,” he said in a press release. “We all know it and always wonder why. I thought, ‘If it’s true that yawning is related to empathy, I’ll bet that psychopaths yawn a lot less.’ So I put it to the test.”

Rundle and his team issued 135 college students a standard psychological test known as the Psychopathic Personality Inventory, which aims to determine their degree of cold-heartedness, fearless dominance, and self-centered impulsivity. “It’s not an ‘on/off’ of whether you’re a psychopath,” Rundle clarified. “It’s a spectrum.”

Following the test, the students were seated in a dim room in front of computers, wearing noise-canceling headphones. Electrodes were placed on various parts of their faces and hands to quantitatively measure yawning. Then they watched twenty 10-second snippets of facial movements, including yawning and laughing.

The result: The less empathy a person had (and thus the more psychopathic they were), the less likely they were to yawn or exhibit muscle, nerve, and skin responses associated with empathy.

“The take-home lesson is not that if you yawn and someone else doesn’t, the other person is a psychopath,” Rundle adds. “A lot of people didn’t yawn, and we know that we’re not very likely to yawn in response to a stranger we don’t have empathetic connections with.

“But what we found tells us there is a neurological connection — some overlap — between psychopathy and contagious yawning. This is a good starting point to ask more questions.”

—–

Feature Image: Thinkstock

Half of all Americans have diabetes or pre-diabetes, study says

 

One out of every two Americans either has diabetes or has blood sugar levels high enough to be on the cusp of developing the disease, and one-third of those cases go undiagnosed, new research from the National Institute of Diabetes and Digestive and Kidney Diseases claims.

According to NBC News, Catherine Cowie of the NIDDK, Andy Menke from the global health research company Social & Scientific Systems, and their colleagues looked at annual nationwide survey data from 5,000 people and found that between 12 and 14 percent of adults had been diagnosed with diabetes as of 2012, the most recent data available.

The overwhelming majority of those cases were Type 2 diabetes, which is linked to poor diet, obesity, and lack of exercise, the researchers said. Eleven percent of Caucasians were diabetic, compared to 22 percent of African-Americans, 20 percent of Asian-Americans, and 22.6 percent of Hispanics. As many as half of the Asian-Americans and Hispanics with diabetes were undiagnosed.

“Diabetes prevalence significantly increased over time in every age group, in both sexes, in every racial/ethnic group, by all education levels, and in all poverty income (groups),” between 1988-1994 and 2011-2012, the authors wrote in the latest edition of the Journal of the American Medical Association.

A glimmer of hope

In addition, 38 percent of adults fell into the pre-diabetes category, which is used to measure the number of people who had A1c hemoglobin levels between 5.7 and 6.4 percent, Reuters pointed out. While those people do not have full-blown diabetes, they are considered to be at higher risk of developing the disease.

“We need to better educate people on the risk factors for diabetes – including older age, family history and obesity – and improve screening for those at high risk,” Menke, an epidemiologist at Social & Scientific Systems in Silver Spring, Maryland, told Reuters.

“Although obesity and Type 2 diabetes remain major clinical and public health problems in the United States, the current data provide a glimmer of hope,” William Herman and Amy Rothberg from the University of Michigan wrote in an article that accompanied the paper, according to the Los Angeles Times.

They said the findings indicate that the implementation of new policies governing nutrition and physical activity on the federal, state, and local levels, as well as other anti-obesity initiatives, have started to pay dividends. “Progress has been made, but expanded and sustained efforts will be required” for that success to be sustained, Herman and Rothberg wrote.

—–

Feature Image: Thinkstock

Why are these male green frogs suddenly turning into females?

 

Male green frogs are reportedly becoming feminized due to exposure to estrogenic chemicals in suburban ponds, scientists at Yale University report in a new Proceedings of the National Academy of Sciences study.

According to Slashgear and United Press International (UPI), the authors of the paper sampled green frogs (Rana clamitans) from 21 ponds in Connecticut. Not only did they find abnormally high numbers of females living in ponds located in the suburbs, but they also discovered that the males in those populations tended to have intersex characteristics.

Lead author Max Lambert, a doctoral student at the Yale School of Forestry & Environmental Studies, suggested that higher levels of estrogenic chemicals in the water could be turning male green frogs into females by disrupting their endocrine systems. The ponds in question, the authors said, contained high levels of chemicals called phytoestrogens.

Regular, everyday landscaping may be to blame

Phytoestrogens are estrogenic chemicals produced by plants, and Lambert explained in a press release that clovers and some other types of plants commonly found in lawns naturally produce these substances. In other words, simple landscaping may be the source of the contamination.

“For a frog, the suburbs are very similar to farms and sewage treatment plants,” he said. “Our study didn’t look at the possible causes of this, partly because the potential relationship between lawns or ornamental plantings and endocrine disruption was unexpected… [our lab] is trying to understand how the suburbs influence sexual development in other species.”

The discovery of high estrogen levels in these suburban ponds could also impact other types of creatures that use these waters, including other amphibian species such as wood frogs, gray tree frogs, spring peepers, and salamanders, along with birds, turtles, and even mammals.

According to AFP, the researchers believe future studies are needed to understand exactly why there are such high levels of the female sex hormone in areas where there are lawns, gardens, and shrubbery, as well as to explain the reasons for the link between landscaping and frog offspring sex ratios.

—–

Feature Image: Geoff Giller/Yale

China seeks to land first-ever probe on the dark side of the moon

China wants to be the first space program to send a lunar probe to the dark side of the moon, and it wants to accomplish the feat within the next five years, Chinese Academy of Sciences officials reportedly said in an interview with state broadcast outlet CCTV on Wednesday.

According to the Associated Press (AP), engineer Zou Yongliao said that the proposed Chang’e 4 mission would launch sometime before 2020 and would analyze geological conditions on the far side of the moon. That could lead to the placement of a radio telescope which could help “fill a void” in astronomers’ knowledge of the universe, Zou added.

While orbiting probes have photographed the surface of the moon’s far side, no space agency has ever tried to land there, according to CNN.com. Radio transmissions from Earth cannot reach the far side, the AP noted, making it an ideal location for potentially sensitive instruments.

China’s next lunar mission is currently scheduled to launch in 2017, and will involve attempting to land a spaceship on the moon. That spacecraft would collect samples and return to Earth, and if successful, it would make China just the third country to have completed such a mission.

Chang’e 5 spacecraft set to launch before Chang’e 4

Thus far, China’s Chang’e lunar exploration program has already launched a pair of orbiting probes, the AP said, and in 2013 it landed a vehicle with an onboard rover on the lunar surface. Officials there have also reportedly hinted at a possible manned mission to the moon.

In May, Wu Weiren, the chief engineer for China’s Lunar Exploration Program, told CCTV that China planned to send the Chang’e 4 spacecraft to orbit the moon before sending a rover to the surface, CNN.com said. Wu explained that his team would “choose a site” that is “more difficult to land and more technically challenging,” and that their “next move” would “probably see some spacecraft land on the far side of the moon.”

The website also said that China plans to launch its Chang’e 5 spacecraft before the Chang’e 4, with the higher-numbered probe launching in 2017. It will orbit and land on the moon, dig and collect rock samples from up to two meters beneath the moon’s surface, then return to Earth. It will be launched from the new Wenchang Satellite Launch Center in Hainan.

China released detailed images of the proposed Chang’e 5 landing site last week – images that were captured from an orbiting service module some 19 miles (30 kilometers) from the moon, according to CNN. Those pictures had a resolution of one meter, officials from the China State Administration of Science, Technology and Industry for National Defence noted.

—–

Feature Image: Thinkstock

The good in the bad? Having a stroke may cause you to stop craving cigarettes

Suffering a debilitating cerebrovascular accident isn’t exactly what anyone would call a typical smoking cessation aid, but two new studies have found that cigarette users who suffered a stroke in the insular cortex were more likely to kick the habit than those with strokes elsewhere.

In addition, the researchers behind these new papers found that smokers who experienced a stroke in this region of the brain also typically experienced fewer and less severe withdrawal symptoms than men and women with strokes that affected other parts of the brain.

The results, lead author Dr. Amir Abdolahi, a clinical research scientist at Philips Research North America who conducted the research while an epidemiology doctoral student at the University of Rochester School of Medicine and Dentistry, explained in a statement, “indicate that the insular cortex may play a central role in addiction.”

“When this part of the brain is damaged during stroke, smokers are about twice as likely to stop smoking and their craving and withdrawal symptoms are far less severe,” he added. The two new studies have been published in the journals Addiction and Addictive Behaviors.

Insular cortex stroke patients twice as likely to kick the habit

Most of the prescription medications currently used to treat addiction to tobacco target the brain’s “reward” pathways, the study authors explained. They interfere with the release and binding of dopamine in response to nicotine, and while these drugs are often successful in the short-term, the majority of smokers wind up relapsing within six months.

Building upon recent research that suggested that the insular cortex may also play a key role in the cognitive and emotional processes which lead to drug or tobacco use, Dr. Abdolahi and his colleagues looked at smokers who had their insular cortex damaged during a stroke to see if they were more likely to quit smoking.

They looked at 156 stroke patients admitted to three Rochester, New York-area hospitals, all of whom were identified as active smokers, and examined two different sets of data: whether or not the patients resumed smoking after their strokes, and how severe their cigarette cravings were while they were in the hospital. The stroke locations were determined by CT scans or MRIs.

By measuring factors such as anger, anxiety, and cravings during the hospitalization period – during which the patients were forced to quit smoking – they found that patients who had a stroke in the insular cortex had fewer and far less severe withdrawal symptoms than those with strokes in other parts of the brain. They also found that nearly twice as many insular cortex stroke patients kicked the habit as those with strokes elsewhere (70 percent vs. 37 percent).

Their findings mean that this part of the brain could be a potential target for treatments for smoking or other forms of addiction, though Dr. Abdolahi cautioned that “much more research is needed in order for us to more fully understand the underlying mechanism and specific role of the insular cortex.” However, he added, “it is clear that something is going on in this part of the brain that is influencing addiction.”

—–

Feature Image: Thinkstock

Animal behavior reversed by altering one cell receptor

 

Researchers from the University of Massachusetts Medical School have shown—for the first time ever—that it’s possible to reverse an animal’s behavior by flipping around how one of its cells communicates.

The team turned to C. elegans, a nematode, in order to explore neuron signaling in depth. These nematodes have been extensively studied, to the point where their 302 neurons have a fully fleshed-out roadmap. (No other animal has a completely mapped neural wiring diagram.)

Researchers have also tied many of C. elegans’ behaviors to its neuronal roadmap, including an escape response activated by touching the front half of its body. The neural pathway involved in this response involves the actions of both excitatory and inhibitory neurons—cells that activate other neurons or hinder their actions. The pathway eventually leads to an inhibitory ion channel; once the inhibitory channel is activated, the nematode relaxes its head and changes directions in order to escape a predatory fungus.

Changing to excitatory receptor is exciting

The UMass group was curious as to whether changing this inhibitory receptor into an excitatory one—flipping its purpose completely around—would likewise reverse the nematode’s behavior. So they replaced the channel in a live nematode with an excitatory one to see what would happen, and got an exciting result.

“[W]e were able to completely reverse behavior by simply switching the sign of a synapse in the neural network,” explained co-author Dr. Alkema in a press release. “Now the animal contracts its head and tends to move forward in response to touch.”
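
A toy sketch – with invented numbers, and in no way the UMass group’s actual model – can make the idea concrete: if a touch-sensing neuron drives a head motor neuron through a single connection, the sign of that connection alone decides whether touch relaxes the head (and triggers a reversal) or contracts it (and drives the worm forward).

def head_response(touch_input, synapse_weight, threshold=0.2):
    """The sign of the synaptic weight decides the behavioral output."""
    drive = touch_input * synapse_weight
    if drive < -threshold:
        return "relax head and reverse away"      # inhibitory channel (wild type)
    if drive > threshold:
        return "contract head and move forward"   # excitatory channel (engineered)
    return "no response"

touch = 1.0  # stimulus to the front half of the body

print("inhibitory synapse:", head_response(touch, -1.0))
print("excitatory synapse:", head_response(touch, +1.0))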

Moreover, these results suggest that the neural structure is quite stable. “Surprisingly, the engineered channel does not affect development of and is properly incorporated into the neural circuits of the worm brain,” said Alkema.

Further, Alkema believes this may add to our understanding of evolution.

“Our studies indicate that switching the sign of a synapse not only provides a novel synthetic mechanism to flip behavioral output but could even be an evolutionary mechanism to change behavior,” he added. “As we start to unravel the complexity and design of the neural network, it holds great promise as a novel mechanism to test circuit function or even design new neural circuits in intact animals.”

—–

Feature Image: Wikimedia Commons

Rare pink dolphin reappears in Louisiana; may be pregnant

In 2007, a charter boat captain on Lake Charles in Louisiana had a bit of a shock when a baby dolphin surfaced—and it was pink.

“It was absolutely, stunningly pink,” Erik Rue, the captain, said in 2007. “I had never seen anything like it. It’s the same color throughout the whole body. It looks like it just came out of a paint booth.”

Since then, “Pinkie” has become a bit of a local sensation. Rue actually takes people on chartered boat trips in hopes of seeing her. But besides ferrying around customers, Rue has kept a sharp eye on her.

“I believe I’m [the] first one who saw her and I know I’m the first one to take pictures of her,” Rue told ABC. “I’ve learned a lot since I’ve spend [sic] a lot of time following her around.”

Thanks to his attentiveness, Rue has solved one of the mysteries surrounding the dolphin: her sex. A couple of weeks ago, he spotted Pinkie mating and, after some observation, discovered she’s, well, a she. “I’ve taken a ton of pictures of her mating and it proved she’s a female,” he said.

But why is she pink?

Rue now wonders if Pinkie is pregnant—and if so, what color her calf will be. Of course, no one is sure why Pinkie is the color she is. Rue initially thought she might be albino, but now isn’t as certain because she isn’t as white as the other known albino dolphins.

“Dolphins have pink bellies, so I just kind of started of thinking that this is a genetic glitch,” he said. “If it was albino I believe it would be white. I’ve changed what I thought over time, as I analyzed the pictures I have. Other than it being pink and her eyes not opening all the way, it’s a perfectly normal dolphin and does all the things the rest of them do.

“It’s interesting to know things like that exist in the world and it’s really beautiful to see that.”

—–

Feature Image: Erik Rue/Snopes

Could this robot be the first permanent ISS crew member?

 

The International Space Station (ISS) may soon have its first permanent resident: a humanoid robot created by researchers from the French National Centre for Scientific Research (CNRS) that is capable of passing along information from one team of astronauts to another.

The robot, which has been named Nao, was created by CNRS senior researcher Peter Ford Dominey and his colleagues, and has been given “an autobiographical memory” so that it can pass on information learned from one group of humans to another.

On the space station, it would use this ability to act as a liaison between crews as they rotate every six months, passing information obtained by the departing astronauts on to their successors, Dominey’s team explained in a statement. They presented their findings last week at the 24th International Symposium on Robot and Human Interactive Communication in Japan.

Autobiographical memory, the researchers explained, includes only those events personally experienced by the robot, as well as the context in which they were experienced. It enables the unit to date and locate memories, and to determine who was present during said event.

Following successful simulations, Nao could be headed to space

In order for Nao to understand cooperative behavior, and thus share its knowledge and experiences, it uses a system developed by Dominey and his fellow engineers. With this system, a human can teach the robot new actions through physical demonstration, visual imitation, or voice commands.

All of these actions are then combined into procedures and stored in Nao’s autobiographical memory, allowing it to reproduce them for other humans as needed. The CNRS team has tested their new system by simulating a scenario that could actually happen on the ISS: Nao helped a scientist fix a damaged electronic card, following his directions during the repair process.

Should the same problem arise again, the researchers said, the memory of the event would enable the robot to use a video system to show a new crew member how to repair the card. It could also answer questions about the previous repair and assist with the new procedure. If a different type of failure occurred, it could share its experience with the original failure while also recording the steps required to fix the new issue for future crew members.
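
To give a sense of what such an autobiographical memory might look like in software, here is a brief, hypothetical Python sketch based only on the capabilities described above: dating and locating events, remembering who was present, and storing learned procedures for later replay. The class and field names are our own assumptions, not the CNRS team’s actual system.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Episode:
    """One personally experienced event, stored with its context."""
    when: datetime
    where: str
    participants: List[str]
    task: str
    steps: List[str] = field(default_factory=list)  # procedure learned by demonstration

class AutobiographicalMemory:
    def __init__(self):
        self.episodes: List[Episode] = []

    def record(self, episode: Episode) -> None:
        self.episodes.append(episode)

    def recall(self, task: str) -> List[Episode]:
        """Retrieve past episodes about a task, e.g. to walk a new crew through it."""
        return [e for e in self.episodes if e.task == task]

memory = AutobiographicalMemory()
memory.record(Episode(
    when=datetime(2015, 9, 1, 14, 5),
    where="ISS mock-up lab",                 # assumed location, for illustration only
    participants=["departing astronaut"],
    task="repair electronic card",
    steps=["remove panel", "swap damaged card", "run diagnostics"],
))

for episode in memory.recall("repair electronic card"):
    print(episode.when.date(), episode.participants, episode.steps)

A new crew member could then query the memory by task name and have the robot walk them through the stored steps, which is essentially the card-repair scenario the researchers simulated.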

“These results demonstrate the feasibility of this system, and show that such humanoid robots represent a potential solution for the accumulation and transfer of knowledge,” the CNRS said Monday in a statement. They added that Dominey’s team is now hoping to test Nao “in the real conditions of space operations, with zero gravity.”

—–

Feature Image: Inserm/Patrice Latron

NFL to embed RFID sensors in player shoulder pads

The 2015 National Football League (NFL) season kicks off on Thursday, as the New England Patriots host the Pittsburgh Steelers in a game that will usher in not only a new year of hard-hitting gridiron action, but a new high-tech era for the sport as a whole.

According to CIO, as players take the field this season, their shoulder pads will be equipped with two radio-frequency identification (RFID) sensors roughly the same size as a quarter. Each RFID sensor will emit a unique radio frequency that will be detected by 20 receivers placed throughout each NFL stadium, allowing them to pinpoint each player’s position, speed and more.

The use of two sensors per player (one in each shoulder pad) will allow the technology to determine which way each athlete was facing on every play, the website added. The data will feed the Xbox One and Windows 10 “NFL 2015” app, allowing users to see detailed player stats while viewing highlight clips, and will also be provided to coaches, players and broadcasters.

“We’ve always had these traditional NFL stats,” Matt Swensson, the league’s senior director of Emerging Products and Technology, told CIO. “The league has been very interested in trying to broaden that and bring new statistics to the fans. Along the way, there’s been more realization about how the data can be leveraged to make workflow more efficient around the game.”

“This type of initiative really opens the doors to do more things at the venue,” he said. “At the Pro Bowl last year, we had a display up that showed what players were on the field. By putting up what players were on the field in real time, it really gave fans more information.”

RFID data currently limited to post-game analysis

As part of its Internet of Things (IoT) initiative, the NFL is partnering with Illinois-based Zebra Technologies, a company founded in 1969 that makes and markets a variety of tracking, marking and printing technologies, including RFID smart label printers, thermal barcode label and receipt printers, and card and kiosk printers, according to CIO.

Zebra entered the IoT field in 2013, and one of the new products it launched at that time was the MotionWorks Sports Solution system that will be used by the pro football league. The company claims that its tags are capable of “blinking” up to 85 times per second, and that only about 120 milliseconds elapse between the moment a tag blinks on the field and the moment its data is delivered to a server. The location data is said to be accurate to within six inches.
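
As a rough illustration of how two shoulder-mounted tags can yield a player’s position, speed, and facing direction, here is a short Python sketch. The geometry is generic and the sample coordinates are invented; this is only an assumption-laden toy based on the figures quoted above (two tags per player, up to 85 samples per second), not Zebra’s MotionWorks algorithm.

import math

BLINKS_PER_SECOND = 85            # reported maximum tag "blink" rate
DT = 1.0 / BLINKS_PER_SECOND      # time between consecutive samples, in seconds

def midpoint(left_tag, right_tag):
    """Player position: midpoint of the two shoulder tags (x, y in yards)."""
    return ((left_tag[0] + right_tag[0]) / 2.0, (left_tag[1] + right_tag[1]) / 2.0)

def facing_degrees(left_tag, right_tag):
    """Facing direction: perpendicular to the left-to-right shoulder line.
    The sign convention depends on which tag is labeled left vs. right."""
    dx, dy = right_tag[0] - left_tag[0], right_tag[1] - left_tag[1]
    shoulder_angle = math.degrees(math.atan2(dy, dx))
    return (shoulder_angle - 90.0) % 360.0

def speed_yards_per_sec(prev_pos, curr_pos, dt=DT):
    """Speed between two consecutive samples, in yards per second."""
    return math.hypot(curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1]) / dt

# Two consecutive samples for one player (coordinates are made up).
prev = midpoint((10.0, 5.0), (10.0, 6.0))
curr = midpoint((10.1, 5.0), (10.1, 6.0))
print("position:", curr)
print("facing:", round(facing_degrees((10.1, 5.0), (10.1, 6.0)), 1), "degrees")
print("speed:", round(speed_yards_per_sec(prev, curr), 2), "yards/sec")

At up to 85 samples per second, a single play yields hundreds of position fixes per player, which is the raw material for the new statistics the league describes.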

Every NFL venue will be connected to a central command center located in San Jose, California, Jill Stelfox, vice president and general manager of location solutions at Zebra Technologies, told CIO. “When the data is collected in the stadium, it’s sent in the stadium to the broadcaster in the stadium – it never leaves the stadium from a broadcaster perspective – but it’s also distributed out to the NFL cloud,” she added. The entire process takes just a few seconds.

Swensson said that the technology will not be available during games just yet. “Initially, it’s really more of the post-game,” he said. “Right now, we have a lot of stuff going on on the sidelines. It could just be too much of a distraction during the game. It might be a place we get down the line, but right now it’s not what we’re trying to solve for.”

The information will be available to coaches and players, who can review it after a game to evaluate their own or their team’s performance, and it could also be used for training. “We’ve just scratched the surface of what we can do with the data,” Swensson told the website. “Every week there’s another thought about how we can expand upon the information we’ve pulled together.”

—–

Feature Image: Thinkstock

What’s the germiest part of a plane?

 

A new study released earlier this week by Travelmath may have you wanting to skip the in-flight meal the next time you’re jetting across the country, as tray tables were found to harbor more bacteria per square inch than any other surface on an airplane.

According to the organization’s report, it had a microbiologist collect 26 samples from a total of four flights and five airports to identify the dirtiest places encountered by air travelers during business trips and vacation flights. As it turns out, the tray table was the biggest offender, with 2,155 colony-forming units (CFU) per square inch. Barf.

In comparison, the lavatory flush button was found to contain 265 CFU/square inch, while the overhead air vent contained 285 CFU/square inch, and the seatbelt buckle came in at 230 CFU/square inch. For airports, drinking fountain buttons had the most bacteria with 1,240 CFU/square inch, while the locks on bathroom stalls had just 70 CFU/square inch.

How do those findings compare to other items we encounter on a day-to-day basis? Travelmath, citing statistics purportedly from the National Science Foundation, said that money contains five CFU/square inch, while cell phones have 27. Home toilet seats have 172, home counter tops have 361, and pet bowls have an astonishing 306,000 CFU/square inch.
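
For readers who want those comparisons in one place, here is a small Python snippet that ranks the reported figures and expresses each as a multiple of a home toilet seat. The counts are the ones quoted above; the ranking and ratio arithmetic are ours.

# Reported bacteria counts, in colony-forming units (CFU) per square inch.
cfu = {
    "airplane tray table": 2155,
    "airport drinking fountain button": 1240,
    "overhead air vent": 285,
    "lavatory flush button": 265,
    "seatbelt buckle": 230,
    "airport bathroom stall lock": 70,
    "pet bowl": 306000,
    "home countertop": 361,
    "home toilet seat": 172,
    "cell phone": 27,
    "money": 5,
}

baseline = cfu["home toilet seat"]
for surface, count in sorted(cfu.items(), key=lambda item: item[1], reverse=True):
    print(f"{surface:34s} {count:7d} CFU/sq in ({count / baseline:7.1f}x a home toilet seat)")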

Just another swab story

RedOrbit asked David Coil, a microbiologist from the University of California, Davis, for his thoughts on the study. “My first reaction,” he said via email, “was ‘great, another germaphobia scare story where they swabbed a few things and made up a story about risk.’ Then I actually read the ‘study’ and indeed, that’s all it is. This is no different than other recent (and weak) scare stories on dishwashers, playgrounds, doorknobs, etc etc.”

Coil explained that he calls this kind of research “swab stories,” and said that the biggest issue with this type of research is that bacteria can be found on just about anything. In fact, Coil said, just about everything that a person touches over the course of a day is covered in bacteria. Even people themselves are covered in and full of these microbes, and that’s not a bad thing.

“The vast majority of bacteria are irrelevant for human health, some are good, and there’s those few bad eggs that get all the press,” Coil said.

“What makes this ‘study’ particularly silly is that they even went one step further and tested for fecal coliforms and found zero, zilch, none,” Coil added. “Of course most pathogens are invisible to such a test, but it’s even more evidence that there’s nothing to be concerned about than the usual swabs story where they just say ‘ack! bacteria!’”

He also pointed out that the household bacteria data credited to the National Science Foundation actually came from NSF International, “a private certification company with a potentially vested interest in scaring people.” And since Travelmath released averages without giving people a look at the full range of data, he said, the findings are “almost meaningless without some idea of how consistent they are.”

“But,” he concluded, “since they’re measurements of something that doesn’t matter, perhaps that’s not such a problem. This is all not to say that people shouldn’t wash their hands, clean surfaces etc. Obviously some bacteria present a health risk and that risk needs to be managed. However, a ‘study’ like this doesn’t actually contribute anything useful to the discussion.”

—–

Feature Image: Thinkstock

Foldable paper microscope capable of 2000x magnification

It costs less than a dollar and looks like an origami project, but based on the buzz currently being generated by a foldable paper microscope created by a team of Stanford University bioengineers, it could radically change the way that researchers magnify objects out in the field.

The device is known as a Foldscope, and the researchers behind it call it a new approach to mass-manufacturing optical microscopes, which are printed and folded from a single flat sheet of paper, not unlike the well-known Japanese art of paper folding. More impressively, the Foldscope costs just 50 cents to make and can magnify objects more than 2,000 times.

Dr. Manu Prakash, an assistant professor in the Stanford Department of Bioengineering, and his colleagues originally published their research in June 2014. Their device, they explained, offers sub-micron resolution, weighs less than a pair of nickels, is small enough to fit in a pocket, needs no external power and is durable enough to withstand being stepped on.

“Merging principles of optical design with origami,” they wrote at the time, “enables high-volume fabrication of microscopes from 2D media… This light, rugged instrument can survive harsh field conditions while providing a diversity of imaging capabilities, thus serving wide-ranging applications for cost-effective, portable microscopes in science and education.”

Putting Foldscope to the test in the field

Fast forward to the summer of 2015, when US field biologist Aaron Pomerantz received one of the devices in the mail and decided that he was going to put it to the test during a month-long expedition to the Peruvian Amazon rainforest – an experience he detailed in a recent story for the Huffington Post. Long story short: He came away impressed.

“This device is amazing,” Pomerantz, an entomologist and a molecular biologist who runs the website The Next Gen Scientist, wrote. “During my time in the Amazon rainforest, I was able to investigate tiny insects, mites, fungi and plant cells from 140x to 480x magnification without requiring a large and expensive conventional microscope.”

“Some of the diverse arthropod specimens could potentially be new to science, so it was really exciting to document images and videos of these organisms right there in the field by connecting my phone to the Foldscope,” he added. “This device is cheap, easy to use and broadly applicable whether you’re a curious young student, a medical professional in the field or someone who is interested in the numerous tiny things that surround us.”

Dr. Prakash’s team reported on their website that the Foldscope Beta-program is closed, and that they will provide updates if and when the device becomes available in another way. The website also encourages anybody interested in receiving announcements for the next large-scale phase of the trial to send an email with Foldscope in the subject line and a brief description of where they live and what kind of environment they plan to work in.

—–

Feature Image: TED Talks/Prakash Lab

Will drug addiction ever be cured?

 

As Russell Brand once famously wrote, the solution for addiction is simple, but prohibitively difficult: “Don’t pick up a drink or drug, one day at a time.” But this isn’t a cure—the need to consume the drug never fully goes away.

And as it turns out, asking when we will cure addiction is a bit of a loaded question, because as of now many experts wonder if it can even be cured. Addiction—like other mental health diseases—involves such a complicated array of factors that one sole cure will probably never help every single person who is struggling.

Addiction can take hold easily and last a lifetime, especially in those genetically (or epigenetically) predisposed towards it, because it hijacks an everyday brain process. We humans have evolved a clever way to make sure we survive: making the things necessary for the species’ continuation (e.g. food, sex) feel good. Experiencing these things releases dopamine, a neurotransmitter that evokes the feeling of pleasure.

Next, memories of the experience are recorded, so that when the cues present in the memory come up again in the future (like the scent of bacon), your brain can remind you that that thing feels good, and can encourage you to pursue it.

In the case of drugs, this brain pathway is hijacked, so that something you don’t need to survive, like cocaine, starts to feel like something you can’t live without. Drugs make you feel good, and your brain continually and powerfully reminds you of this until you get your fix—often by any means necessary.

Even for those who are clean, the craving is often a daily struggle, and the brain is quick to give in to the disease again—as in the case of Macklemore, who relapsed in 2011 after being prescribed cough syrup for an illness.

First we have to remove the triggers

Compounding the issue is the fact that even mundane things can remind recovering addicts of what they are no longer taking. Being around people or places, or even seeing objects associated with drug culture, can trigger memories of using.

And even if there were a cure for this hijacked pathway (or even for the genes involved), that would still ignore other major causes of addiction—environmental and emotional strain. Sadness, depression, fear, anger, stress, and any sort of pain will drive those suffering to seek relief, and drugs are a potent form of self-medication.

The most famous example of this was a psychology experiment published in 1980 involving rats. Some were in solitary confinement, while others were in a beautiful living space with other rats to play with. All the rats were given access to drugs, but the isolated rats consumed much more of the drugs than the rats in the better, more stimulating environment.

So if we were to fully cure addiction—as in, make it easily treatable so that all signs and symptoms of addiction stop when the drug consumption does—we would first have to fix the multiple root causes (environment and the need to resolve pain, genetics, and epigenetics) and then tackle the pathways, memories, and behaviors that sustain the addiction. Until we can discover a way to do all of these things, addiction will be a disease that cannot be cured.

—–

Feature Image: Thinkstock

California EPA to label Roundup herbicide as ‘carcinogenic’

 

The California Environmental Protection Agency has announced its intention to have the active ingredient of Roundup—glyphosate—labelled as an agent “known to the state to cause cancer.” In the coming months, it will be added to the list of chemicals known to cause cancer, birth defects, and other reproductive harm maintained under Proposition 65. Once this happens, businesses will have to provide “clear and reasonable” warnings before exposing people to Roundup (and other glyphosate products).

Glyphosate goes far beyond Roundup, of course—it’s used in more than 750 different agriculture, forestry, urban, and home products. Further, it’s being used increasingly on genetically modified crops. Ninety percent of corn and soybean crops are engineered to be resistant to glyphosate, meaning more of the chemical can be used on fields without harming crop yields.

This is potentially problematic, because its widespread use has led to it being detected in air during spraying, in water, in food, and in the blood and urine of agricultural workers—which indicates glyphosate is absorbed and possibly metabolized by humans. Yuck!

Chemical under scrutiny

Many previous studies have given the herbicide a clean report card, but glyphosate has been under a lot of scrutiny as of late. In March, the World Health Organization published a study that led it to classify the chemical as “probably carcinogenic to humans.” The researchers mainly reviewed studies that examined the effect of glyphosate on rodents—they cited multiple studies in which glyphosate induced various types of cancers in mice and rats, as well as skin tumors in mice.

However, they examined a few human studies as well—several of which linked glyphosate to non-Hodgkin lymphoma. Further, the herbicide was found to induce damage in the DNA and chromosomes of mammals and in human and animal cells in vitro (which generally is how cancers start).

Monsanto, the creator of glyphosate, quickly rebutted the findings of the WHO. “We are outraged with this assessment,” said Dr. Robb Fraley, Monsanto’s Chief Technology Officer, in the Monsanto press release.

“This conclusion is inconsistent with the decades of ongoing comprehensive safety reviews by the leading regulatory authorities around the world that have concluded that all labeled uses of glyphosate are safe for human health. This result was reached by selective ‘cherry picking’ of data and is a clear example of agenda-driven bias.”

But in August, a new collaborative study out of several European universities and Guy’s Hospital in London added more evidence in favor of the WHO’s findings: When rats were administered an ultra-low dose (0.1 parts per billion) of Roundup over the course of two years, many rats experienced kidney and liver damage, and exhibited over 4,000 alterations in the genes of those organs.

For now, California is only labelling glyphosate as a carcinogen instead of restricting or banning it. Environmental groups are celebrating, however, because while it is unclear whether it is harmful to humans, it has decimated monarch butterfly populations.

As glyphosate is so widespread, it has killed an enormous amount of the butterflies’ only food source—milkweed. According to the Center for Biological Diversity, this has led to populations plummeting by 80% over the last 20 years.

—–

Feature Image: Thinkstock

Obese people may get exercise-like benefits from vitamin C

 

Want to enjoy the cardiovascular benefits of exercise without having to do all of that hard work? Well, you can’t. However, new research from the University of Colorado-Boulder suggests that a daily dose of vitamin C supplements could have similar heart-healthy advantages.

The findings, presented recently at the 14th International Conference on Endothelin: Physiology, Pathophysiology and Therapeutics in Savannah, Georgia, showed that taking 500 mg of vitamin C per day improved blood vessel tone (one key measure of cardiovascular health) as much as taking brisk walks five to seven times per week over a three-month period.

Vitamin C use, the authors explained, could reduce the activity of endothelin-1 (ET-1), a protein that constricts the blood vessels, in patients who are overweight or obese. The findings have not yet been published in a peer-reviewed journal and are preliminary, according to the Los Angeles Times, but they suggest that the benefit of the supplements is “substantial.”

Supplement use could benefit those with physical limitations

The trial, which was led by Caitlin Dow, a post-doctoral fellow at the university whose work focuses on nutrition and vascular biology, recruited 15 subjects who underwent the three-month walking regimen and 20 others who took the vitamin C supplements. All of the participants were considered sedentary, were overweight or obese, and had impaired vascular tone.

As is the case for most overweight or obese adults who do not exercise enough, their blood vessels did not respond to experimental conditions with the strength and suppleness typically seen in healthy men and women, the Los Angeles Times said. As a result, the individuals were dealing with a number of health-related problems, including inflammation and clot-promoting blood changes.

The average starting body-mass index (BMI) of the exercise group was 29.3, and the BMI of the vitamin C group was 31.3. Neither group lost any weight during the study, the newspaper noted. After three months of moderate-intensity exercise, the subjects’ vascular tone returned to healthy levels, but the same was also true for the group receiving 500 mg of daily vitamin C.

“This is not ‘the exercise pill,’” Dow said. While she said that the findings could benefit those unable to work out for prolonged periods of time due to injury or other physical problems, she also noted that regular physical activity has other benefits (lowering “bad” cholesterol, upping metabolic function, improving cognitive function) that supplement use cannot replicate.

—–

Feature Image: Thinkstock

Planck satellite captures stunning new images of Magellanic Clouds

 

The European Space Agency recently released an image captured by their Planck satellite of the two Magellanic Clouds, some of the closest objects to our Milky Way galaxy.

The Large Magellanic Cloud is about 160,000 light-years away, and appears as the large red and orange blob near the middle of the image, while the Small Magellanic Cloud is a bit further away—200,000 light-years—and is the similarly-colored blob closer to the bottom left corner of the image.

Each is classified as a dwarf galaxy, weighing between seven and ten billion times as much as the Sun. In comparison, the Milky Way weighs at least 210 billion times as much as the Sun, with some estimates running much higher.

If you were wondering why this image looks like some kind of Vincent van Gogh painting, then you’re in luck—we have answers!

The hazy appearance is caused by things floating around in the foreground—hundreds of thousands of light-years’ worth of foreground.

Light and dust from every galaxy between the Planck satellite and the two Magellanic Clouds interfered with the dwarf galaxies’ radiation. The Planck satellite measures cosmic background radiation, and since all of these objects in space give off different kinds of radiation in different levels of intensity, the data gathered by the satellite is represented as the swirly, trippy image you see here.

To revive some 1980s slang in the most cheesy but also the most appropriate way possible: pretty freakin’ stellar, huh?

—–

Feature Image: ESA

Work stress as bad for you as secondhand smoke, study says

 
Working weekends, putting in extra hours, dealing with irritating bosses – odds are, most of us have had to deal with these kinds of on-the-job stresses during our careers, but new research suggests that these experiences could be as harmful to our health as secondhand smoke.

According to CNN and Daily Mail reports, experts from Harvard University looked at evidence from 228 different studies investigating stress-related issues in the workplace, and found that working long hours or worrying over job security actually increases the chances of experiencing a serious illness, and even raises the risk of an early death.

During their research, assistant professor of business administration Joel Goh and his colleagues found that highly demanding jobs increase the risk of having an illness diagnosed by a doctor by 35 percent, while long work hours increase the risk of early death by nearly 20 percent. Fears about losing your job increase the odds of adverse health by about 50 percent, they added.

As Goh told CNN, “When you think about how much time individuals typically spend at work, it’s not that surprising.” He added that he hoped the study would help companies re-evaluate how they manage their employees, easing the pressure to complete work quickly and the expectation of long hours in order to improve workers’ overall wellbeing.

Re-examining the workplace atmosphere may be beneficial

The Harvard-led team looked at a variety of job-related stressors believed to adversely impact a person’s health, including employment status, the number of hours worked, whether or not the job involves shift work, how often work conflicts with family life, job demands, the availability of health insurance, and the perceived level of workplace fairness.

Next, they investigated how each of these factors affected four outcomes, the Daily Mail said: how people rated their physical health, how they rated their mental wellbeing, how likely they were to be diagnosed with a medical condition by a doctor, and the likelihood that they would die prematurely. They found that being unemployed, having little control over a job, or not having health insurance were all about as deadly as exposure to secondhand smoke.

Although some employers have wellness programs that may encourage employees to join a gym, lose some weight, or kick the smoking habit, Goh told Boston.com that such programs are often inadequate because they only target employee behavior and do not address the underlying causes of stress. A company should instead look at the kind of work environment it creates, he added.

“We’re not prescribing methodology to mitigate stress, but we’re trying to open up conversation to say ‘these things matter,’” the professor explained. “Assuming an employer cares about their employee for benevolent or bottom line reasons, we think this is something many employers haven’t thought on about. We’re trying to say employers have a new control they weren’t aware about.”

—–

Feature Image: Thinkstock

Story Image: NBC/The Office

Apple reportedly working on an ultra-thin iPhone 7

 

It may be less than 48 hours until Apple officially unveils the iPhone 6S and 6S Plus, but new reports currently making the rounds online are suggesting that Apple is already hard at work on a super-thin version of their next, next smartphone.

According to Mashable and Apple Insider, a document from KGI Securities analyst Ming-Chi Kuo reveals that Apple is planning to produce an iPhone 7 model that is between 6.0 and 6.5 millimeters thick – considerably thinner than the current iPhone 6, which is 6.9 millimeters thick, and the 7.1-millimeter iPhone 6 Plus.

The rumored iPhone 7 will be made from extremely durable 7000 series aluminum alloy, which is also reportedly going to be used in the upcoming iPhone 6S and 6S Plus cases, making them less susceptible to bending. The phone’s thinness means that Apple will likely have to stick with the same Force Touch technology for its 2016 iPhone upgrade, Kuo said.

Ultimately, the analyst believes that Apple would prefer to upgrade to a glass-on-glass solution for Force Touch sensing, especially if the smartphones begin using displays of even higher resolution than existing models. However, the reports suggest that this is unlikely to happen in the near future, as current glass-on-glass options would make such a thin iPhone impossible.

But let’s not get ahead of ourselves!

While rumors often prove untrue in the tech industry, Mashable pointed out that Kuo “has a solid track record,” as in October 2013, he successfully predicted that Apple was working on a 12-inch Retina MacBook – nearly a year and a half before it was released in March 2015.

Kuo said nothing else regarding the iPhone 7, but according to Apple Insider, experts believe it will introduce a new chassis to differentiate it from the iPhone 6 series, as the company follows a pattern of redesigning its smartphone line every two years, with an S-model upgrade that is nearly identical in appearance to its predecessor in between.

“That’s what’s expected to be unveiled this Wednesday, when Apple is likely to showcase an ‘iPhone 6S’ with a chassis that appears very similar to the iPhone 6,” the website explained. The upgraded device will feature “Force Touch input, a faster ‘A9’ processor, 2 gigabytes of RAM, and higher resolution cameras,” they added, and come in “a new rose gold color option.”

—–

Feature Image: Concept designed by Yasser Farahi